Data Engineer – STL

Description

Team Daugherty is hiring a Data Engineer to join us in St. Louis. The ideal candidate is a problem solver who uses insight, creativity and perspective to drive business success for our clients.

As a Data Engineer, you will have the opportunity to:

  • Contribute to the creation and maintenance of optimal data pipeline architectures.
  • Collaborate closely with the team to build data platforms.
  • Maintain and manage Hadoop clusters in development and production environments.
  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Work with team members and functional leads to understand existing data requirements and validation rules, supporting the migration of existing data warehouse workloads to a distributed data platform.
  • Create custom software components (e.g. specialized UDFs) and analytics applications.
  • Employ a variety of languages and tools to integrate disparate systems.
  • Recommend ways to improve data reliability, efficiency and quality.
  • Implement and automate high-performance algorithms, prototypes and predictive models.

We are looking for someone with:

  • 1+ years of experience in a similar role.
  • Proven experience working with AWS technologies such as Redshift, RDS, S3, EMR, ADP, Hive, Kinesis, SNS/SQS and QuickSight.
  • Familiarity with Python, R, sh/bash and JVM-based languages including Scala and Java.
  • Familiarity with Hadoop family languages including Pig and Hive.
  • Familiarity with high-performance data libraries including Spark, NumPy and TensorFlow.
  • Proven ability to pick up new languages and technologies quickly.
  • Intermediate SQL programming and query performance tuning skills for data integration and consumption, including the ability to design for optimal performance against large data assets in OLTP, OLAP and MPP architectures.
  • Knowledge of cloud and distributed systems principles, including load balancing, networking, scaling, and in-memory versus on-disk storage trade-offs.
  • Experience building data pipelines to connect analytics stacks, client data visualization tools and external data sources.
  • Exposure to stream-processing and messaging technologies such as Storm, Spark Streaming, Kafka and MQ.
  • Understanding of DevOps practices and CI/CD tools such as Jenkins, GitLab CI, Buildbot, Drone and Bamboo.
  • Some experience with programming languages such as Scala, Java, R and Python.

We offer members of Team Daugherty:

  • Excellent health, dental and vision insurance.
  • Revenue sharing and a 401(k) retirement savings plan.
  • Life, disability and long-term care insurance.
  • Little to no travel.
  • Robust career development and training.

Do you think you’re a good fit for Team Daugherty? Apply now and find out why working here satisfies the smart, the talented and the curious!
