Tier4 Group

Website

Data Engineer 5063

Role

Data Engineer 5063

Job type

Contractor

Posted

5 days ago

Salary

Not disclosed by employer

Job description

Title: Data Engineer (DevOps / AWS Migration)

Location: Milwaukee, WI

Type: Hybrid (3 days onsite per week)

Duration: ASAP - 12/31/2026

Perks: Benefits, free daily lunch when onsite

Job Description

We are seeking a Data Engineer to support cloud-based data solutions and AWS migration initiatives within HR workforce analytics. In this role, you will design, build, deploy, and maintain scalable data and software solutions across the full development lifecycle. You’ll partner closely with cross‑functional teams to solve complex technical challenges, influence architecture decisions, and continuously improve development and integration practices.

This role requires strong experience with AWS, Python, Spark, SQL, modern data integration patterns, and DevOps methodologies, along with a passion for data quality, operational excellence, and production stability.

What You’ll Do

  • Design, develop, deploy, and support cloud-based applications using established SDLC and CI/CD practices
  • Troubleshoot and resolve technical issues during development and deployment
  • Conduct thorough code reviews to ensure quality, security, and adherence to best practices
  • Collaborate with engineers and stakeholders to align on technical approaches and architecture
  • Contribute to system‑wide technical and architectural discussions
  • Recommend improvements to development pipelines, tooling, and integration practices
  • Ensure high standards of data quality, reliability, and operational performance

Required Qualifications

  • Bachelor’s degree or equivalent practical experience
  • 4+ years of professional experience working with AWS
  • 4+ years of experience with modern engineering tools, languages, and development practices
  • 2+ years of experience with data integration patterns and tooling, including:
      ◦ ETL / ELT
      ◦ Event streaming and real‑time processing
      ◦ Replication and virtualization
  • Strong coding experience in Python, Apache Spark, and SQL
  • Experience working in Agile and DevOps environments
  • Solid understanding of database concepts, data modeling, and data quality principles
  • Experience with cloud-based development (PaaS/SaaS), containerization (Docker and/or Kubernetes), Infrastructure as Code, and Terraform
  • Familiarity with centrally governed CI/CD pipelines
  • Understanding of automated testing practices, including unit testing and Test‑Driven Development
  • Strong communication skills with the ability to explain complex technical concepts to both technical and non‑technical audiences

Nice to Have

  • Strong passion for operational excellence, ownership, and problem‑solving
  • Experience delivering reliable, high‑performance production systems
  • Ability to break down complex solutions into actionable work for agile teams
  • Experience refining features, defining solutions, and driving continuous improvement initiatives
  • 3–5 years of professional software development experience
  • 3–5+ years working with AWS services such as Lambda and Kubernetes
  • Proficiency in domain data modeling and API‑first design
  • Proven track record of designing and delivering impactful technology solutions