Nasscomm

Data Engineer (W2 Only)

Cupertino, California, US | Contractor | Posted 2 days ago via LinkedIn

Job description

Title: Data Engineer (W2 Only)

Locations: Cupertino, CA; New York City, NY; Austin, TX (candidates must be within commuting distance; 3 days a week in the client office)

Contract Duration: 9+ months

Job Description:

The Data Foundations Engineer designs and scales modern data architectures powering Wallet, Payments, and Commerce products.

This role focuses on building high-performance data pipelines and enabling analytics and ML use cases, with strong fundamentals in data modeling and scalable systems.

Key Responsibilities

Data Engineering & Architecture
• Design and implement scalable batch and near-real-time data pipelines.
• Develop ETL/ELT workflows optimized for performance and cost.
• Implement dimensional data models and standardize business metrics.
• Instrument APIs and user journeys to capture behavioral and transactional data.

Data Governance & Quality
• Ensure data integrity, governance, privacy, and compliance.
• Maintain the reliability and availability of mission-critical systems.

Required Qualifications
• 6+ years of experience in data engineering for analytics or ML systems.
• Strong SQL proficiency.
• Experience in Python, Scala, or Java.
• Hands-on experience with Spark, Kafka, and Airflow (or similar).
• Strong understanding of data modeling and lakehouse architectures (e.g., Iceberg).
• Experience with AWS, Azure, or GCP.
• Comfortable participating in rotating on-call.
• Experience with Snowflake, Databricks, Trino, OLAP/near-real-time systems, and Superset or Tableau.
• Familiarity with CI/CD, data observability, and infrastructure-as-code.
• Exposure to MLOps and GenAI/RAG pipelines.
• Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).
• Experience in the FinTech, Wallet, or Payments domain.
