Jobs via Dice

W2- Cupertino, CA / New York City, NY / Austin, TX (Hybrid) :: Data Engineer with MLOps, LLM and GenAI/RAG Exp. (Only G.C / U.S.C)

Cupertino, California, US · Full-time · 2 days ago · via LinkedIn

Job description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Bitsoft International, Inc., is seeking the following. Apply via Dice today!

Data Engineer with MLOps, LLM and GenAI/RAG Exp. (Only G.C / U.S.C)

6+ Months

Cupertino, CA / New York City, NY / Austin, TX (Hybrid, Onsite 3 days a week)

Core Competence: Data Engineering, with 6 to 9 years of experience

Role Overview: This engineer will lead the development of data infrastructure powering Wallet, Payments, and Commerce initiatives, focusing on building and optimizing high-quality data pipelines for analytics and machine learning, with an emphasis on sound data modeling and system scalability.

PRINCIPAL ACCOUNTABILITIES:

Data Engineering & Architecture:

Build and scale both batch and near-real-time data processing systems.

Optimize ETL/ELT pipelines for performance and cost efficiency.

Implement well-structured data models and consistent definitions of business metrics.

Improve application features and user experiences through comprehensive data instrumentation and collection.

Data Governance & Integrity:

Uphold stringent standards for data quality, governance, privacy, and regulatory compliance.

Ensure the reliability and availability of business-critical platforms.

PROFESSIONAL PREREQUISITES:

5+ years of experience in data engineering for analytics or machine learning use cases.

Exceptional command of SQL.

Proficiency in programming with Python, Scala, or Java.

Hands-on experience with big data tools such as Spark and Kafka, and with workflow orchestrators such as Airflow or equivalents.

Deep familiarity with modern data modeling and Lakehouse architecture concepts (e.g., Apache Iceberg).

Experience managing cloud environments such as AWS, Azure, or Google Cloud Platform.

Willingness to support systems as part of an on-call rotation.

Applied knowledge of data warehousing solutions such as Snowflake and Databricks, fast analytics on Trino, OLAP/near-real-time databases, and visualization with Superset or Tableau.

Grounding in Continuous Integration/Continuous Deployment (CI/CD), data monitoring, and infrastructure-as-code.

Proficiency in MLOps practices and hands-on involvement with Generative AI/RAG infrastructure.

Hands-on experience with Large Language Models, including prompting strategies, fine-tuning, and RAG.

Background in Financial Technology, Wallet solutions, or Payment systems.

