Inetum
Technical Leader Data Engineer

Company: Inetum
Job type: Full-time
Posted: 2 days ago
Job description
As a Technical Leader Data Engineer, you will design, build, and maintain scalable data solutions on Google Cloud Platform (GCP), supporting analytics, business use cases, and AI/ML initiatives.
You will work on a greenfield migration from on‑premise to cloud, creating data architecture and models from scratch. The role combines hands‑on engineering with architectural thinking, close collaboration with business stakeholders, and ownership of data quality, performance, and cost efficiency.
You will help shape standards, best practices, and the long‑term direction of the data platform in a modern, cloud‑native environment.
Must‑Have:
- Strong GCP experience – mandatory.
- 5+ years of experience as a Data Engineer, including experience in a lead or senior role.
- Hands-on experience with the GCP ecosystem, especially:
  - BigQuery – advanced SQL, cost management, performance optimization.
  - Cloud Storage – data management and versioning.
  - Pub/Sub and Dataflow / Apache Beam or Dataproc / Spark.
  - Cloud Composer (Airflow) for data pipeline orchestration and scheduling.
- Understanding of GCP security fundamentals: IAM, KMS, DLP.
- Strong Python skills (e.g. pandas, PySpark, automated testing).
- Experience in data modeling (Kimball, Data Vault, Dimensional Modeling) and multi‑layered data architectures (raw/bronze, curated/silver, semantic/gold).
- Knowledge of CI/CD tools (e.g. GitLab CI, Cloud Build).
- Practical experience in data quality (testing, monitoring, alerting).
- Familiarity with monitoring and troubleshooting tools (Cloud Monitoring, Cloud Logging).
- Experience with technical documentation and code reviews.
- Strong communication skills and ability to collaborate with business, technical, and analytics teams.
- Proactive mindset, ownership, and a strong focus on data quality.
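To make the expectations above concrete: the multi‑layered architecture (raw/bronze to curated/silver) and the "Python with automated testing" requirement could look like the sketch below. This is purely illustrative, not part of the posting; the function, column names, and data are hypothetical.

```python
# Illustrative sketch only: a raw -> curated (bronze -> silver) cleaning step
# in pandas, plus the kind of automated test that would run in CI
# (e.g. GitLab CI). All names here (clean_orders, order_id, etc.) are
# hypothetical examples, not taken from the posting.
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Promote raw orders to the curated layer: dedupe, type, drop bad rows."""
    curated = raw.drop_duplicates(subset=["order_id"]).copy()
    # Coerce types; rows that fail typing become NaN/NaT and are rejected
    # rather than silently kept in the curated layer.
    curated["amount"] = pd.to_numeric(curated["amount"], errors="coerce")
    curated["order_date"] = pd.to_datetime(curated["order_date"], errors="coerce")
    return curated.dropna(subset=["amount", "order_date"])

# A minimal automated test over a tiny in-memory fixture:
raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "oops", "7"],
    "order_date": ["2024-01-01", "2024-01-01", "2024-01-02", "not-a-date"],
})
curated = clean_orders(raw)
```

Only the fully valid, deduplicated row survives into the curated layer; the duplicate, the non-numeric amount, and the unparseable date are all rejected.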
Nice‑to‑Have:
- Experience with data management and transformation tools, such as:
  - dbt, Dataform.
  - Dataplex, Data Catalog.
- Experience with analytics and BI tools (e.g. Looker / Looker Studio).
- Knowledge of MLOps tools and platforms, including Vertex AI and Feature Store.
- Advanced experience with Apache Kafka or Apache Pulsar for streaming data.
- Familiarity with GDPR, ISO 27001, and data security policies.
- Experience with data quality frameworks (e.g. Great Expectations, Soda).
- Ability to design product‑oriented data architectures, tailored to specific business domains.
- Experience in on‑premise to cloud data migrations.
- Knowledge of Oracle (PL/SQL).
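As a hint at what the data quality items above involve, here is a hand‑rolled sketch of the expectation‑style checks that frameworks such as Great Expectations or Soda formalize. It deliberately avoids any real framework API; all names and data are hypothetical.

```python
# Illustrative only: declare expectations over rows, evaluate them, and
# surface failures so they can feed monitoring/alerting (e.g. Cloud
# Monitoring). This is NOT the Great Expectations or Soda API, just the idea.
from typing import Callable

def expect(name: str, rows: list[dict], check: Callable[[dict], bool]) -> dict:
    """Run one expectation over all rows and report pass/fail counts."""
    failures = [r for r in rows if not check(r)]
    return {"expectation": name, "passed": not failures, "failed_rows": len(failures)}

rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},
    {"user_id": 3, "email": "c@example.com"},
]

results = [
    expect("user_id is not null", rows, lambda r: r["user_id"] is not None),
    expect("email is not null", rows, lambda r: r["email"] is not None),
]
# Any failing expectation would trigger an alert in a real pipeline.
failing = [r for r in results if not r["passed"]]
```

In a production setup the same pattern runs inside the orchestrator (e.g. a Cloud Composer task) after each load, with failures routed to alerting.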
Hybrid work: 2 days a week at the office in Warsaw.


