Come and See

Sr. Data Engineer

Colorado Springs, Colorado, US · Full-time · $107k - $140k/year · via ZipRecruiter

Job description

Salary: $107k to $140k annually

Company Overview

The Come and See Foundation is dedicated to supporting and expanding the global reach of The Chosen, the first-ever multi-season television series about the life of Jesus Christ. The Chosen has captivated millions around the world with its powerful storytelling, high production values, and authentic portrayal of the life and ministry of Jesus. As a faith-driven initiative, the Foundation plays a pivotal role in funding, distributing, and engaging audiences to ensure that the message of The Chosen continues to inspire and touch lives across diverse cultures and communities. By fostering a collaborative and innovative work environment, the Come and See Foundation is committed to amplifying the impact of The Chosen and creating transformative experiences that resonate with audiences worldwide.

Position Summary

The Senior Data Engineer / Data Architect is responsible for designing, building, and governing the data infrastructure that powers the Come and See Foundation's analytics and business intelligence capabilities. This senior technical leader owns the architecture of our Snowflake-based data platform, ensures data flows reliably from source to insight, and partners with stakeholders across the organization to build a scalable, secure, and well-governed data ecosystem. Snowflake is the core of our stack, serving as both the data warehouse and the primary transformation layer. The ideal candidate is a seasoned engineer with deep Snowflake expertise, strong architectural instincts, and a passion for using technology in service of a mission-driven organization.

Key Responsibilities

Data Architecture & Platform Engineering
• Design, implement, and continuously improve the organization's Snowflake data warehouse architecture, including schema design, data modeling, performance optimization, and native transformation workflows
• Own Snowflake as the primary transformation layer, leveraging native features (Snowpark, tasks, streams, dynamic tables) in place of external transformation tooling
• Establish data architecture standards, patterns, and best practices across the data platform
• Lead the evaluation and selection of new data technologies and tools that align with organizational strategy
• Architect and maintain scalable, fault-tolerant data pipelines that reliably process high volumes of structured and unstructured data
• Design and implement data models (dimensional, relational, or data vault approaches as appropriate) that support BI and analytics use cases at scale
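
To illustrate the dimensional-modeling work described above, here is a minimal, self-contained star-schema sketch in Python. All names and fields (`DimDonor`, `FactDonation`, `total_by_region`) are invented for illustration and are not taken from the posting or any real system:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class DimDonor:
    """Dimension row: descriptive attributes keyed by a surrogate key."""
    donor_key: int
    name: str
    region: str

@dataclass(frozen=True)
class FactDonation:
    """Fact row: a measurable event, holding foreign keys plus measures."""
    donor_key: int   # FK into DimDonor
    amount_usd: float
    date_key: int    # e.g. 20240115 (YYYYMMDD date dimension key)

def total_by_region(dims, facts):
    """Join facts to the donor dimension and sum the measure per region."""
    region_of = {d.donor_key: d.region for d in dims}
    totals = defaultdict(float)
    for f in facts:
        totals[region_of[f.donor_key]] += f.amount_usd
    return dict(totals)
```

In a dimensional model like this, BI queries stay simple: every analytical slice is a join from a wide fact table to small, denormalized dimensions.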

Data Pipeline & ELT Development
• Build, optimize, and maintain robust ELT pipelines to integrate data from Virtuous, FundraiseUp, HubSpot, RudderStack, Google Analytics, and other sources into Snowflake
• Develop and maintain data ingestion frameworks using Python and SQL
• Implement Airflow-based orchestration for pipeline scheduling, monitoring, alerting, and failure recovery
• Manage platform migrations and technology transitions with minimal disruption to downstream analytics
• Implement automation for routine data engineering tasks to improve reliability and reduce manual overhead
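
The reliability and failure-recovery work described above often comes down to retry-with-backoff automation around individual pipeline steps. A stdlib-only sketch of that pattern follows; the function names are illustrative, not from any specific orchestrator:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run a pipeline step, retrying transient failures with
    exponential backoff; re-raise once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure for alerting
            # back off 1x, 2x, 4x, ... the base delay between attempts
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In practice an orchestrator such as Airflow provides this behavior declaratively (per-task retry counts and delays), but the underlying control flow is the same.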

Cloud Infrastructure & Integration
• Manage and optimize cloud data infrastructure on Snowflake, ensuring cost-efficiency, performance, and scalability
• Develop and maintain APIs for data access and interoperability across organizational systems
• Implement and manage RudderStack event streaming pipelines, ensuring clean, reliable event data flows into Snowflake
• Maintain integrations with CRM platforms (Virtuous, HubSpot), fundraising tools (FundraiseUp), and analytics platforms (Google Analytics, Tableau)
• Collaborate with IT and engineering partners to define data access protocols, authentication, and authorization patterns
• Evaluate and implement containerization (Docker, Kubernetes) strategies for deploying and scaling data engineering applications where appropriate

Data Governance & Quality
• Implement and enforce data governance policies, including data classification, lineage tracking, and stewardship across the data lifecycle
• Lead data quality assurance efforts, establishing validation frameworks and automated quality checks within pipelines
• Enforce role-based access controls and data security protocols to protect sensitive donor and constituent data
• Ensure compliance with applicable data privacy regulations (GDPR, CCPA) in all data handling practices
• Maintain thorough documentation for data models, pipeline architecture, and operational processes in Git/GitHub
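
The validation frameworks mentioned above typically reduce to applying named rule predicates to each record and collecting failures for reporting. A toy sketch, assuming invented rule names and record shapes:

```python
def validate(records, rules):
    """Apply each named rule predicate to each record; return a list of
    (row_index, rule_name) pairs for every failed check."""
    failures = []
    for i, rec in enumerate(records):
        for name, predicate in rules.items():
            if not predicate(rec):
                failures.append((i, name))
    return failures
```

A pipeline would run checks like these after each load and route any failures to alerting rather than letting bad rows flow into downstream dashboards.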

BI Enablement & Stakeholder Partnership
• Partner closely with Data Analysts and Data Scientists to ensure the Snowflake platform meets analytical requirements and enables high-quality Tableau reporting
• Collaborate with the Director of Data & BI/Analytics, COO, and other senior leaders to align data infrastructure with organizational and ministry strategy
• Serve as a key liaison between technical data teams and business stakeholders, translating requirements into sound engineering solutions
• Support the Customer 360 intelligence platform, including donor scoring models, self-serve dashboards, and AI-powered analytics infrastructure
• Provide technical mentorship to junior data engineering staff and foster a culture of engineering excellence
• Evangelize data best practices and data literacy across the organization

Project Management & Continuous Improvement
• Manage data engineering projects from design through delivery using Scrum methodology and 2-week sprint cycles
• Continuously evaluate and improve data engineering processes, adopting emerging best practices and Snowflake capabilities
• Identify cost optimization opportunities within the Snowflake platform and implement efficiency improvements
• Maintain version control discipline using Git/GitHub across all pipeline and transformation code

Knowledge, Skills and Abilities

Required Technical Skills
• Expert-level proficiency in Snowflake, including architecture, performance tuning, Snowpark, native transformation features (tasks, streams, dynamic tables), data sharing, and cost management
• Advanced SQL skills with experience in complex data modeling, warehouse design, and query optimization at scale
• Strong programming skills in Python for pipeline development, automation, and data engineering tasks
• Experience with Airflow or similar orchestration tools for pipeline scheduling and management
• Proficiency with cloud platforms (AWS, Azure, or GCP) and cloud-native data services
• Experience integrating with CRM platforms such as Virtuous or HubSpot
• Familiarity with digital fundraising platforms such as FundraiseUp
• Experience with RudderStack or other customer data platforms (CDPs) for event stream ingestion
• Experience with Google Analytics data ingestion and integration
• Experience with Git/GitHub for version control and collaborative development
• Ability to work within an Agile/Scrum environment with 2-week sprint cycles

Preferred Technical Skills
• Experience with containerization technologies (Docker, Kubernetes)
• Familiarity with Tableau or other BI tools, with ability to support analytics team data requirements
• Working knowledge of machine learning workflows, MLOps, and supporting data science teams (XGBoost, LightGBM, scikit-learn)
• Experience managing large-scale data migrations or platform transitions
• Exposure to data governance frameworks and data cataloging tools

Core Competencies
• Architectural Thinking: Designs solutions that are scalable, maintainable, and aligned with long-term organizational goals
• Analytical Thinking: Methodical and diligent with outstanding planning and problem-solving capabilities
• Communication: Clearly communicates technical concepts to both technical and non-technical audiences
• Collaboration: Partners effectively across functional teams including IT, analytics, programs, and executive leadership
• Data Evangelist: Enthusiastic about data and committed to building a data-driven culture
• Integrity: Handles sensitive donor and constituent data with the highest standards of ethics and transparency
• Independence: Self-directed and capable of leading technical work with minimal supervision
• High Standards: Holds self and team to excellence in code quality, documentation, and architecture
• Adaptability: Thrives in a dynamic environment and adjusts quickly to evolving priorities and technology

Education and Experience
• Bachelor's degree in Computer Science, Data Science, Software Engineering, or a related technical field; advanced degree preferred
• Minimum 5-7 years of experience in data engineering, data architecture, or a related role
• Minimum 2-3 years of hands-on Snowflake experience in a production environment, including native transformation capabilities
• Demonstrated experience designing and delivering enterprise-grade data platforms at scale
• Experience in nonprofit, faith-based, or mission-driven organizations a plus

Physical Demands/Working Conditions
• The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
• Working Conditions: Office setting with a forty-hour (+) workweek. This position preferably works out of our Colorado Springs headquarters.
• Physical Demands: Frequent sitting and use of standard office equipment. Ability to lift approximately 15-20 pounds. Ability to hear phone calls and clearly communicate verbally.

Our Beliefs, Culture, and Commitment

At Come and See, every staff member is a critical and valuable part of our mission and ministry. We consider ministry readiness and an individual's capacity to represent our culture and Christian beliefs during the selection process for all staff positions. An essential function within every position held by a staff member at Come and See is to uphold and represent Jesus in how we live and behave.

While we unite around our mission, we know unity doesn't mean uniformity. Our calling is too great, and our mission is too important not to be intentional about strengthening our team with people from all backgrounds. We believe that varied perspectives, across race, ethnicity, life experience, age, and gender, enrich our ability to share the story of Jesus with the world and help everyone, everywhere, experience the authentic Jesus.
