Together for Talent is a faith-based nonprofit organization dedicated to helping individuals grow through meaningful service and community. They are seeking a highly technical and detail-oriented Data Engineer to manage the end-to-end development of ETL/ELT pipelines, ensuring the accuracy and reliability of data operations.
Responsibilities:
- ETL/ELT Execution: Under the guidance of the Director of Data, execute the full data lifecycle (Extract, Transform, Load) using industry-standard tools like Fivetran to automate ingestion
- Orchestration & Workflow Management: Help manage and scale complex data workflows using tools like Apache Airflow, ensuring task dependencies are met and pipelines run reliably
- Data Cleansing & Modeling: Perform rigorous data cleansing and transformation within the SQL layer to turn raw, messy data into "Gold-standard" KPI-ready logic
- Integrity & Reliability Control: Function as the primary monitor for the data environment. Execute automated monitoring protocols to identify schema drift or data anomalies before they reach a stakeholder
- Warehouse & Lake Support: Assist the Director of Data in the maintenance and optimization of cloud data warehouses, specifically BigQuery, ensuring performance and cost-efficiency
- Cross-Functional Data Sourcing: Partner with Marketing and Content teams to normalize data from complex stacks, including GA4, PostHog, Facebook Ads, Reddit, custom website tracking, and others
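The monitoring responsibility above (catching schema drift before it reaches a stakeholder) can be sketched in a few lines. This is a minimal, hypothetical illustration; the schema, field names, and `check_schema_drift` function are assumptions for the example, not part of the organization's actual tooling.

```python
# Illustrative sketch: detect schema drift by comparing an expected schema
# against the columns actually present in an incoming batch of records.
# EXPECTED_SCHEMA and check_schema_drift are hypothetical names.

EXPECTED_SCHEMA = {"event_id": str, "user_id": str, "timestamp": str, "revenue": float}

def check_schema_drift(records):
    """Return (missing, unexpected) column-name sets across a batch of dict records."""
    missing, unexpected = set(), set()
    for row in records:
        missing |= EXPECTED_SCHEMA.keys() - row.keys()
        unexpected |= row.keys() - EXPECTED_SCHEMA.keys()
    return missing, unexpected

batch = [
    {"event_id": "e1", "user_id": "u1", "timestamp": "2024-01-01", "revenue": 9.99},
    {"event_id": "e2", "user_id": "u2", "ts": "2024-01-01"},  # drifted: 'ts' instead of 'timestamp'
]
missing, unexpected = check_schema_drift(batch)
```

In practice a check like this would run as an automated pipeline step and alert on any non-empty result, rather than being invoked by hand.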
Requirements:
- You are a 'black belt' in SQL. You write complex joins, window functions, and CTEs that are both performant and easily maintainable
- Proven experience building and managing data pipelines at scale using Fivetran or similar automated ingestion tools
- Experience with Apache Airflow (or similar tools like Prefect/Dagster) for managing complex pipeline dependencies and scheduling
- Strong technical proficiency in BigQuery and Redshift, including an understanding of partitioning, clustering, and cost-optimization
- You are a natural 'whiz' at spotting incorrect data: you instantly see the anomalies that indicate a tracking break, a sync error, or an API failure
- A deep understanding of the nuances of GA4 and/or PostHog, web tracking schemas, and media pixel implementation
- Experience sourcing and modeling data from Content Management Systems (CMS) to track content performance and attribution
- Familiarity with how Marketing teams utilize data for campaign optimization and customer journey mapping
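As a rough illustration of the SQL fluency the requirements describe (CTEs plus window functions), here is a small sketch run against an in-memory SQLite table. The table, columns, and data are hypothetical examples, not the organization's schema; the same pattern applies in BigQuery or Redshift.

```python
# Illustrative sketch: a CTE filters raw rows, then a window function
# ranks each donor's gifts by amount. Runs on stdlib sqlite3 (3.25+).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (donor TEXT, amount REAL, donated_at TEXT)")
conn.executemany(
    "INSERT INTO donations VALUES (?, ?, ?)",
    [("alice", 50.0, "2024-01-05"),
     ("alice", 75.0, "2024-02-10"),
     ("bob", 20.0, "2024-01-20")],
)

query = """
WITH recent AS (                      -- CTE: restrict to the current year
    SELECT donor, amount, donated_at
    FROM donations
    WHERE donated_at >= '2024-01-01'
)
SELECT donor,
       amount,
       RANK() OVER (PARTITION BY donor ORDER BY amount DESC) AS gift_rank
FROM recent
ORDER BY donor, gift_rank
"""
rows = conn.execute(query).fetchall()
```

Here `rows` comes back ranked per donor, with each donor's largest gift first; keeping the filtering in a named CTE and the ranking in a window function is what makes queries like this both performant and maintainable.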