Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines.
Implement and manage reverse ETL workflows that sync data from the data warehouse (e.g., Snowflake) into operational systems (CRMs, marketing tools, internal applications, etc.).
Optimize data models to support both analytics and activation use cases.
Ensure data quality, validation, and monitoring across pipelines.
Collaborate with cross-functional teams to translate business requirements into reliable data solutions.
Support performance tuning and cost optimization of warehouse workloads.
Maintain documentation and best practices across data workflows.
Requirements
3–5+ years (Mid-level) or 5–8+ years (Senior-level) of experience in Data Engineering
Strong hands-on experience with SQL and data modeling
Proven experience implementing reverse ETL solutions (hands-on ownership, not just exposure)
Experience working with modern data warehouses (Snowflake preferred)
Experience building ETL/ELT pipelines using Python and/or SQL-based tools
Experience with orchestration and transformation tools (Airflow, dbt, or similar)
Experience working in cloud environments (AWS, GCP, or Azure)
Strong understanding of data quality and monitoring practices
Tech Stack
Airflow
AWS
Azure
Cloud
ETL
Google Cloud Platform
Python
SQL
Benefits
Monetary compensation
Year-end Bonus
IMSS, AFORE, INFONAVIT
Major Medical Expenses Insurance
Minor Medical Expenses Insurance
Life Insurance
Funeral Expenses Insurance
Preferential rates for car insurance
TDU Membership
Holidays and Vacations
Sick days
Bereavement days
Civil Marriage days
Maternity & Paternity leave
English and Spanish classes
Performance Management Framework
Certifications
TALISIS Agreement: Discounts at ADVENIO, Harmon Hall, U-ERRE, UNID