Teladoc Health is leading the next evolution of virtual care and is seeking a Senior Data Engineer to contribute to its data platform modernization initiative. This role focuses on designing and operationalizing data pipelines on Snowflake using dbt while partnering with cross-functional teams to strengthen data solutions and governance practices.
Responsibilities:
- Build and maintain scalable ELT pipelines using dbt Core/Cloud on Snowflake
- Develop performant data models with optimized materializations and incremental processing (a minimal incremental-model sketch follows this list)
- Lead migration from legacy Redshift pipelines to Snowflake/dbt, ensuring functional parity, improved performance, and full test coverage
- Implement data quality tests, observability frameworks, and CDC/streaming ingestion patterns (a sample data-quality test also follows this list)
- Enforce GitHub-based development practices (branching, PR review, CI/CD, versioning)
- Maintain repo structure, documentation, and automated workflows
- Build CI/CD pipelines using GitHub Actions for dbt build/test/deploy
- Support IaC practices for Snowflake resource provisioning
- Use GitHub Copilot/Copilot Chat to accelerate coding, testing, documentation, and review
- Analyze and modernize legacy SQL with AI tooling
- Experiment with emerging agentic development tools (Copilot Workspace, Claude Code)
- Create and share prompt libraries and AI workflow patterns that improve team velocity and consistency in pipeline development, testing, and documentation tasks
- Maintain thorough documentation of all pipelines, data models, and business logic in dbt docs, Confluence, or equivalent tools, ensuring knowledge is accessible and transferable
- Monitor pipeline performance and resolve SLA issues using observability tools (Metaplane, dbt Cloud, Snowflake Query History)
- Participate in architecture processes and cross-functional design reviews for data platform decisions
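To give candidates a concrete sense of the incremental-processing work above, here is a minimal dbt model sketch. The `raw.events` source, the `event_id` key, and the `updated_at` watermark column are illustrative assumptions, not Teladoc's actual schema:

```sql
-- models/staging/stg_events.sql (hypothetical model)
-- Incremental materialization: each run scans only rows changed since
-- the last successful build, so Snowflake compute scales with new data
-- rather than with total table size.
{{
    config(
        materialized='incremental',
        unique_key='event_id',
        incremental_strategy='merge'
    )
}}

select
    event_id,
    user_id,
    event_type,
    updated_at
from {{ source('raw', 'events') }}

{% if is_incremental() %}
  -- Applied only on incremental runs; {{ this }} is the already-built table.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```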
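Likewise, the data-quality testing in this role often takes the form of dbt singular tests: SQL files that pass when they return zero rows. A small sketch against the hypothetical model above:

```sql
-- tests/assert_no_future_event_timestamps.sql (hypothetical test)
-- `dbt test` flags a failure if any row is returned, i.e. if an event
-- claims a timestamp later than the current time.
select
    event_id,
    updated_at
from {{ ref('stg_events') }}
where updated_at > current_timestamp()
```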
Requirements:
- 8+ years of experience in data engineering, analytics engineering, or SQL-intensive data development roles
- 3+ years of hands-on, production experience with dbt (Core or Cloud) and Snowflake as primary platforms
- Deep GitHub expertise (branching, pull request-based development, Actions for CI/CD)
- Strong SQL expertise including query optimization, window functions, recursive CTEs, and performance tuning on large-scale cloud data warehouses (see the SQL sketch at the end of this posting)
- Experience with ELT orchestration (Airflow/Astronomer, dbt Cloud, or equivalent)
- Familiarity with CDC patterns, event streaming architectures (Kafka, Kinesis, or equivalent), and real-time data ingestion use cases
- Proven track record migrating legacy ETL/SQL logic (Redshift, Oracle, Teradata) into modern ELT architectures with full test coverage and documentation
- Excellent communication skills with the ability to convey technical concepts clearly to non-technical stakeholders
- Experience using GitHub Copilot, Copilot Chat, and/or other agentic coding tools (e.g., Claude Code, Cursor, Copilot Workspace) to accelerate development, including creating internal prompt libraries and AI workflow patterns, is highly preferred
- Experience with Salesforce Data360 (Data Cloud) or Salesforce Marketing Cloud for audience segmentation and activation is a plus
- Familiarity with data quality and observability platforms such as Metaplane, Monte Carlo, or Great Expectations
- Knowledge of HIPAA compliance requirements and healthcare data standards (HL7, FHIR) in data engineering contexts
- Snowflake SnowPro Core or Advanced certification, or dbt Certified Developer credential
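For candidates gauging the SQL bar, the following Snowflake-flavored sketch shows the kind of window-function and recursive-CTE work the requirements refer to; all table and column names are illustrative:

```sql
-- Deduplicate to the latest record per member (window function), then
-- walk a self-referencing organization hierarchy (recursive CTE).
with recursive latest_members as (
    select *
    from members
    -- Snowflake's QUALIFY filters on the window result directly.
    qualify row_number() over (
        partition by member_id
        order by updated_at desc
    ) = 1
),

org_tree as (
    -- Anchor: top-level organizations with no parent.
    select org_id, parent_org_id, org_name, 1 as depth
    from organizations
    where parent_org_id is null

    union all

    -- Recursive step: attach each child one level below its parent.
    select o.org_id, o.parent_org_id, o.org_name, t.depth + 1
    from organizations o
    join org_tree t
      on o.parent_org_id = t.org_id
)

select m.member_id, t.org_name, t.depth
from latest_members m
join org_tree t
  on m.org_id = t.org_id;
```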