National Debt Relief (NDR) is seeking an experienced Senior Cloud Data Engineer to join our Data Engineering team. In this role, you will own the orchestration, automation, and optimization of data workflows across our existing Snowflake-based enterprise data platform.
Responsibilities:
- Contribute to the design and orchestration of data pipelines in Dagster to improve ingestion, transformation, and data quality workflows across the enterprise data platform
- Develop and maintain Python-based ingestion pipelines where a Fivetran connector isn't available, integrating data from APIs and third-party systems
- Manage Snowflake infrastructure using IaC (Terraform or similar), while adhering to Data Engineering best practices
- Design, maintain, and optimize dbt transformation workflows to support curated and trusted data models for analytics and operations
- Optimize Snowflake performance and reduce compute spend through warehouse tuning, efficient query design, and resource utilization monitoring
- Respond to morning load failures during East Coast working hours to minimize impact to the business
- Implement robust data security and access controls within Snowflake, ensuring compliance with governance and privacy standards
- Develop and maintain CI/CD workflows for data pipelines, including automated testing, deployment, and version control practices
- Implement observability frameworks for data pipelines, including freshness checks, data contract enforcement, and automated alerting for anomalies
- Document system architectures, workflows, and configurations to support governance, reproducibility, and transparency
- Deliver consistent, visible results that demonstrate progress and impact, keeping projects on track
Requirements:
- Bachelor's degree required
- 7 years of experience in data engineering or data warehouse development, with a focus on cloud environments
- Expertise in SQL and dbt for building and maintaining curated datasets and data transformation pipelines
- Hands-on expertise with Snowflake, including experience managing infrastructure with IaC tools (Terraform or equivalent)
- Demonstrated experience with Dagster (or Airflow, with a strong desire to work in Dagster) for managing event-driven pipelines and orchestrated assets
- Proficiency in Python for developing pipelines, APIs, and automation solutions
- Proven track record of implementing CI/CD workflows and automated testing for data pipelines
- Experience designing and implementing observability frameworks for data freshness, quality, and reliability
- Proactive ownership mindset with the ability to work independently and deliver results with minimal oversight
- Clear, timely, and proactive communication, including experience collaborating with leadership stakeholders
- Ability to manage multiple priorities and projects, ensuring progress stays visible and deliverables are met
- Strong troubleshooting and problem-solving skills, with attention to detail when working with sensitive systems and processes
- Strong collaboration and communication skills to partner effectively across data engineering, analytics, and product teams
- Self-starter with the ability to define and establish data orchestration standards in a growing environment
- Bachelor's degree in Computer Science, Data Engineering, or a related field preferred; advanced degree a plus
- Experience in financial services or related industries
- Expertise deploying and maintaining orchestration systems at scale (Dagster, Airflow)