Avetta is a SaaS company that connects leading organizations with qualified suppliers and vendors, helping them manage supply chain risk and compliance through cloud technology. The Data Engineer will be responsible for managing and evolving the analytics engineering platform, focusing on DBT and Fivetran to ensure reliable data pipelines for analytics and business intelligence.
Responsibilities:
- Own the DBT and Fivetran platforms, including configuration, environment setup, access controls, and ongoing maintenance
- Manage DBT Cloud environments, including job configuration, scheduling, orchestration, and dependency management
- Configure and manage Fivetran connectors, sync schedules, schemas, transformations, and performance tuning
- Build, operate, and optimize end-to-end ELT pipelines from source systems through Snowflake using Fivetran and DBT
- Design and maintain robust orchestration patterns for DBT runs, incremental models, and downstream dependencies
- Ensure pipelines are reliable, scalable, and cost-efficient
- Own and manage data platform infrastructure using Terraform to create Snowflake resources
- Implement, maintain and own CI/CD pipelines using GitHub Actions for DBT deployments, testing, and environment promotion
- Implement and maintain data quality solutions (e.g., freshness checks, anomaly detection)
- Build and manage monitoring and observability solutions for pipelines, jobs, and data SLAs
- Define and track platform and data reliability metrics, alerting on failures, delays, or data issues
- Own Snowflake administration, including warehouses, roles, permissions, resource monitoring, and cost optimization
- Tune performance for DBT models and queries through clustering, warehouse sizing, and query optimization
- Ensure security, governance, and best practices across Snowflake environments
Requirements:
- 6 years of experience in data engineering or analytics engineering, with demonstrated senior-level ownership
- Bachelor's or Master's Degree in Data Engineering, Computer Science, or related field
- Strong documentation and stakeholder communication skills
- Aptitude for Agile delivery and backlog management (e.g., using JIRA) in cross-functional data teams
Preferred Qualifications:
- Certifications such as DBT (Advanced), SnowPro (Snowflake), GitHub Actions, or AWS Certified Solutions Architect – Professional
- Comfortable supporting analytics-ready DBT models across staging, intermediate, and mart layers
- Experience implementing data observability or data quality frameworks
- Familiarity with modern metrics layers or semantic models
- Experience supporting analytics and BI