Avetta is a SaaS platform that connects organizations with qualified suppliers and vendors, helping them manage supply chain risk and compliance through cloud-based technology. The Business Intelligence Engineer will design and maintain data analytics pipelines and create scalable data models to support various analytics use cases.
Responsibilities:
- Use DBT to design modular data models, enforce data quality, document transformations, collaborate via Git, and deploy DBT pipelines in production
- Architect layered DBT models (staging → intermediate → marts) for finance and customer master data
- Develop a unified data model by integrating and transforming data from diverse sources
- Build and maintain DBT models that serve as business rules frameworks, reducing reliance on external rules engines and streamlining data governance
- Design and maintain standardized and conformed models (e.g., Client, Customer, Billing entities) that serve as the foundation for Gold analytics and semantic models
- Deliver certified data products and contribute to the semantic layer implementation in DBT (reusable dimensions, certified metrics, exposures, and macros)
- Deliver self-service analytics capabilities in Snowflake
- Create and maintain technical metadata standards (e.g., lineage, technical definitions)
- Implement DBT best practices: modular packages, Jinja logic, quality tests, and documentation standards
- Collaborate with BI product owners, data analysts, and governance teams to apply consistent business logic across reporting tiers
- Define and implement business-aligned data quality rules and certification standards
- Partner with data governance to steward definitions, metrics, and lineage
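To illustrate the layered DBT approach above (staging → intermediate → marts), here is a minimal sketch of a conformed marts model; all model, source, and column names (`stg_salesforce__accounts`, `stg_netsuite__billing`, `customer_id`) are hypothetical examples, not Avetta's actual schema.

```sql
-- models/marts/dim_customer.sql
-- Marts layer: a conformed Customer dimension built on staging models via ref(),
-- so DBT resolves dependencies and tracks lineage automatically.
with accounts as (
    select * from {{ ref('stg_salesforce__accounts') }}
),

billing as (
    select * from {{ ref('stg_netsuite__billing') }}
)

select
    accounts.customer_id,
    accounts.customer_name,
    billing.billing_currency
from accounts
left join billing
    on accounts.customer_id = billing.customer_id
```

In practice, quality tests (e.g., `unique` and `not_null` on `customer_id`) and column documentation for such a model would be declared in an accompanying schema YAML file.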
Requirements:
- 4 years of experience engineering analytics pipelines, preferably in a SaaS environment
- 1 year of experience using DBT to design modular models, enforce data quality, document transformations, collaborate via Git, and deploy DBT pipelines in production
- Advanced proficiency in SQL (e.g., CTEs, window functions, recursion)
- Experience with modern cloud data warehouses (e.g., Snowflake, BigQuery, Redshift, Databricks)
- Strong understanding of data modeling techniques (e.g., star schema, snowflake schema, data vault)
- Bachelor's degree in Data Analytics, Computer Science, Information Technology, Finance, or a related field
- Strong understanding of DBT Fundamentals
- Exposure to or training in Snowflake data warehousing concepts (e.g., Snowflake Data Warehousing Workshop – Badge 1)
- Experience with DBT Cloud IDE
- Experience with data visualization tools, Power BI preferred
- Familiarity with Snowflake and DBT CI/CD automation with GitHub Actions
- Aptitude for Agile delivery, backlog management (Jira), and cross-functional stakeholder leadership
- Experience with business systems such as Salesforce, NetSuite, or Zuora
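As one example of the advanced SQL named in the requirements, a CTE combined with a window function can rank customers by total billing within each region; the table and column names here are hypothetical illustrations only.

```sql
-- CTE: aggregate billed amounts per customer within each region
with customer_totals as (
    select
        region,
        customer_id,
        sum(amount) as total_billed
    from billing
    group by region, customer_id
)

-- Window function: rank customers by spend inside each region
select
    region,
    customer_id,
    total_billed,
    rank() over (partition by region order by total_billed desc) as billing_rank
from customer_totals
```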