Vanta is a company dedicated to helping businesses earn and prove trust through continuous security monitoring. As the first Analytics Engineer in the go-to-market (GTM) organization, you will build the foundational data infrastructure that enables scalable insights across GTM, transforming raw data into reusable models and partnering with teams across the business to strengthen self-service analytics and metric governance.
Responsibilities:
- Build Revenue Data Marts: Design and implement dbt-based data marts for core GTM metrics (ARR, NRR, GRR, pipeline, quota, bookings), ensuring 100% reconciliation with Finance's source of truth
- Establish Semantic Layer: Encode business logic and metric definitions into a governed, version-controlled semantic layer that standardizes how Vanta defines and calculates key metrics
- Partner with GTM Analysts: Translate business requirements from analysts who understand stakeholder needs into scalable, well-tested data models that enable self-service insights
- Drive Metric Governance: Collaborate with Finance, RevOps, and business leaders to resolve definitional conflicts, establish clear ownership of metrics, and maintain cross-functional alignment on KPIs
- Enable AI-Powered Analytics: Create clean, well-documented datasets that can safely power AI assistants, advanced analytics, and automated reporting tools
- Influence Platform Roadmap: Work with Data Engineering to shape data platform capabilities, optimize pipeline performance, and ensure data quality through comprehensive testing frameworks
- Champion Best Practices: Establish standards for dbt development, documentation, testing, and deployment that scale as Vanta's data needs grow
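To make the dbt work above concrete, a mart model for one of these metrics might be sketched as follows. All model, schema, and column names here are hypothetical illustrations, not Vanta's actual project:

```sql
-- models/marts/finance/fct_arr_monthly.sql (hypothetical names throughout)
-- Monthly ARR per customer, built from a staging model over billing data.
with subscriptions as (

    select * from {{ ref('stg_billing__subscriptions') }}

)

select
    date_trunc('month', snapshot_date) as arr_month,
    customer_id,
    sum(annual_recurring_revenue)      as arr
from subscriptions
group by 1, 2
```

In a project like this, the accompanying schema YAML would declare `not_null` and `unique` tests on the grain columns, and reconciliation against Finance's reported ARR would run as a scheduled dbt test or audit query.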
Requirements:
- At least 4 years in Analytics Engineering, Data Engineering, or equivalent roles focused on transforming and modeling data for analytics
- Advanced proficiency with dbt (dbt Cloud preferred), with a strong preference for 2+ years spent architecting dbt projects rather than only writing models: you've built or significantly contributed to dbt projects, managed configurations, implemented testing frameworks, and established governance patterns
- Strong dimensional modeling skills (facts, dimensions, star schemas, data marts) and experience designing scalable analytical data architectures in modern cloud warehouses (Snowflake, BigQuery, Redshift)
- Solid understanding of go-to-market and financial metrics such as ARR, NRR, GRR, ACV, pipeline coverage, conversion rates, and quota attainment
- Ability to translate business definitions into technical models
- Advanced SQL skills for complex transformations, aggregations, window functions, and performance optimization over large-scale datasets
- Proven ability to work effectively with cross-functional partners, including business analysts, finance teams, data engineers, and other business stakeholders, translating technical concepts for non-technical audiences and vice versa
- Obsessed with correctness: you proactively implement testing, reconciliation processes, and data quality checks to prevent issues before they impact stakeholders
- Excellent written and verbal communication skills: you document your work clearly and can explain complex data models to both technical and business audiences
- Open to using AI to amplify your skills and strengthen your work, demonstrating curiosity, a willingness to learn, and sound judgment in applying AI responsibly to improve efficiency and impact
Nice to have:
- Bachelor's degree in Computer Science, Data Science, Statistics, Mathematics, or a related technical field
- Experience with semantic layers or metrics layers (dbt Metrics, MetricFlow, Transform, etc.)
- Exposure to AI/ML workflows
- Familiarity with Salesforce data models
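As one illustration of the SQL fluency called for above, net revenue retention can be computed with a window function that compares each customer's current ARR to its ARR twelve months earlier. Table and column names are hypothetical, and the sketch assumes churned customers keep zero-ARR rows rather than disappearing from the table:

```sql
-- Hypothetical NRR by month: current ARR vs. the same customers' ARR
-- twelve months prior. Assumes churned customers appear with arr = 0.
with monthly_arr as (

    select
        customer_id,
        arr_month,
        arr,
        lag(arr, 12) over (
            partition by customer_id
            order by arr_month
        ) as arr_12_months_prior
    from fct_arr_monthly

)

select
    arr_month,
    sum(arr) / nullif(sum(arr_12_months_prior), 0) as nrr
from monthly_arr
where arr_12_months_prior is not null  -- exclude customers new in the last year
group by 1
order by 1
```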