Amaze is a fast-moving startup looking for a hands-on Analytics Engineer who is passionate about clean data and scalable pipelines. In this role, you will own our Snowflake data warehouse, manage our data pipelines, and partner with Marketing and Finance to deliver actionable insights that support data-driven decisions.
Responsibilities:
- Own and maintain our Snowflake data warehouse, including warehouse sizing, cost management, and query optimization
- Build, maintain, and expand dbt Cloud data pipelines — from raw ingestion through staging, intermediate, and mart layers
- Manage ELT pipelines via Fivetran and Hightouch (reverse ETL), ensuring data is accurate, timely, and well-documented
- Partner with Engineering to ensure new product features are instrumented correctly and data flows cleanly from source to warehouse
- Lead the continued buildout of our dbt semantic layer — defining shared metrics, governance standards, and source-of-truth definitions across business units
- Enable AI-powered self-service analytics tools (e.g., Slack-based chatbots) by ensuring the semantic layer is accurate, consistent, and well-governed
- Design metric definitions that serve both technical and non-technical consumers, so stakeholders can trust and explore data independently
- Serve as the primary data partner for Marketing and Finance — the two teams with the highest data demand at Amaze
- Build and maintain Tableau dashboards that translate complex data into clear, actionable insights for non-technical stakeholders
- Support conversion rate analysis, funnel optimization, and marketing performance measurement
- Develop and maintain executive-level reporting, including dashboards tracking company-wide KPIs
- Establish data definitions, documentation standards, and a request process so stakeholders know how to engage with the data team
- Drive self-service data literacy — reducing ad-hoc request volume by empowering teams with the tools and training to answer their own questions
- Ensure data considerations are part of every product and feature release, not an afterthought
Requirements:
- 5+ years of hands-on experience in data engineering, analytics engineering, or a closely related role
- Strong, production-level experience with Snowflake — warehouse management, cost optimization, query tuning, and data modeling
- Deep experience with dbt Cloud — building and maintaining transformation pipelines, writing tests and documentation, and ideally hands-on work with the dbt Semantic Layer
- Advanced SQL proficiency, plus comfort with Python for scripting and pipeline support
- Experience building dashboards in Tableau (or a comparable BI tool) for both technical and non-technical audiences
- Familiarity with ELT tooling such as Fivetran, and reverse ETL tools such as Hightouch
- Startup experience — you know how to operate without perfect requirements, prioritize ruthlessly, and move fast without breaking things
- Experience with semantic layer tooling (dbt Semantic Layer, Cube, LookML, AtScale, or similar) — defining shared metrics and governing change across business units
- Experience supporting marketing analytics — attribution, funnel analysis, campaign performance, GA4/BigQuery pipelines
- Familiarity with AI-enabled analytics — LLM-powered self-service tools, natural language to SQL, or similar
- Experience with Google Analytics 360 / GA4, including integrating behavioral data into a warehouse
- Background integrating CRM data (e.g., HubSpot) and supporting Finance reporting workflows