BetterHelp is on a mission to make mental health care more accessible to everyone. As a Senior Business Intelligence and Analytics Engineer, you will develop data tools and collaborate with cross-functional teams to deliver impactful data solutions that support the company's growth and decision-making processes.
Responsibilities:
- Act as a subject matter expert (SME) and deliver training to cross-functional teams to enable business users to make data-driven decisions
- Deliver direct analytical insights like dashboards and ad-hoc analyses to business stakeholders
- Collaborate with teams across the company to understand their use cases and deliver high-value data tools
- Live and breathe SQL
- Ensure data quality and freshness at every step of the pipeline for data trust and consistency
- Create reverse ETL flows to make modeled data directly available to stakeholders in the tools they use, fostering fast and informed decision-making
- Define and build robust DataOps pipelines and data expectations to ensure the effective delivery of data to all internal data services
- Explore, propose, and integrate new data sources and software solutions into the reporting environment
- Contribute to a data-driven culture at BetterHelp by directly training stakeholders and creating resources, such as documentation, that empower others to perform their own analyses
- Enjoy great teamwork, have lots of fun, and take pride in building a world-class product that makes a difference in people's lives
- Partner with data and machine learning engineers and work with a modern data stack: Airflow, Fivetran, Snowflake, dbt, and Looker
Requirements:
- BSc/MA in a quantitative discipline such as Computer Science, Statistics, Math, or Engineering
- At least 4 years of experience in analytics engineering, data engineering, data analytics, or a related field
- Advanced experience working with data pipeline tools, particularly in-warehouse transformation tools (preferably dbt)
- Advanced experience with SQL
- Experience querying and designing performant tables within OLAP data warehouses (e.g. Snowflake, BigQuery)
- At least 1 year of experience working and developing with BI tools (preferably Looker)
- Familiarity with frameworks and tradeoffs in building complex data models, preferably working with dimensional modeling frameworks (e.g. Kimball, Data Vault)
- Excellent communication (oral and written), attention to detail, and time-management and organizational skills
- Experience and comfort using the command line
- Experience using git for version control
- Familiarity with patterns for handling large-scale data transformations (e.g. dbt incremental models, horizontal/vertical scaling patterns for big data)
- Experience working with Python-based workflow orchestration tools (e.g. Airflow)
- Experience working with Docker and container orchestration (e.g. Kubernetes, ECS)
- Experience with web-based B2C and marketplace products
- Familiarity with web event tracking tools like Snowplow
- Experience defining and building DataOps practices and deployment strategies
- Experience defining data governance procedures and working with data catalogs