Scaling the Warehouse: Partner with our Analytics Engineer to refactor and optimize our dbt pipelines, ensuring they remain fast and cost-efficient as our data volume grows.
Internalizing Ingestion: Bridge the gap between product data and BI by maintaining and extending ingestion scripts (Go/Python) to ensure we have the data we need, when we need it.
Data Intelligence: Build the pipelines that power internal "intelligence" features, such as automated customer segmentation, sentiment classification, and qualitative research tools.
Operational Excellence: Strengthen our data activation layer, ensuring our CRM and operational tools stay in sync with the warehouse.
Governance & Trust: Drive initiatives around data lineage, documentation, and quality testing to maintain high trust in our reporting.
Advanced dbt Modeling: Build modular, well-tested SQL models using dbt. You’ll focus on the "heavy lifting" of the transformation layer—macros, incremental strategies, and performance tuning.
Pipeline Engineering: Maintain and optimize custom ingestion workflows and orchestration, ensuring high availability and reliability.
Intelligence & Automation: Implement internal automation workflows that leverage LLMs for data classification and insights generation within the warehouse.
Data Activation: Own the "Reverse ETL" flows that push critical business signals back into GTM tools for Sales, Marketing, and Success.
Technical Standards: Contribute to our engineering culture through code reviews, documentation, and the enforcement of data contracts with upstream providers.
Requirements
dbt Specialist: You have significant experience building and managing complex dbt projects. You believe in "Analytics as Code" and practice CI/CD for data.
Data Warehouse Expert: Deep experience with Google BigQuery (or similar cloud warehouses), including a strong understanding of partitioning, clustering, and cost optimization.
Polyglot Developer: Proficient in SQL and Python, with the ability (or strong interest) to work with Go to maintain ingestion scripts.
Governance Advocate: You understand that data is only as good as the trust people have in it. You are disciplined about documentation, data contracts, and establishing clear lineage to ensure "one version of the truth."
Ops-Minded: You care as much about data freshness and reliability as you do about the logic itself. You proactively identify and resolve issues before they impact the business. Experience with Airflow and containerization (Docker) is a significant plus.
Automation Enthusiast: You are excited about using modern tools (like BQML or Vertex AI) to automate internal data classification and insights, and you actively explore new approaches as the data and AI landscape evolves.
Collaborative Team Player: You are a strong communicator who enjoys partnering with cross-functional stakeholders (Marketing, Sales, Product) to translate ambiguous business needs into robust technical solutions. You value peer reviews and knowledge sharing. You are comfortable working in a remote environment, and you help guide stakeholders toward better decisions—not just execute on requests.
Tech Stack
Airflow
BigQuery
Cloud
Docker
ETL
Python
SQL
Go
Benefits
Private life and health insurance plan
Fully remote work if you prefer to work from home, apart from team meetings a few times per year
A personal annual training budget
An annual home office allowance to set up your personal space
Company laptop
23 days of paid time off
3 early summer Fridays in July and August
Access to AI tools at work
A free LearnWorlds School to build and sell your own courses
Work at one of the world's top 5 e-learning course platforms
An opportunity to grow alongside us and shape the look and feel of tomorrow's e-learning
An entrepreneurial, international, and highly motivated team with a flat hierarchy that will both challenge you and help you reach your highest potential
Annual company retreats