Build scalable, reliable data systems while maintaining SLOs for business-critical pipelines.
Partner with other internal teams such as Product, Sales Ops, and Finance to deliver high-impact data solutions.
Support self-service tooling and a data-driven culture across Pantheon's teams.
Contribute to the technical strategy and operations of Pantheon's data platform.
Stay up-to-date with industry trends and technologies in data engineering, analytics, and modern data platforms.
Continuously improve our standard of engineering excellence by implementing best practices for data architecture, testing, pipeline reliability, and documentation.
Requirements
3-5+ years of experience building production services and features, ideally including data engineering expertise.
Ability to build maintainable components in Python and/or Go.
Comfortable using SQL with relational databases (e.g., MySQL, Postgres) and data warehouses (e.g., Snowflake, BigQuery).
Experience working with containerization (e.g., Docker, OCI), Terraform, and Kubernetes (K8s).
Familiarity with modern data stacks (e.g., Snowflake, dbt, Airflow, Looker).
Tech Stack
Airflow
BigQuery
Docker
Go
Kubernetes
MySQL
Postgres
Python
SQL
Terraform
Benefits
Industry competitive compensation and equity plan
Paid Time Off (PTO), Paid Sick Leave (PSL) and 12 Paid Company Holidays
Full medical coverage (extended health care, dental, vision)
Top-of-the-line equipment
In-office workspace (Vancouver, BC, Canada)
Monthly allowance for wellness and reading, plus access to LinkedIn Learning for continued development
Team-based and company-wide events and activities that inspire, educate, and cultivate