Odyssey Information Services is seeking a Remote Data Engineer to help build a world-class data product. In this role, you will take ownership of end-to-end data streams in BigQuery and collaborate with Product and Analytics teams to ensure high-integrity data pipelines.
Responsibilities:
- Own the Pipeline: Build and scale ELT flows from raw ingestion to clean, actionable data marts
- Master the Stack: Use dbt, Airflow, and BigQuery to solve complex modeling problems with high technical craft
- Ensure Integrity: Implement automated testing and observability to catch issues before the business does
- Collaborate & Grow: Work directly with Product and Analytics teams while being mentored by Senior and Staff engineers to level up your architectural skills
Requirements:
- Must be a U.S. citizen or Green Card holder
- 3–5 years of experience in data infrastructure
- Deep respect for clean code and documentation
- Expertise in SQL and Python
- Ability to optimize queries for cost and performance
- Production-scale experience with dbt and Apache Airflow (or similar orchestrators)
- Comfortable navigating BigQuery, Cloud Storage, and Cloud Composer
- Ability to thrive in growth-stage environments and turn ambiguity into reliable systems
- Experience with Terraform or similar Infrastructure-as-Code tools
- Familiarity with streaming data platforms (Pub/Sub, Kafka)
- Google Cloud Professional Data Engineer certification