GameSquare Holdings Inc. is a cutting-edge media, entertainment, and technology company transforming how brands and publishers connect with Gen Z, Gen Alpha, and Millennial audiences. We are seeking a Data Engineer to build and operate TubeBuddy's ETL/ELT pipelines and lakehouse/warehouse models, ensure data quality, and collaborate with cross-functional teams.
Responsibilities:
- Design, build, and maintain ETL/ELT batch pipelines landing raw data into our S3 data lake and promoting curated datasets into Databricks
- Implement reliable backfills and reprocessing workflows to keep historical data correct
- Build and maintain dbt models (staging through marts) with clean layering, documentation, and automated tests
- Partner with stakeholders to define canonical metrics and ensure consistent definitions across reporting
- Own data quality expectations (freshness, completeness, and correctness) for core datasets
- Partner with Engineering, Product, and Analytics to deliver high-quality datasets for reporting, experimentation, and analysis
- Support product analytics event data in the warehouse (Segment → Databricks), including identity resolution, joins, and schema stability
- Contribute to product-facing engineering work when needed, including light backend and application development support
Requirements:
- 3–5+ years of data engineering experience shipping production pipelines
- Bachelor's or Master's degree in computer science, statistics, information systems, or a related technical field
- Strong programming experience with Python and SQL
- Hands-on experience with cloud storage and lakehouse/warehouse patterns (we use S3 and Databricks)
- Strong experience with dbt for transformations, testing, and documentation
- Proven ability to operate pipelines in production, including backfills, reprocessing, and incident response
- Ability to turn data requirements from stakeholders into actionable plans
- Experience collaborating cross-functionally and being accountable in a small team
- Comfortable using AI-assisted tooling to accelerate development, paired with strong habits around validation, testing, and documentation
- Familiarity with product/event analytics data pipelines (Segment → warehouse) and modeling event schemas for analysis
- Experience working in a multi-cloud environment (AWS + Azure)
- Experience with orchestration and operational tooling (scheduling, retries, SLAs)
- JavaScript/TypeScript familiarity (used in various customer-facing aspects of our products)
- Comfort participating in production support rotations or incident response when needed