Lead the technical evolution of our data platform on an 8-month fixed-term contract.
Own the design and implementation of our Databricks Lakehouse.
Drive the migration of complex legacy SQL into clean, governed pipelines.
Build the semantic layer that powers self-service analytics in Omni.
Design and implement the Unity Catalog structure.
Lead the migration of complex business logic from legacy systems.
Architect our internal transformation framework using open-source tooling.
Serve as the resident query performance expert.
Govern our Databricks compute footprint, applying optimizations such as cluster right-sizing, auto-termination, and efficient SQL Warehouse configuration to control cost.
Build and maintain CI/CD pipelines to automate testing, validation, and deployment of data models.
Occasionally step into a BI Developer role, building executive-level dashboards.
Requirements
8+ years in Data Engineering or Data Architecture, with deep, hands-on experience in the Databricks ecosystem (Unity Catalog, Delta Lake, SQL Warehouses).
Expert-level SQL and distributed computing skills.
Demonstrated experience in query optimization and data platform migration.
Proven experience building data transformation workflows using Delta Live Tables or custom Python/SQL Spark pipelines.
Strong command of dimensional modeling (Star and Snowflake schemas).
Hands-on Unity Catalog experience, including permissions management.
Fluency with Delta Lake internals.
Hands-on experience with Omni or a comparable modern BI semantic layer.
Experience with CI/CD pipelines and version control workflows.