Responsibilities
Own the design and implementation of our Databricks Lakehouse
Drive the migration of complex legacy SQL into clean, governed pipelines
Build the semantic layer that powers self-service analytics
Design and implement the Unity Catalog structure
Lead the migration of complex business logic from legacy systems into the Lakehouse
Architect an internal transformation framework using open-source tooling
Analyze Spark execution plans and diagnose bottlenecks
Build and maintain CI/CD pipelines to automate testing and deployment
Design data models optimized for self-service reporting
Partner with cross-functional stakeholders to translate business questions into scalable data solutions
Requirements
8+ years in Data Engineering or Data Architecture, with deep, hands-on experience in the Databricks ecosystem (Unity Catalog, Delta Lake, SQL Warehouses)
Expert-level SQL and distributed computing skills
Demonstrated experience in query optimization and data platform migration
Proven experience building data transformation workflows
Strong command of dimensional modeling (star and snowflake schemas)
Hands-on Unity Catalog experience
Fluency with Delta Lake internals
Hands-on experience with Omni or a comparable modern BI semantic layer
Experience with CI/CD pipelines and version control workflows