Design and oversee the migration/optimization of our data lakehouse architecture using Databricks.
Serve as the escalation point for the platform, drawing on a proven record of resolving production issues to ensure 24/7 reliability of mission-critical pipelines.
Work directly with the SVP of Tech to translate product roadmaps into technical requirements.
Drive the adoption of Databricks Asset Bundles (DABs) to standardize deployments across dev, staging, and production.
Provide high-level guidance to engineering squads in India and the US, defining schemas and governance models for BigQuery and Databricks.
Requirements
Expert-Level Databricks: 4+ years of hands-on experience specifically within the Databricks ecosystem (Delta Lake, Unity Catalog, Photon).
The Stack: Deep proficiency in Spark (PySpark or Scala), Python, and SQL. Experience with GCP (BigQuery) is a major plus.
Independent Execution: Must be a self-starter capable of taking a high-level concept from the SVP and driving it to completion with minimal supervision.
Experience: 10+ years in Data Engineering, with at least 3 years in a Principal or Architect capacity.