Own the architecture, delivery, and adoption of a governed, reusable Data → Knowledge → Skills stack that powers analytics, AI agents, and real-time decisioning across the company.
Design and scale the semantic and operational layers on top of our Data Lakehouse (Databricks), enforce strong governance and lineage, and productize capabilities so that both human analysts and machine agents can reason and act consistently.
Ensure the platform supports rapid experimentation, intelligent automation, and high-quality decision-making, transforming raw data into durable knowledge and deployable skills across the organization.
Partner closely with Data Science, AI Engineering, Product, and Risk teams to identify and close gaps that block effective use of data.
Requirements
15+ years in data or platform engineering, with 5+ years leading engineering teams in complex, multi-region environments.
Proven experience architecting and scaling modern data platforms that extend beyond storage to include semantic layers, metadata systems, and knowledge graphs.
Strong hands-on expertise in data modeling (transactional, event-driven, dimensional, feature tables) with robust data quality, lineage, and observability practices.
Experience designing and operationalizing a governed semantic layer with standardized, auditable business metric definitions.
Track record of building reusable, programmatic capabilities on top of data platforms (APIs, orchestration frameworks, analytical workflows, agent-callable tools).
Deep familiarity with modern lakehouse architectures, particularly within a Databricks environment.
Experience implementing regional data localization and residency strategies while maintaining a unified global data model.
Strong understanding of data governance, privacy, and regulatory controls in multi-jurisdictional contexts.
Platform-as-product mindset, with experience driving adoption, improving developer experience, and operating with clear SLAs/SLOs and measurable impact.