Lead and drive client working sessions focused on data architecture and data mapping, bringing clarity and direction to complex discussions with senior stakeholders.
Design and own end-to-end Databricks Lakehouse architecture for marketing data ingestion, transformation, storage, and consumption—leveraging Unity Catalog, Delta Lake, Delta Live Tables, and Databricks SQL Warehouse.
Design integrations between Databricks and martech platforms (Braze, Segment, CDPs) to enable customer 360 views, churn prediction, media mix modeling, dynamic pricing, and lifetime value analysis.
Design and implement robust data models—including medallion architecture, Star Schema, and Data Vault—optimized for marketing analytics workloads.
Enable marketers to query data conversationally and generate actionable insights without SQL through Databricks Genie and AI-powered analytics.
Support the design and implementation of AI agents using Agent Bricks and the Mosaic AI Agent Framework.
Review and interpret technical capability maps and frameworks, connecting them to data architecture and design decisions.
Architect data governance, security, and access control policies using Unity Catalog, including lineage tracking, audit logging, PII masking, and marketing data privacy compliance.
Integrate Databricks workflows into CI/CD pipelines using Terraform, Git, and Azure DevOps; implement infrastructure-as-code and automated deployment for notebooks, jobs, and clusters.
Partner with clients on roadmaps, POCs, and migrations; support pre-sales efforts with technical proposals, whiteboarding sessions, and project estimation.
Manage your time to meet billable targets of 36+ hours per week.
Travel up to 20% for client meetings, workshops, and onsite engagements.
Requirements
8+ years of experience in data engineering or data architecture.
3+ years of hands-on Databricks experience, including PySpark, SQL, and Scala.
Proven experience integrating Databricks with Braze or similar martech platforms for marketing analytics (Salesforce, HubSpot, Responsys, Adobe).
Deep expertise in Delta Lake, Unity Catalog, Delta Live Tables, structured streaming, and MLflow.
Strong data modeling skills across medallion architecture, Star Schema, and Data Vault.
Cloud platform experience across AWS, Azure, or GCP, with proficiency in IaC tooling such as Terraform and ARM templates.
Track record of leading and mentoring engineering teams in consulting environments.
Client-facing experience within an agency or consultancy required.
Must be eligible to work in the United States or Canada without visa sponsorship, now or in the future.
Certifications
Databricks Data Engineer Professional and/or Platform Architect accreditation (required or in progress).
Braze certifications are a plus—we develop platform expertise internally.
Tech Stack
AWS
Azure
Google Cloud Platform
PySpark
Scala
SQL
Terraform
Unity Catalog
Data Vault
Benefits
Medical, dental, vision, and life insurance
401k with company match
Flexible PTO policy
Monthly tech stipend
Paid parental leave
In-person onboarding experience at our HQ in Indianapolis, Indiana