Responsibilities
Own, optimize, and secure the Databricks environment, ensuring scalable, reliable, and well-monitored performance for data and ML workloads.
Enable and optimize data and ML workflows in Databricks, enhancing self-service capabilities and supporting Unity Catalog for governance and discovery.
Implement and maintain Databricks security, access controls, and compliance practices, ensuring least-privilege governance and alignment with enterprise standards.
Partner with data and ML teams to improve the Databricks platform, drive best practices and automation, and coordinate with DevOps on cloud infrastructure and integrations.
Requirements
5+ years of experience administering or engineering on Databricks, or in a closely related data platform role.
Strong working knowledge of Databricks workspace administration: cluster policies, permissions, Unity Catalog, Jobs/Workflows, and networking.
Familiarity with AWS services that underpin Databricks deployments (S3, IAM, VPC, EC2).
Experience with data pipeline and orchestration tools such as dbt, Airflow, Fivetran, or Delta Live Tables.
Proficiency in Python and SQL. Familiarity with Spark is expected.
Understanding of modern data governance and access control patterns: Unity Catalog, RBAC, data classification.
Preferred: Experience in financial services, investment management, or other highly regulated industries.
Preferred: Experience with Snowflake and AWS engineering.
Preferred: Hands-on experience with MLflow, model serving, or feature store capabilities within Databricks.
Preferred: Familiarity with IaC tools (Terraform) for managing Databricks resources programmatically.
Preferred: Hands-on experience with AI engineering and LLMOps tools: LLM observability, eval pipelines, building and supporting agentic workflows.
Preferred: Experience with cost optimization strategies for Databricks (cluster policies, spot instances, Photon, serverless SQL).
Preferred: Knowledge of distributed compute frameworks (Spark, Ray, Dask).
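To make the cluster-policy and cost-optimization expectations above concrete, here is a minimal sketch of a Databricks cluster policy definition of the kind this role would own. The specific node types, limits, and default values are illustrative assumptions, not prescribed by this posting.

```json
{
  "autotermination_minutes": {
    "type": "range",
    "minValue": 10,
    "maxValue": 60,
    "defaultValue": 30
  },
  "aws_attributes.availability": {
    "type": "fixed",
    "value": "SPOT_WITH_FALLBACK"
  },
  "node_type_id": {
    "type": "allowlist",
    "values": ["m5.xlarge", "m5.2xlarge"]
  }
}
```

A policy like this enforces auto-termination, prefers spot instances with on-demand fallback, and restricts instance types, which is one common way the cost controls listed above are implemented in practice.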
Tech Stack
Airflow
AWS
Cloud
EC2
Python
Ray
Spark
SQL
Terraform
Unity Catalog
Benefits
Innovative Environment: Work with cutting-edge technologies and industry leaders in data engineering and AI.
Customer Impact: Make a real difference in how businesses leverage data for strategic decision-making.
Career Growth: Opportunities for professional development and career advancement.
Collaborative Culture: Join a supportive team that values collaboration and knowledge sharing.