Responsibilities
Own, optimize, and secure the Databricks environment, ensuring scalable, reliable, and well-monitored performance for data and ML workloads.
Enable and optimize data and ML workflows in Databricks, enhancing self-service capabilities and supporting Unity Catalog for governance and discovery.
Implement and maintain Databricks security, access controls, and compliance practices, ensuring least-privilege governance and alignment with enterprise standards.
Partner with data and ML teams to improve the Databricks platform, drive best practices and automation, and coordinate with DevOps on cloud infrastructure and integrations.
Requirements
5+ years of experience administering or engineering on Databricks, or in a closely related data platform role.
Strong working knowledge of Databricks workspace administration: cluster policies, permissions, Unity Catalog, Jobs/Workflows, and networking.
Familiarity with AWS services that underpin Databricks deployments (S3, IAM, VPC, EC2).
Experience with data pipeline and orchestration tools such as dbt, Airflow, Fivetran, or Delta Live Tables.
Proficiency in Python and SQL. Familiarity with Spark is expected.
Understanding of modern data governance and access control patterns: Unity Catalog, RBAC, data classification.
Tech Stack
Airflow
AWS
Cloud
EC2
Python
Spark
SQL
Unity Catalog
Benefits
Innovative Environment: Work with cutting-edge technologies and industry leaders in data engineering and AI.
Customer Impact: Make a real difference in how businesses leverage data for strategic decision-making.
Career Growth: Opportunities for professional development and career advancement.
Collaborative Culture: Join a supportive team that values collaboration and knowledge sharing.