Databricks is building the world's best and most secure platform for data and AI. As a member of the Trust and Safety Data Science team, you will develop and implement machine learning models to detect anomalous activity and collaborate with various teams to enhance security and compliance for the Databricks platform.
Responsibilities:
- You will develop and implement machine learning models to detect anomalous activity in the products we offer
- You will analyze the performance and pricing of security-related features and work with product and engineering teams to identify high-impact opportunities
- You will collaborate with security engineers, trust and safety experts, and machine learning engineers to build a variety of systems and tools that protect Databricks and our customers from threats
- You will create solutions and frameworks to meet compliance requirements at Databricks
- You will gather requirements, define project OKRs and milestones, and communicate progress to both technical and non-technical audiences
- You will guide junior data scientists and interns on the team by helping with project planning, technical decisions, and code and document review
- You will represent the data science discipline throughout the organization, advocating for data-driven decision-making
- You will represent Databricks at academic and industrial conferences and events