Databricks is a leading data and AI company that provides a unified platform for data analytics and AI solutions. As a Sr. Solutions Engineer, you will partner with customers to design scalable data architectures and engage in complex technology discussions to drive the value of Databricks' platform throughout the sales lifecycle.
Responsibilities:
- Work with Sales and other essential partners to develop strategies for your assigned accounts that grow their usage of the platform
- Establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning
- Build and present reference architectures and demo applications that help prospects understand how Databricks can achieve their goals, landing new users and use cases
- Capture the technical win by consulting on big data architectures, data engineering pipelines, and data science/machine learning projects; prove out the Databricks technology for strategic customer projects; and validate integrations with cloud services and other third-party applications
- Become an expert in, and promote, Databricks-inspired open-source projects (Spark, Delta Lake, MLflow, and Koalas) across developer communities through meetups, conferences, and webinars
Requirements:
- 3+ years in a customer-facing pre-sales, technical architecture, or consulting role with expertise in at least one of the following technologies: Big data engineering (Ex: Spark, Hadoop, Kafka), Data Warehousing & ETL (Ex: SQL, OLTP/OLAP/DSS), Data Science and Machine Learning (Ex: pandas, scikit-learn, HPO), Data Applications (Ex: Logs Analysis, Threat Detection, Real-time Systems Monitoring, Risk Analysis and more)
- Experience translating a customer's business needs to technology solutions, including establishing buy-in with essential customer stakeholders at all levels of the business
- Experience designing, architecting, and presenting data systems for customers, and managing the delivery of those data architectures as production solutions
- Fluent in SQL and database technology
- Development and debugging experience in at least one of the following languages: Python, Scala, Java, or R
- Ability to travel to customers in your region up to 30% of the time
- Experience building solutions with public cloud providers such as AWS, Azure, or GCP
- Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research)