Databricks is a leading data and AI company that helps organizations unify and democratize data, analytics, and AI. As a Sr. Specialist Solutions Engineer in Data Warehousing, you will guide customers through their cloud data warehousing transformations, providing technical leadership and expertise in large-scale data warehousing technologies and lakehouse architecture.
Responsibilities:
- Provide technical leadership to guide strategic customers to successful cloud transformations of large-scale data warehousing workloads, ranging from evaluation to architecture design to production deployment
- Prove the value of the Databricks Intelligence Platform for customer workloads by architecting production workloads, including end-to-end pipeline load performance testing and optimization
- Become a technical expert in an area such as competitive data warehousing evaluations or successful workload migrations
- Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing and performance, and tuning workloads for production
- Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
- Contribute to the Databricks Community
Requirements:
- 5+ years of experience in a technical role with expertise in data warehousing, such as query tuning, performance tuning, troubleshooting, data governance, debugging MPP data warehouses or other big data solutions, or migrating workloads from EDW systems
- Experience with design and implementation of data warehousing technologies including relational databases, SQL, data analytics, NoSQL, MPP, OLTP, and OLAP
- Deep specialty expertise in at least one of the following areas:
  - Scaling large analytical data workloads in the cloud so they are performant and cost-effective
  - Maintaining, extending, or migrating a production data warehouse system to evolve with complex needs, including data modeling, data governance, and integration with business intelligence tools
  - Successfully migrating on-premises EDW workloads to the public cloud
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent practical work experience
- Production programming experience in SQL and Python, Scala, or Java
- Experience with AWS, Azure, or GCP is highly desirable
- 3 years of professional experience with data warehousing and big data technologies (e.g., SQL, Redshift, SAP, Synapse, EMR, OLAP and OLTP workloads)
- 3 years of customer-facing experience in a pre-sales or post-sales role
- Can meet expectations for technical training and role-specific outcomes within 6 months of hire