Databricks is the data and AI company, and it is seeking a Specialist Solutions Architect - Data Warehousing to guide customers through their cloud data warehousing transformations. This role provides technical leadership for cloud transformations, assists with architecture design and production deployment, and serves as a technical expert in data warehousing evaluations and workload migrations.
Responsibilities:
- Provide technical leadership to guide strategic customers to successful cloud transformations on large-scale data warehousing workloads - ranging from evaluation to architecture design to production deployment
- Prove the value of the Databricks Intelligence Platform for customer workloads by architecting production workloads, including end-to-end pipeline load performance testing and optimization
- Become a technical expert in an area such as data warehousing evaluations or setting up successful workload migrations
- Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing and performance, and tuning workloads for production
- Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
- Contribute to the Databricks Community
Requirements:
- 5+ years of experience in a technical role with expertise in data warehousing - such as query tuning, performance tuning, troubleshooting, data governance, debugging MPP data warehouses or other big data solutions, or migrating workloads from EDW or other systems
- Experience with design and implementation of data warehousing technologies including relational databases, SQL, data analytics, NoSQL, MPP, OLTP, and OLAP
- Deep specialty expertise in at least one of the following areas:
  - Scaling large analytical data workloads in the cloud so they are performant and cost-effective
  - Maintaining, extending, or migrating a production data warehouse system to evolve with complex needs, including data modeling, data governance, and integration with business intelligence tools
  - Migrating on-premises EDW workloads to the public cloud
- Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent practical experience
- Production programming experience in SQL and Python, Scala, or Java
- Experience with the AWS, Azure, or GCP clouds
- 2+ years of professional experience with data warehousing and big data technologies (e.g., SQL, Redshift, SAP, Synapse, EMR, OLAP and OLTP workloads)
- 2+ years of customer-facing experience in a pre-sales or post-sales role
- Can meet expectations for technical training and role-specific outcomes within 6 months of hire
- Can travel up to 30% when needed