Snowflake is a company focused on empowering enterprises to achieve their full potential through innovative technology. They are seeking a Data Engineering Specialist to provide technical leadership in designing and architecting the Snowflake Cloud Data Platform, working closely with sales teams and customers to demonstrate the platform's capabilities.
Responsibilities:
- Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
- Work hands-on with prospects and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation
- Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing
Requirements:
- 10+ years of architecture and data engineering experience within the Enterprise Data space
- 5+ years of experience within a pre-sales environment (Sales Engineer, Solutions Engineer, Solutions Architect, etc.)
- Outstanding presentation skills to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
- Ability to connect a customer's specific business problems to Snowflake's solutions
- Ability to conduct deep discovery of a customer's architecture framework and map it to Snowflake's data architecture
- Broad experience with large-scale database and/or data warehouse technologies, ETL, analytics, and cloud technologies, including patterns such as Data Lake, Data Mesh, and Data Fabric
- Hands-on development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies
- Deep understanding of data integration services and tools for building ETL and ELT data pipelines such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica
- Familiarity with streaming technologies (e.g., Kafka, Flink, Spark Streaming, Kinesis) and real-time or near-real-time use cases (e.g., CDC)
- Experience designing interoperable data lakehouse architectures and working with Apache Iceberg, Delta Lake, and Parquet
- Strong architectural expertise in data engineering, with the confidence to present and demo to business executives and technical audiences and to handle impromptu questions effectively
- Bachelor's Degree required
- Master's Degree in computer science, engineering, mathematics, or a related field, or equivalent experience, preferred