Role Overview
Engage with several Snowflake customers.
Lead technical streams of client data platform implementations and onboarding efforts.
Work with other Customer Engagement and Delivery matrix resources to ensure proper technical guidance, project management, and functional support.
Collaborate as needed with other Snowflake organisations: Engineering, Support, Sales, and Marketing.
Help clients troubleshoot their implementations and integrate the product within their ecosystem.
Identify, document, triage and track issues to ensure resolution.
Actively contribute to the growth and scalability of the Customer Delivery and Training and Education teams through robust documentation, continuous process optimization, and capability cross-training.
Apply industry and technology expertise to client implementations.
Gather product feedback and recommendations from customers to inform the design of new features and capabilities.
Requirements
5+ years of experience in a customer-facing technical role.
Strong track record of building best-in-class cloud data platforms, including:
Distributed systems and massively parallel processing technologies and concepts, such as Snowflake, Teradata, Spark, Databricks, Hadoop, Oracle, SQL Server, and performance optimisation.
Data strategies and methodologies, such as Data Mesh, Data Vault, Data Fabric, Data Governance, Data Management, and Enterprise Architecture.
Data organisation and modeling concepts and techniques, such as Data Lake, Data Warehouse, Medallion architecture, Kimball dimensional modeling, and 3NF database normalisation.
Infrastructure concepts, such as the cloud hyperscalers (e.g. AWS and Azure) and the fundamentals of IaaS, PaaS, networking, security, encryption, identity and access management, and disaster recovery planning.
Data engineering concepts and frameworks, such as batch processing, stream processing, replication, SQL, dbt, Talend, Informatica, Python, Snowpark, PySpark, DataFrames, storage formats (e.g. Parquet, Avro, Apache Iceberg, Delta Lake), orchestration, and DevOps.
Business intelligence and analytics solutions, such as Tableau, PowerBI, MicroStrategy, ThoughtSpot, SAS, and Streamlit, and techniques such as time series analysis, advanced SQL, and statistical analysis (e.g. linear regression, variance analysis, modeling, and forecasting).
Fundamental understanding of AI/ML concepts, such as classification, regression, clustering, dimensionality reduction, Natural Language Processing, and Language Models.
Tech Stack
Apache
AWS
Azure
Cloud
Distributed Systems
Hadoop
Informatica
Oracle
PySpark
Python
Spark
SQL
Tableau
Vault
Benefits
Private medical and insurance plans, in markets where this is a customary employment benefit.
Strong private pension plans.
Very attractive parental leave, and support with adoption, surrogacy, pregnancy, and infant care.
Ample opportunity to change the future of data analytics by joining us.
Competitive salary plus equity on top.
Strong emphasis on work-life balance, with annual vacation at the top of the market.
A culture of helping each other stay on top of the game and up to date with the latest features in the Snowflake AI Data Cloud.