Sogeti is part of the Capgemini Group, focused on delivering innovative solutions through technology. They are seeking an experienced AWS Snowflake Data Engineer to design, develop, and optimize scalable data pipelines, with a primary focus on migrating Oracle-based data systems to Snowflake on AWS.
Responsibilities:
- Design, build, and maintain scalable data pipelines on AWS and Snowflake
- Lead and execute migration projects from Oracle to Snowflake
- Develop ETL/ELT processes using Python, AWS Glue, or similar frameworks
- Optimize Snowflake performance through query tuning, warehouse sizing, and data partitioning
- Collaborate with data analysts, architects, and stakeholders to ensure high-quality, reliable data delivery
- Implement best practices for data security, governance, and automation in AWS
- Monitor and troubleshoot data workloads, resolving performance and reliability issues
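Pipelines like the ones described above commonly stage Oracle extracts in S3 and bulk-load them into Snowflake with a COPY INTO statement. A minimal sketch in Python (the table, stage, and file-format names are hypothetical placeholders, and connection details are elided):

```python
# Sketch of an S3-to-Snowflake load step, assuming data has already been
# extracted from Oracle and staged as CSV files in an external S3 stage.
# All object names below are hypothetical.

def build_copy_statement(table: str, stage: str, file_format: str) -> str:
    """Build a Snowflake COPY INTO statement for bulk-loading staged files."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

sql = build_copy_statement("ANALYTICS.ORDERS", "oracle_extract_stage", "csv_format")
print(sql)

# With the snowflake-connector-python package, the statement would be
# executed roughly like this (credentials omitted):
#   import snowflake.connector
#   conn = snowflake.connector.connect(account=..., user=..., password=...)
#   conn.cursor().execute(sql)
```

In production this step would typically run inside an AWS Glue job or an Airflow task, with the statement parameterized per source table.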
Requirements:
- 5+ years of experience in data engineering or related roles
- Strong hands-on experience with AWS (S3, Glue, Lambda, EC2, IAM)
- Proven experience with Snowflake architecture, development, and optimization
- Proficiency in Python for data processing, automation, and integration
- Demonstrated experience with Oracle database migrations to Snowflake on AWS
- Strong SQL skills with focus on performance tuning
- Familiarity with CI/CD workflows and version control (Git)
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect Certification
- Snowflake SnowPro Certification
- Experience with Infrastructure as Code (Terraform or CloudFormation)
- Knowledge of Airflow or similar orchestration tools