Role Overview
Participate in the design and implementation of Data Warehousing solutions, including relational and dimensional modeling.
Work on ETL/ELT processes to reliably and efficiently structure and transform data.
Use and optimize SQL, NoSQL and Big Data environments (Hadoop, Spark) to meet business requirements.
Develop and maintain models that comply with Data Governance standards, including metadata, normalization and dimensional schemas.
Analyze, structure and formalize requirements to ensure alignment between business goals and delivered data models.
Requirements
Master’s degree (Bac+5) in Data Engineering, Computer Science, Business Intelligence, Statistics/Data Science, or a related field focused on data modeling and management.
Design and develop data models suitable for Data Lakes and Data Warehouses.
Define data structures, schemas and relationships between entities to ensure a robust and scalable architecture.
Optimize models to ensure performance, scalability and quality of analytical processing.
Collaborate with business teams to understand their needs and translate them into relevant data models.
Ensure data quality, integrity and consistency across analytical pipelines.
Strong command of SQL, Python, and R, as well as visualization tools such as Tableau and Power BI.
Proven experience with ETL solutions and with data management and governance platforms such as Informatica, Collibra, and Talend.
Familiarity with Big Data environments, notably Hadoop and Spark.
Tech Stack
ETL
Hadoop
Informatica
NoSQL
Python
Spark
SQL
Tableau
Benefits
Our commitment and priorities: Capgemini promotes an inclusive culture within a multicultural and disability-friendly environment. By joining us, you become part of a team that values diversity, develops its talent, engages in solidarity initiatives with its partners, and works to reduce its environmental impact across all of its sites and with its clients.