You develop sustainable data solutions, from requirements analysis through implementation and go-live
You support the development of modern, high-quality data platforms — especially in cloud environments such as Azure, AWS, or GCP
You work daily with modern data management technologies (for example Databricks, Snowflake, or Microsoft Fabric) and continuously deepen your expertise in them
You implement ETL processes and optimize them for performance and scalability
You support clients in building complex processing pipelines and in deploying data services and data products
You work closely with a range of stakeholders, communicating in both German and English
You focus on the continuous improvement and automation of existing systems
Requirements
You have hands-on experience implementing cloud data platform projects, e.g., with Snowflake, Databricks, Microsoft Fabric, Amazon Redshift, or Google BigQuery
You have experience with various data management tools, such as dbt, Azure Data Factory, AWS Glue, or GCP Dataflow
You have expertise in SQL and experience with programming languages like Python
You have experience working with CI/CD pipelines
You enjoy continuous learning and are enthusiastic about new technologies and trends
You are a team player with excellent communication skills in both German and English
Tech Stack
Amazon Redshift
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Python
SQL
Benefits
Individual career development tailored to your personal interests and areas of focus
Active knowledge transfer and exchange through internal and external training programs, certifications, and workshops on the latest technologies
A dynamic, interdisciplinary team that is currently being built, where you can actively help shape the organization
We value work–life balance: flexible working hours, a centrally located office, flexible home office arrangements, and workation options
Work on a variety of client projects tackling innovative problems, taking a technology-agnostic approach and using modern tools