Responsibilities
Designing and building Big Data / Analytics / AI platforms, including DevOps workflows, Infrastructure as Code (IaC), CI/CD, Identity and Access Management (IAM), and monitoring & alerting
Designing, developing, and scaling data pipelines across various cloud architectures, incorporating data quality measures, and designing and implementing data models
Operationalizing statistical and ML/AI models and data scientists' results, making them available for production use and automating them
Selecting and advising clients on the pros and cons of different technology stacks, aligned with use cases and the company's context
Taking a leading role in client communication and leading engineering projects with technical responsibility for a development team and project deliverables
Building data engineering expertise within the company, e.g., by delivering internal training sessions, leading internal projects, and actively participating in the engineering community
Requirements
Degree in Computer Science, Data Engineering, Informatics, or another mathematical/scientific discipline, or equivalent qualification
Fluent communication skills in German and English, and willingness to travel as required for consulting work
Minimum 6 years of full-time professional experience as a Data Engineer in the design & implementation of data platforms and data management processes, or in a similar field
At least 5 years of hands-on experience with common programming languages such as Python, Scala, SQL and developer tools like Git
At least 4 years of implementation experience with Big Data technologies, particularly Spark / Kafka, distributed storage systems, open table formats, and cluster computing
At least 4 years' experience working with various cloud components (object storage, databases, messaging services, ETL/ELT engines, security, scheduling, operations, monitoring, etc.) in AWS / Azure / GCP environments
Experience with at least one of the following technologies: Databricks, Snowflake, AWS SageMaker, Azure Synapse, or similar data analytics platforms
Experience with IaC tools such as Terraform, AWS CDK, etc., as well as common CI/CD processes and tools
Nice to have: experience with UNIX systems and network configuration
Tech Stack
AWS
Azure
Cloud
ETL
Google Cloud Platform
Kafka
Python
Scala
Spark
SQL
Terraform
Unix
Benefits
Work-life balance with trust-based working hours and flexible scheduling: work fully remote or from one of our modern city-center offices (including a rooftop terrace). Workation: option to work from anywhere within the EU
Unique team atmosphere, flat hierarchies with direct access to our CEO Alex, and an open feedback culture; annual team workshops at our Data.Castle in the Zillertal; our Data.Musketeer principle in action: "one for all, all for one!"; our @Buddy program for better networking; regular professional and social events; dog-friendly offices
Intensive onboarding and training process, personal development plan and individual training opportunities; a wide range of workshops and training within the Data.Academy led by our experienced Data.Musketeers and external providers; career paths in leadership, project management and expert tracks
Childcare subsidy, company pension plan with 20% employer contribution, numerous corporate benefits & employee offers (e.g., for events and travel), starter credit in our internal merchandise shop, competitive salary with variable components
Mental health & wellbeing support including coaching and meditation via nilo.health; fitness and yoga rooms in the Munich office; regular employee surveys; EGYM Well Pass membership with Plus1 option; bicycle leasing via Jobrad after probation; internal groups for sports activities; free hot and cold beverages and fresh fruit in the office; rooftop terrace (barbecue)