Data Engineer at workidentity GmbH | JobVerse
workidentity GmbH
Remote
Data Engineer
Germany
Full Time
2 hours ago
€70,000 – €90,000
No Sponsorship
Apply Now
Key skills
Analytics
Apache Airflow
AWS
Azure
BI
Cloud
ETL
Google Cloud
Hadoop
IoT
Kafka
Power BI
Python
Snowflake
Spark
SQL
TypeScript
About this role
Role Overview
Build and maintain efficient, scalable data pipelines to ingest, process and store data from various sources.
Ensure data quality, integrity and availability.
Design and implement database schemas and structures that enable optimal storage, querying and processing of data.
Develop, manage and optimize ETL processes to extract, transform and load data into target systems.
Ensure timely and accurate delivery of data.
Use Snowflake and Azure Data Factory to manage cloud-based data warehouse and data integration processes.
Ensure efficient data storage and accessibility.
Integrate IoT data using Azure IoT Hub to enable real-time data processing and analytics.
Implement and manage data workflows with Apache Airflow to automate and orchestrate ETL processes and other data-related tasks.
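The extract–transform–load responsibilities above can be sketched in plain Python. This is an illustrative stand-in only, not the company's actual pipeline: the table `readings` and fields `device_id`/`temp_c` are hypothetical, and a production version would orchestrate such steps with Apache Airflow and load into Snowflake rather than an in-memory sqlite3 database.

```python
# Hypothetical ETL sketch: stdlib-only stand-in for the Snowflake /
# Azure Data Factory / Airflow stack named in the role overview.
import json
import sqlite3


def extract(raw_records):
    """Extract: parse raw JSON strings from an upstream source."""
    return [json.loads(r) for r in raw_records]


def transform(records):
    """Transform: keep only valid readings and normalise the fields."""
    return [
        (rec["device_id"], float(rec["temp_c"]))
        for rec in records
        if "device_id" in rec and "temp_c" in rec
    ]


def load(rows, conn):
    """Load: insert the cleaned rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (device_id TEXT, temp_c REAL)"
    )
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    raw = ['{"device_id": "a1", "temp_c": 21.5}', '{"malformed": true}']
    conn = sqlite3.connect(":memory:")
    load(transform(extract(raw)), conn)
    print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])
```

In an Airflow deployment each of these functions would typically become a task in a DAG, with the scheduler handling retries and ordering.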
Requirements
Bachelor's degree in Computer Science, Information Technology, Engineering or a related field.
At least 5 years of experience as a Data Engineer or in a comparable role.
Solid experience with data pipeline tools and techniques.
Excellent knowledge of Snowflake and Azure Data Factory.
Experience integrating IoT data via Azure IoT Hub.
Hands-on experience with Apache Airflow for workflow automation.
Expert knowledge of ETL processes and related tools.
Strong SQL skills and experience with relational databases.
Familiarity with data modeling principles and best practices.
Experience with data visualization tools, particularly Microsoft Power BI.
Knowledge of cloud computing concepts, especially Microsoft Azure.
Experience with other cloud platforms such as AWS or Google Cloud.
Familiarity with big data technologies like Hadoop, Spark or Kafka.
Experience with Python or other scripting languages.
Knowledge of data governance and data quality frameworks.
Excellent German and English language skills required.
Benefits
Permanent full-time position (37.5 hours/week)
Flexible working hours
Hybrid remote-work arrangement (2–4 days working from home per week possible)
30+ days of vacation
Internal and external training and development opportunities
Holiday pay, bonus payments, company pension scheme (BAV) & capital-forming benefits (VWL)
On-site gym, subsidized canteen, commuting allowance, company bike (JobRad), employee & team events