Responsibilities
Design and evolve the architecture of the data platform and storage systems.
Build reliable ETL/ELT processes and scalable data pipelines that deliver data into a centralized analytical warehouse.
Maintain and further develop the Data Warehouse.
Collaborate with engineering and analytics teams on system design and architectural decisions.
Ensure data governance and maintain high standards of data quality.
Write and optimize queries for MongoDB and ClickHouse.
Manage and maintain workflows in Airflow.
Requirements
7+ years of experience as a Data Engineer.
5+ years of experience with Python, TypeScript, Node.js, and Kafka.
Strong understanding of data processing algorithms and principles.
Hands-on experience with ClickHouse, Airflow, Amazon S3, Git, and Docker.
Solid understanding of Data Lake and Data Warehouse architectures.
Experience working with large-scale data and query optimization.
Ability to work collaboratively toward shared goals.
Strong sense of ownership, responsibility, and proactivity.
English level: B1+.
Tech Stack
Airflow
Docker
ETL
JavaScript
Kafka
MongoDB
Node.js
Python
TypeScript
Benefits
Private medical insurance for the employee and their family
22 paid vacation days per year
Up to 14 paid public holidays per year
5 company-paid sick leave days
English learning courses
Relevant professional education
Gym or swimming pool
Home office setup assistance: the company helps with purchasing furniture (office chair, desk, monitor) and other items to create a comfortable workspace
Co-working space
€50 monthly allowance to cover internet and mobile phone expenses