Design and evolve the architecture of the data platform and storage systems
Build reliable ETL/ELT processes and scalable data pipelines that deliver data into a centralized analytical warehouse
Maintain and further develop the Data Warehouse
Collaborate with engineering and analytics teams on system design and architectural decisions
Ensure data governance and maintain high standards of data quality
Write and optimize queries for MongoDB and ClickHouse
Manage and maintain workflows in Airflow
Requirements
7+ years of experience as a Data Engineer
5+ years of experience with Python, TypeScript, Node.js, and Kafka
Strong understanding of data processing algorithms and principles
Hands-on experience with ClickHouse, Airflow, Amazon S3, Git, and Docker
Solid understanding of Data Lake and Data Warehouse architectures
Experience working with large-scale data and query optimization
Ability to work collaboratively toward shared goals
Strong sense of ownership, responsibility, and proactivity
English level: B1+
Tech Stack
Airflow
Docker
ETL
JavaScript
Kafka
MongoDB
Node.js
Python
TypeScript
Benefits
31 days off
100% paid telemedicine plan
Home office setup assistance: the company helps with purchasing an office chair, desk, monitor, and other items to create a comfortable workspace