Design, develop, and maintain scalable data pipelines, ETL/ELT processes, and data integration solutions with moderate independence.
Translate business requirements into technical designs and contribute to solution architecture discussions.
Build and enhance business-to-business (B2B) integrations and internal data processing workflows.
Diagnose and resolve data quality issues, pipeline failures, and performance bottlenecks; propose optimizations.
Develop clear documentation for data models, workflows, and engineering solutions.
Collaborate with cross-functional teams including data architects, analysts, application developers, and business stakeholders to ensure effective data delivery.
Contribute to project workstreams, ensuring tasks are delivered with quality and within timelines.
Adhere to and promote engineering best practices, coding standards, and data governance guidelines.
Requirements
5+ years of experience in similar roles.
Strong programming experience in Python and/or Java.
Advanced SQL experience.
4+ years of experience with Databricks.
Experience working in an Agile environment.
Tech Stack
ETL
Java
Python
SQL
Benefits
Christmas bonus of 30 days' salary
40% vacation premium
12 vacation days plus 2 floating days, your birthday, and sick days