Architect and evolve Kolomolo’s lakehouse-based data platform (Databricks, Delta Lake, cloud-native storage)
Define domain-oriented data products aligned with data mesh principles
Lead the design and rollout of data platform capabilities, including ingestion, transformation, cataloging, governance, observability, and lineage
Partner with business domain teams to enable self-service data access and stewardship
Design, build, and maintain data pipelines, data models, and transformation workflows on the Databricks platform (batch and streaming)
Lead the architecture and roadmap for the data platform and its evolution toward a federated, domain-driven model
Work with cloud services (e.g., AWS / Azure / GCP) and infrastructure tools (Terraform, CI/CD pipelines) to deploy and manage data platform components
Implement and optimise data models, schemas, and storage (e.g., Delta Lake) for structured and semi-structured datasets
Promote platform observability: logging, monitoring, cost optimisation, and data quality tracking
Collaborate closely with data scientists, analysts, BI teams and business stakeholders to translate business requirements into technical solutions
Ensure high data quality, governance, security and observability in all data artefacts and processes
Maintain high-quality documentation and data governance practices
Troubleshoot performance, scalability, and reliability issues in data pipelines, tuning Spark/Databricks job performance as required
Contribute to best practices around data engineering (e.g., code reviews, version control, documentation, test automation)
Mentor junior engineers and help build a culture of continuous improvement in data platform engineering
Requirements
7+ years of professional experience in data engineering and platform architecture, with at least 3 years of hands-on Databricks experience
Strong experience with Databricks (e.g., PySpark/Scala, Spark SQL, Delta Lake) in production environments
Proficiency in Python and SQL; experience with Scala or Java preferred
Deep understanding of cloud platforms (AWS/Azure/GCP) including data/analytics services, storage, compute, networking
Hands-on with infrastructure-as-code tools (e.g., Terraform), CI/CD pipelines and DevOps practices
Experience with both batch and streaming data processing and modern data architectures (e.g., lakehouse, Medallion architecture)
Solid problem-solving skills, attention to detail, and ability to work in a fast-moving, collaborative environment
Excellent communication skills and ability to interface with technical and non-technical stakeholders
Tech Stack
AWS
Azure
Google Cloud Platform
Java
PySpark
Python
Scala
Spark
SQL
Terraform
Benefits
Competitive salary and benefits
Career development opportunities in a growing tech company
Continuous learning culture: mentorship, internal training, and certifications
Flexible, agile work environment (remote from Poland or hybrid in Stockholm, Sweden)
Office perks: great coffee, tea, fresh fruit, snacks, and a fun atmosphere
Flat management structure, where your voice matters
Regular team events and a social, supportive work culture