Tags: Azure, Cloud, Docker, PySpark, Python, SQL, ELT, Data Lake, Databricks, Serverless, Azure Functions, App Service
Role Overview
Design and implement modern data platforms (Data Lake/Lakehouse/Warehouse) on Microsoft Fabric and/or Azure, and bring them to production-grade quality.
Build and run complex pipelines (batch/near-real-time) and integrate data from various sources reliably; a minimal pipeline sketch follows this list.
Model data and ensure production readiness: quality, performance, security, and cost awareness.
Build Azure foundations for data workloads: storage solutions, identity & access, networking principles & architectures, and secure-by-default execution (see the storage-access sketch after this list).
Package components as containers (Docker) and run them on the best-fit Azure runtime.
Use serverless and PaaS patterns when they make sense.
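To give a flavour of the pipeline work, here is a minimal PySpark batch ELT sketch: read raw JSON from a lake path, apply a basic quality gate, and append to a curated Delta table. The storage paths and the order_id column are hypothetical, and the Delta output assumes a Databricks/Fabric-style runtime where Delta Lake is preconfigured.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_elt").getOrCreate()

    # Hypothetical raw zone in ADLS Gen2; adjust to your lake layout.
    raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

    cleaned = (
        raw.filter(F.col("order_id").isNotNull())      # basic quality gate
           .dropDuplicates(["order_id"])               # tolerate re-delivered records
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Append to a curated Delta table (assumes a Delta-enabled runtime).
    cleaned.write.format("delta").mode("append").save(
        "abfss://curated@examplelake.dfs.core.windows.net/orders/"
    )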
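And a sketch of the secure-by-default storage access mentioned above: DefaultAzureCredential resolves to a managed identity in Azure (or a developer login locally), so no secrets live in code. The account and container names are again hypothetical.

    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    # No connection strings or account keys: the credential chain resolves
    # to a managed identity in Azure, or to your local az login.
    credential = DefaultAzureCredential()
    service = BlobServiceClient(
        account_url="https://examplelake.blob.core.windows.net",  # hypothetical account
        credential=credential,
    )
    # List raw landing files for a given prefix.
    for blob in service.get_container_client("raw").list_blobs(name_starts_with="orders/"):
        print(blob.name)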
Requirements
At least two years of experience in the topics listed above.
Fluency in Finnish (mandatory) and English.
Strong Python (and/or PySpark) skills and solid cloud development practices.
Strong SQL and understanding of modern warehousing/lakehouse concepts and patterns.
Hands-on experience with integrations and data pipelines (ELT), as well as with APIs (REST patterns, auth, pagination, webhooks); a paginated-extraction sketch follows this list.
Experience with Microsoft Fabric and/or Azure Databricks implementations and development.
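As an illustration of the API side of the requirements, here is a minimal sketch of paginated REST extraction with bearer-token auth, using the requests library. The endpoint shape (an items array plus a next link) is a hypothetical API, not any specific product.

    import requests

    def fetch_all(base_url: str, token: str):
        """Yield every record from a paginated REST endpoint (hypothetical API)."""
        headers = {"Authorization": f"Bearer {token}"}
        url = f"{base_url}/items"
        while url:
            resp = requests.get(url, headers=headers, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            yield from payload["items"]
            url = payload.get("next")  # assumed 'next' link; None on the last page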