Derevo is a company that empowers organizations by unlocking the value of their data and enhancing the talent that transforms it. It is seeking a Senior Fabric Data Engineer to design, build, and optimize enterprise-grade data solutions using Microsoft Fabric, with a focus on Lakehouse architecture and real-time analytics.
Responsibilities:
- Design and develop Lakehouse architectures using Microsoft Fabric, including OneLake, Delta tables, and the Medallion architecture
- Build Dataflows Gen2, Notebooks (PySpark / SQL), and Pipelines for data ingestion, transformation, and orchestration
- Implement end-to-end data pipelines using Data Factory (Fabric), Spark notebooks, and streaming workloads
- Develop semantic models and governed datasets to support enterprise analytics
- Collaborate with Power BI developers to design DAX measures, relationships, and optimized data models
- Implement Git integration and deployment pipelines for Fabric workloads
Requirements:
- 7–10+ years of experience in Data Engineering
- Strong hands-on experience with Microsoft Fabric, including at least one production implementation
- Data Engineering & Lakehouse Development: Experience designing and developing Lakehouse architectures using Microsoft Fabric, including OneLake, Delta tables, and the Medallion architecture
- Data Integration & Orchestration: Experience building Dataflows Gen2, Notebooks (PySpark / SQL), and Pipelines for data ingestion, transformation, and orchestration
- ETL / ELT Pipelines: Experience implementing end-to-end data pipelines using Data Factory (Fabric), Spark notebooks, and streaming workloads
- Languages: SQL (Advanced/Expert) and PySpark / Spark SQL
- Architecture Design: Experience designing high-performance data pipelines for batch and real-time workloads
- Analytics & Semantic Modeling: Experience developing semantic models and governed datasets to support enterprise analytics
- Power BI Integration: Experience collaborating with Power BI developers on DAX measures, relationships, and optimized data models
- CI/CD & DevOps: Experience implementing Git integration and deployment pipelines for Fabric workloads
- Advanced English (required)
- Experience working with cross-functional teams, including Data Scientists, BI Engineers, Domain Owners, and Business Stakeholders
- Ability to mentor junior engineers and contribute to engineering best practices
- Experience supporting modern data platform initiatives and contributing to Centers of Excellence, reusable frameworks, and architectural standards
- Real-Time Analytics: Experience with Eventstream, Real-Time Hub, or KQL
- Data Governance: Experience implementing data governance frameworks, lineage, cataloging, and metadata management using tools such as Purview
- Experience with the Azure Data Platform ecosystem (ADLS, Azure Data Factory, Synapse, Databricks)
- Familiarity with Agile delivery environments and product-oriented data teams
- Exposure to Machine Learning pipelines or ML integration patterns