Design and implement data models and data architecture solutions to support scalable analytics platforms
Develop and maintain ELT pipelines, data integration workflows, and data migration processes, particularly using Azure Data Factory
Design and implement data quality systems and governance processes
Support DevOps practices, including CI/CD deployments for data solutions
Develop and maintain solutions within Microsoft Fabric environments
Analyze business requirements and translate them into technical specifications
Document solutions including data models, configurations, and deployment setups
Work with large datasets to support analytics, reporting, and information analysis initiatives
Requirements
4+ years of experience in data engineering
Strong hands-on experience implementing data migration and processing solutions using Azure services (the more areas of experience, the better), such as: Azure Storage, Azure SQL DB / Data Warehouse, Azure Data Factory, Azure Stream Analytics, Azure Analysis Services, Azure Databricks, Azure Data Catalog, Event Hub, Cosmos DB, Azure Functions, ARM Templates
Hands-on experience with Microsoft Fabric
Experience working with serverless architectures and cloud-based data platforms
Knowledge of big data, analytics, database management, and information processing
Excellent communication skills, with the ability to interact clearly with clients and peers
Experience presenting to groups and facilitating workshops
Experience leading and mentoring technical staff
Background in consulting or client service delivery, including project delivery on Azure
Ability to work remotely via Microsoft Teams
Aptitude for critical and analytical thinking, as well as the ability to leverage out-of-the-box solutions
Enthusiasm for a fast-paced environment with opportunities to learn, grow, and be exposed to a variety of problems, clients, and technologies