Act as an expert in the Azure data platform, including Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Azure Event Hubs, and Azure Blob Storage
Design and build scalable data pipelines for ingestion, transformation, and curation/normalisation of data
Implement data quality checks, error handling, and monitoring frameworks to ensure reliability
Apply strong knowledge of data modelling and data warehouse concepts to optimise architecture
Troubleshoot issues during development and testing phases, ensuring smooth deployment of data pipelines
Collaborate with data scientists, analysts, and business stakeholders to deliver robust solutions
Support pre-sales activities and RFP responses with technical input when required
Contribute to knowledge sharing and best practices within the team
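The data quality and error-handling responsibilities above could, for instance, take the shape of a row-level validation step. This is a minimal plain-Python sketch; the rule names and column names (`id`, `amount`) are hypothetical, and a production pipeline would more likely lean on a framework such as Great Expectations or Delta Live Tables expectations:

```python
from dataclasses import dataclass, field

# Hypothetical row-level quality rules; column names are illustrative only.
RULES = {
    "id_present": lambda row: row.get("id") is not None,
    "amount_non_negative": lambda row: isinstance(row.get("amount"), (int, float))
    and row["amount"] >= 0,
}

@dataclass
class QualityReport:
    passed: list = field(default_factory=list)
    rejected: list = field(default_factory=list)  # (row, failed_rule_names) pairs

def run_quality_checks(rows):
    """Partition rows into passed/rejected, recording which rules failed."""
    report = QualityReport()
    for row in rows:
        failures = [name for name, rule in RULES.items() if not rule(row)]
        if failures:
            report.rejected.append((row, failures))
        else:
            report.passed.append(row)
    return report

rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 2, "amount": -3.0},
]
report = run_quality_checks(rows)
```

Routing rejected rows to a quarantine table with the failed rule names attached is one common way to make the monitoring side of this responsibility observable.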
Requirements
Bachelor’s degree in Computer Science, Engineering, or related field
Hands-on experience with Azure data services: Data Factory, Synapse, Databricks, Event Hubs, Blob Storage
Proficiency in SQL and Python; strong understanding of ETL/ELT processes
Hands-on experience with Power BI
Knowledge of data quality frameworks and error handling techniques
Strong analytical and problem-solving skills; effective communicator
Preferred certifications: Microsoft Certified: Azure Data Engineer Associate (DP-203); Microsoft Certified: Fabric Analytics Engineer Associate; Databricks Certified Data Engineer Associate
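The SQL and Python ELT proficiency listed above might be exercised in a pattern like the following sketch, which stages raw rows and then transforms them inside the engine with SQL. It uses `sqlite3` purely as a stand-in for a warehouse such as Synapse; the table and column names (`staging_orders`, `curated_revenue`) are hypothetical:

```python
import sqlite3

# Extract: raw source rows (hypothetical order events: id, currency, amount).
source_rows = [("A-1", "EUR", 100.0), ("A-2", "EUR", 40.0), ("B-1", "USD", 25.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id TEXT, currency TEXT, amount REAL)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)", source_rows)

# Transform + load (ELT style): aggregate inside the engine with SQL.
conn.execute(
    """
    CREATE TABLE curated_revenue AS
    SELECT currency, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM staging_orders
    GROUP BY currency
    """
)

# Read the curated layer back for downstream use.
result = {
    currency: (total, count)
    for currency, total, count in conn.execute(
        "SELECT currency, total_amount, order_count FROM curated_revenue ORDER BY currency"
    )
}
```

Pushing the aggregation into SQL rather than looping in Python mirrors how the transform layer is usually kept inside the warehouse or Spark engine in ELT designs.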