Responsibilities
Build, maintain, and optimise data pipelines across data lakes, lakehouses, and warehouses
Write, test, and maintain high-quality code; develop transformations for reporting, analytics, and AI/ML
Apply data modelling principles; troubleshoot and resolve issues efficiently
Work with Databricks, Delta Live Tables, Lakeflow, Git, and CI/CD pipelines
Implement automated testing, data validation, and performance optimisation
Lead unit and integration testing; ensure solutions meet quality and documentation standards
Support knowledge sharing and improvement of internal frameworks and templates
Act as primary engineering contact, leading technical discussions and translating business needs into technical tasks
Communicate progress, risks, and blockers; contribute to estimation, scoping, and planning
Mentor and coach junior/intermediate team members; conduct peer reviews
Ensure accountability for code quality, documentation, and adherence to best practices
Collaborate with architects on technical solutions and delivery planning
Participate in learning sessions and knowledge sharing; adopt modern data concepts (AI/ML, data visualisation, cloud-native architectures)
Contribute to process, template, and documentation improvements; pursue relevant training and certification
Requirements
6+ years of professional experience in data engineering or a related field
At least 4 years of hands-on experience developing data pipelines and transformations using Databricks, including Delta Live Tables, notebooks, and workflows
Familiarity with Microsoft Fabric concepts and experience contributing to projects that incorporate Fabric components
Proficiency in Python and SQL for data transformation and analysis
Understanding of data modelling, including star and snowflake schema design for analytics workloads
Knowledge of Medallion architecture principles and well-architected, scalable data solution design
Experience applying DevOps practices such as version control, CI/CD, and environment automation
Experience with testing and data quality frameworks to ensure reliable, accurate data delivery
Experience with visualisation and data consumption concepts, including Power BI or similar tools
Relevant industry certifications (Databricks Certified Data Engineer Associate or Professional) or a demonstrated commitment to ongoing professional development