lead the technical delivery of large-scale data platform and BI solutions
design, implement, and own the delivery of robust data platform solutions and their end-to-end components
work closely with principal consultants, practice leads, project managers, and client stakeholders and teams
gather and clarify requirements and business rules by running workshops with stakeholders
contribute to the technical architecture, design, and delivery of new data platforms
provide thought leadership to clients on data profiling and integration recommendations
carry out performance tuning and optimisation of new and existing data warehouse solutions
provide detailed documentation, end-user training, and knowledge transfer to customers and internal teams
Requirements
Bachelor's degree in Computer Science or related discipline
SnowPro and Coalesce certified, or Microsoft Azure/Fabric certified data engineer
Proven track record of designing and building modern data platforms
Advanced skills in Azure, Fabric, and Snowflake pipelines, with experience in tools such as Azure Data Factory, Coalesce, Matillion, dbt, Fivetran, and Airflow (must have)
Advanced skills in ETL/ELT process design and implementation
Advanced skills in SQL performance tuning, query plan analysis, indexing, and table partitioning
Strong skills and experience in Python development in the data space, including commonly used data analysis and processing packages (must have)
Strong skills in cloud platforms such as Azure and AWS
Strong skills in data platform implementation, configuration, and performance and cost tuning (e.g. Azure, Fabric, Snowflake)
Strong skills in Power BI and semantic data modelling
Experience in enterprise data warehouse (EDW) design and implementation
Experience in ETL architecture and design
Experience in data modelling (star schema and snowflake schema)