Work with clients (business users) to gather and clarify requirements and business rules by running workshops with stakeholders.
Contribute to the technical architecture, design, and delivery of new data platforms.
Provide overall thought leadership to our clients regarding data profiling and integration recommendations.
Perform data modelling (relational and dimensional).
Manage the development life cycle using source code management tools.
Perform tuning and optimisation of new and existing data warehouse solutions.
Provide detailed documentation, end-user training, and knowledge-transfer services to customers and internal teams.
Requirements
Excellent consultation and stakeholder management skills.
Excellent verbal and written communication skills.
Proven track record of designing and building modern data platforms.
Advanced skills in Azure, Fabric, and Snowflake pipelines, with experience in tools such as Azure Data Factory, Coalesce, Matillion, dbt, Fivetran, and Airflow (preferred).
Advanced ETL/ELT process design and implementation.
Advanced skills in SQL performance tuning, query plan analysis, indexing, and table partitioning.
Strong skills and experience in Python development for data engineering, including commonly used data analysis and processing packages.
Strong skills in cloud platforms such as Azure and AWS.
Strong skills in data platform implementation, configuration, and performance and cost tuning (e.g. Azure, Fabric, Snowflake).
Strong skills in Power BI and semantic data modelling.
Bachelor's degree in Computer Science or a related discipline.
SnowPro and Coalesce certified, or Microsoft Azure/Fabric certified data engineer.
Experience in EDW design and implementation.
Experience in ETL architecture and design.
Experience in data modelling (star schema and snowflake schema).