Participate with your team in the design, development, and delivery of efficient, effective, and secure data flows
Participate in the implementation of industry best practices based on your experience
Be an agent of change who will grow our data engineering practice to make our lab an industry benchmark
Requirements
A bachelor’s degree in computer science, data science, software engineering, or any equivalent combination of training and experience
Your 3+ years of experience in data engineering, software engineering, ETL development, or data warehousing
Your demonstrated experience building data pipelines with Databricks, Snowflake, and AWS
Strong proficiency in the Python programming language and the PySpark framework
Your experience in Big Data and Cloud technologies for processing large volumes of data
Basic understanding of machine learning and artificial intelligence
Bilingualism is required for candidates located in Quebec, given the need to interact regularly with English-speaking colleagues across the country
Tech Stack
AWS
Cloud
ETL
PySpark
Python
Benefits
Flexible work arrangements and a hybrid work model
Possibility to purchase up to 5 extra days off per year
Multiple benefits to support physical and mental wellbeing, including telemedicine, a wellness account, and much more
Share plan and other savings: up to 12% of salary or more (ask how you could earn guaranteed income for life)