Royal Credit Union is seeking a Data Engineer to join their team and contribute to the development of their cloud-based Business Intelligence platform. The role involves transforming raw data into meaningful insights and building data systems that support various business initiatives.
Responsibilities:
- Help build our cloud-based Business Intelligence platform, solve complex analytical challenges, and create data systems that empower teams across the organization
- Work in a team that is building the Business Intelligence platform at Royal
- Analyze and transform raw data into useful cloud-based data systems
- Apply increasingly complex data and statistical analysis to create and shape data systems, pipelines, and other business-related activities
- Support lines of business with all quantitative needs, including leading mid-size initiatives
Requirements:
- Bachelor's degree, or equivalent experience, in software engineering, data engineering, data science, information systems, computer science, mathematics, machine learning, or a related field of study
- Three years of experience in data engineering, data science, or software engineering, including broad knowledge of the data ecosystem
- Must be bondable
- Advanced data architecture and system design expertise, including experience working with multiple file structures (flat files, JSON, XML), building and maintaining cloud-based data systems, and developing scalable data infrastructures, pipelines, and warehouses
- Proficiency in modern programming and analytical tools, including strong skills in SQL, R, and/or Python, data management principles, and implementing cloud-based analytic solutions (AWS, Azure, etc.)
- Strong data connectivity and integration capabilities, including working with APIs (REST/SOAP), ODBC/JDBC, HTTP webhooks, and overseeing data cleansing, standardization, and advanced analytic solution development
- Effective communication and collaboration abilities, including summarizing technical concepts for non-technical audiences, working within Agile environments (Azure DevOps), using version control (Git) and container orchestration (Kubernetes), and contributing to cross-departmental documentation and process improvement
- Exceptional problem‑solving and adaptability, with the ability to perform root‑cause analysis, handle ambiguity, juggle multiple projects, embrace change, and leverage technology solutions while respecting confidentiality and security standards
- Experience implementing data and advanced analytics solutions, or related experience, in the cloud (AWS, Azure, etc.)
- Experience implementing an end-to-end cloud-native data warehouse platform (Snowflake, Databricks, BigQuery, Redshift, etc.)
- Experience with data connectivity methodologies such as APIs (REST/SOAP), ODBC/JDBC, HTTP webhooks, JSON, etc.
- Intermediate experience with data management principles