Support internal organizations in executing their strategy and delivering results
Build, manage, and optimize reusable enterprise data pipelines
Create new ETL processes and maintain and extend existing ones
Identify and implement internal process improvements
Work with stakeholders, including Product, Data, and Business teams, to resolve data-related issues
Create datasets for operational reports and analytics
Write, debug and implement complex queries
Create and maintain technical design documentation
Participate in requirements gathering
Collaborate with the Enterprise Architecture team to ensure alignment on data standards
Requirements
Bachelor of Science in Computer Science, Information Technology, or an equivalent field
3+ years of experience in a data/cloud engineering role
3+ years of experience working with and creating datasets for a data warehouse
Clear understanding of data modeling patterns (relational and dimensional)
3+ years of experience with ETL development tools; Informatica, Databricks, or Azure Data Factory (ADF) preferred
3+ years of cloud experience; Azure preferred
Working knowledge of best practices in designing, building, and managing data pipelines, including data transformations and workload management
Knowledge and experience in working with large datasets using various integration technologies
Demonstrated creativity and strong analytical and problem-solving skills