Corel is a beloved and trusted industry titan fueled by make-everything-easier flexibility. They are seeking an experienced Senior Data Engineer to develop and maintain data infrastructure and data flows, ensuring continuous data validation and telemetry for all data processes.
Responsibilities:
- Design, develop, and implement large-scale, high-volume, high-performance data infrastructure and pipelines for the Data Lake and Data Warehouse
- Build and implement ETL frameworks to improve code quality and reliability
- Build and enforce common design patterns to increase code maintainability
- Ensure accuracy and consistency of data processing, results, and reporting
- Design cloud-native data pipelines, automation routines, and database schemas that support predictive and prescriptive machine learning
- Communicate ideas clearly, both verbally and through concise documentation, to business sponsors, business analysts, and technical teams
- Guide and mentor other Data Engineers as a technical owner of parts of the data platform
Requirements:
- Expert knowledge of Python
- Expert knowledge of SQL
- Experience working in a DevOps model
- 7+ years of professional experience
- 5+ years of experience working in data engineering, business intelligence, or a similar role
- 5+ years of experience with ETL orchestration and workflow-management tools such as Airflow on AWS/GCP
- 3+ years of experience with distributed data-processing tools such as Spark and Presto, and streaming technologies such as Kafka and Flink
- Expertise with container orchestration engines (e.g., Kubernetes)
- BS in Computer Science, Software Engineering, or a relevant field required
- 3+ years of experience with Snowflake (preferred) or another big data database platform
- 3+ years of experience with cloud service providers: Amazon AWS (preferred) or another major cloud
- MS in Computer Science, Software Engineering, or a relevant field preferred