Partner with technical and non-technical colleagues to understand data and reporting requirements
Contribute to the design of table structures and the definition of ETL pipelines to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem
Develop data quality checks; write, test, and debug code
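Data quality checks of the kind described above can start as simple row-level assertions; a minimal Python sketch (the column names, rules, and `check_quality` helper are illustrative assumptions, not part of the role description):

```python
def check_quality(rows, required_columns, unique_key):
    """Run basic data quality checks on a list of row dicts.

    Returns a list of human-readable issue strings (empty = all checks pass).
    """
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        # Completeness: every required column must be present and non-null
        for col in required_columns:
            if row.get(col) is None:
                issues.append(f"row {i}: missing value for '{col}'")
        # Uniqueness: the key column must not repeat across rows
        key = row.get(unique_key)
        if key in seen_keys:
            issues.append(f"row {i}: duplicate {unique_key}={key!r}")
        seen_keys.add(key)
    return issues


rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},             # fails the completeness check
    {"id": 2, "email": "c@example.com"},  # fails the uniqueness check
]
problems = check_quality(rows, required_columns=["id", "email"], unique_key="id")
```

In practice checks like these would run inside the pipeline (e.g., as a task after each load) and fail the run when issues are found.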
Develop and maintain ETL routines using orchestration tools such as Airflow
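Orchestration tools like Airflow model ETL routines as a DAG of tasks, each running only after its upstream dependencies finish. The core ordering idea can be sketched in plain Python with the standard library's `graphlib` (the task names are hypothetical, and this is a toy illustration, not Airflow's API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: an extract feeds a cleaning step, which feeds an
# aggregation; the warehouse load depends on both downstream transforms.
# Each key maps to the set of tasks it depends on (its upstream tasks).
dag = {
    "extract_orders": set(),
    "clean_orders": {"extract_orders"},
    "aggregate_daily": {"clean_orders"},
    "load_warehouse": {"clean_orders", "aggregate_daily"},
}

# A valid execution order: every task appears after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

An orchestrator adds scheduling, retries, and backfills on top of this ordering, which is why tools like Airflow are used rather than hand-rolled scripts.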
Perform ad hoc analysis as necessary
Requirements
3+ years of relevant data engineering experience
Good understanding of data modeling principles, including dimensional modeling and data normalization
Good understanding of SQL engines and the ability to conduct advanced performance tuning
Experience using analytic SQL and working with traditional relational databases and/or distributed systems (e.g., Snowflake or Redshift)
Strong written and verbal communication skills
Experience working with Python
Bachelor’s degree in computer science, information systems, or a related field, or equivalent work experience
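"Analytic SQL" in the requirements above typically means window functions and aggregations. A small self-contained sketch using Python's built-in sqlite3 module (the table and column names are illustrative; window-function support requires SQLite 3.25+, which ships with modern Python builds):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

# Window function: rank each sale within its region by amount, descending.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
```

The same pattern (PARTITION BY / ORDER BY windows) carries over directly to warehouse engines such as Redshift and Snowflake.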
Tech Stack
Airflow
Amazon Redshift
Distributed Systems
ETL
Python
SQL
Benefits
A bonus and/or long-term incentive units may be provided as part of the compensation package