Snowrelic Inc. is seeking a Data Engineer to design, build, and modernize data pipelines and platforms in a cloud environment. The role focuses on migrating legacy data systems to AWS and Snowflake, developing scalable ETL/ELT pipelines, and supporting analytics through strong data modeling and architecture.
Responsibilities:
- Design, build, and maintain scalable data pipelines
- Migrate legacy pipelines (Informatica, Teradata) to AWS/Snowflake
- Develop and optimize ETL/ELT processes
- Build and manage data warehouses and dimensional data models (e.g., star schema)
- Work with relational databases (Oracle, SQL Server, Amazon RDS) and MongoDB
- Implement data governance, security, and quality standards
- Collaborate with business stakeholders and data teams
- Develop and test data features and integrations
- Monitor performance and optimize cloud costs
- Contribute to Agile development processes
Requirements:
- Strong experience with AWS and Snowflake
- Proficiency in Python for data engineering
- Experience building ETL/ELT pipelines
- Data warehousing and data modeling experience
- Experience with relational databases (Oracle, SQL Server)
- Proficiency with Git-based version control
- MongoDB or document database experience
- Experience migrating legacy data platforms
- Experience with AWS Glue and AWS CDK
- Familiarity with Agile tools (Jira, Confluence)
- Experience mentoring junior engineers