Oreva Technologies, Inc. is seeking a Data Engineer to join its team. The role involves collaborating with software engineers and business stakeholders to develop scalable data solutions, including ETL pipelines and workflows, while ensuring data models are optimized for performance and reliability.
Responsibilities:
- Partner with software engineers, business stakeholders, and subject matter experts to translate requirements into scalable data solutions
- Develop, implement, and deploy ETL pipelines and workflows
- Preprocess and analyze large datasets to uncover meaningful insights
- Validate, refine, and optimize data models for performance and reliability
- Monitor and maintain data pipelines in production, identifying improvements and refining workflows
- Document development processes, workflows, and best practices to support team knowledge sharing
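The ETL work described above can be sketched as a minimal extract-transform-load pipeline in plain Python. This is only an illustration of the pattern; the field names (`id`, `amount`) and the transformation logic are hypothetical, and a production pipeline for this role would typically be built with PySpark or Snowpark rather than lists:

```python
# Minimal ETL sketch. All field names and rules here are hypothetical
# illustrations of the extract -> transform -> load pattern.

def extract(raw_rows):
    """Extract: parse raw comma-separated rows into dictionaries."""
    return [dict(zip(("id", "amount"), row.split(","))) for row in raw_rows]

def transform(records):
    """Transform: cast types and drop malformed records."""
    out = []
    for rec in records:
        try:
            out.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
        except (KeyError, ValueError):
            continue  # skip bad rows rather than failing the whole pipeline
    return out

def load(records, store):
    """Load: append validated records to the target store; return row count."""
    store.extend(records)
    return len(records)

store = []
loaded = load(transform(extract(["1,9.50", "2,abc", "3,4.25"])), store)
# loaded == 2; the malformed middle row is dropped during transform
```

The same three-stage shape carries over directly to DataFrame-based tools, where each stage becomes a read, a set of column transformations, and a write.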
Requirements:
- Strong programming proficiency in Python, PySpark, and SQL
- Ability to craft and optimize complex SQL queries and stored procedures
- Experience developing and maintaining scalable, high-performing data models
- Hands-on expertise with Snowflake, including Snowpark for data processing
- Exposure to API integrations to support data workflows
- Experience implementing CI/CD pipelines through DevOps platforms
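As a rough illustration of the SQL requirement, the sketch below builds a small table, adds an index on the grouping column (a common first optimization step), and runs an aggregate query. SQLite stands in here purely for portability; in this role the target platform would be Snowflake, and the table and columns are hypothetical:

```python
import sqlite3

# Hypothetical orders table; SQLite is used only so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "east", 10.0), (2, "west", 20.0), (3, "east", 5.0)],
)
# Indexing the column used for grouping/filtering is a typical tuning step.
conn.execute("CREATE INDEX idx_orders_region ON orders (region)")

totals = dict(
    conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    ).fetchall()
)
# totals == {"east": 15.0, "west": 20.0}
conn.close()
```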