Design, build, and optimize high-volume, high-performance ELT pipelines for centralized data warehousing
Collaborate with Product Managers, ML Engineers, Data Scientists, and DevOps to define and enforce a reliable, scalable, and secure data platform architecture
Ensure adherence to data warehousing standards, data quality best practices, and metadata management processes
Take responsibility for assigned data warehouse components, contributing to enhancements in performance and reliability
Conduct architecture reviews and participate in system design discussions to provide technical input
Support and collaborate with peers, sharing knowledge and promoting a culture of quality and continuous learning
Requirements
3-4 years of experience in data engineering, building scalable data systems and pipelines
Strong SQL skills
Working proficiency in Python
Experience building or maintaining data pipelines and working with a data warehouse or analytics stack
Strong computer science fundamentals
Knowledge of system architecture with emphasis on reliability, availability, and performance
Focus on delivering production-ready solutions
Attention to detail and commitment to data quality
Clear communication skills
Comfortable working in a fast-paced, evolving environment
Upper-Intermediate level of spoken and written English