KPG99 INC is seeking a GCP Data Engineer to design, build, and maintain data pipelines and ETL/ELT workflows for enterprise data platforms. The role focuses on integrating data from multiple source systems into cloud data warehouse environments and partnering with cross-functional teams to deliver scalable, reliable data solutions.
Responsibilities:
- Develop and maintain data pipelines and ETL/ELT workflows for enterprise data platforms
- Build and optimize data models and reporting datasets for financial analytics and reporting
- Integrate enterprise data from multiple systems into cloud data warehouse environments
- Maintain and enhance existing data pipelines, ensuring reliable daily data loads and SLA compliance
- Troubleshoot data quality issues and performance bottlenecks
- Implement CI/CD practices for data engineering workflows
- Collaborate with cross-functional teams to deliver scalable and reliable data solutions
Requirements:
- Strong SQL expertise including advanced querying, performance tuning, and optimization
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery
- Experience building ETL/ELT data pipelines and data transformations
- Knowledge of CI/CD tools and data pipeline orchestration
- Programming experience in Python
- Experience with data modeling and analytics datasets
- Strong communication skills and ability to work independently
- Experience working in remote, distributed teams
- Experience with semantic layers or BI tools; familiarity with AtScale