HYR Global Source Inc. is seeking a highly skilled GCP Data Engineer to support a large-scale Snowflake-to-Google Cloud Platform (GCP) migration initiative. The role centers on transforming and validating data models to ensure accuracy, scalability, and performance across systems.
Responsibilities:
- Lead and support Snowflake-to-GCP migration activities
- Perform comprehensive data validations, comparing legacy (Snowflake) models against their transformed (GCP) counterparts
- Design and implement automated validation pipelines
- Build scalable and reliable data pipelines using GCP services
- Collaborate with cross-functional teams to translate complex data findings into actionable insights
- Integrate GCP and Palantir Foundry systems using REST APIs and secure data transfer mechanisms
- Work with modern AI and cloud-native tools to enhance data workflows
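To illustrate the validation work described above, a minimal row-level comparison between a legacy extract and its migrated counterpart might look like the sketch below. The function names, the key/column parameters, and the hash-based fingerprinting approach are illustrative assumptions, not a prescribed framework; a production pipeline would typically push comparable checks down into BigQuery rather than pull rows into Python.

```python
import hashlib


def row_fingerprint(row, columns):
    """Stable hash of the selected columns, used to compare a row across systems."""
    canonical = "|".join(str(row[c]) for c in columns)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def validate(legacy_rows, migrated_rows, key, columns):
    """Compare legacy vs. migrated datasets keyed on `key`.

    Returns keys missing from the migrated set, extra keys introduced by the
    migration, and keys present in both whose column values no longer match.
    """
    legacy = {r[key]: row_fingerprint(r, columns) for r in legacy_rows}
    migrated = {r[key]: row_fingerprint(r, columns) for r in migrated_rows}
    missing = sorted(set(legacy) - set(migrated))
    extra = sorted(set(migrated) - set(legacy))
    mismatched = sorted(
        k for k in set(legacy) & set(migrated) if legacy[k] != migrated[k]
    )
    return {"missing": missing, "extra": extra, "mismatched": mismatched}
```

An automated version of this check, scheduled per table and reporting the three buckets, is one common shape for the "automated validation pipelines" the role calls for.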
Requirements:
- 3+ years of programming experience in each of Python, PySpark, and SQL
- Strong hands-on experience with Google Cloud Platform (GCP) services, including BigQuery, Vertex AI, Cloud Functions, Cloud Storage, Looker, and the GCP Agent Development Kit (ADK)
- Experience implementing scalable data pipelines and system integrations
- Strong understanding of data modeling concepts (incremental models, transformations, dataset consolidation)
- Experience building automated validation frameworks
- Experience with Palantir Foundry (especially Foundry AIP)
- Exposure to modern AI frameworks (e.g., Vertex AI, Gemini)
- Experience integrating GCP with external platforms via REST APIs
- Knowledge of cloud networking concepts (egress/ingress policies)
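The incremental-model concept listed in the requirements can be sketched in plain Python. The `updated_at` column and merge-by-key logic here are illustrative assumptions; in practice this would usually be a `MERGE` statement in BigQuery or an incremental model in a transformation framework.

```python
def incremental_merge(target, increment, key="id"):
    """Upsert an incremental batch onto a target table by key.

    When the same key appears in both, the newer record (by `updated_at`)
    wins; otherwise new rows are appended.
    """
    merged = {row[key]: row for row in target}
    for row in increment:
        existing = merged.get(row[key])
        # Keep the incoming row when it is new or at least as recent.
        if existing is None or row["updated_at"] >= existing["updated_at"]:
            merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])
```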