Diverse Lynx is seeking a highly skilled Data Engineer with expertise in Snowflake and Google Cloud Platform (GCP). The role focuses on designing, building, and optimizing scalable data platforms and analytics solutions while managing cloud data warehouses and developing robust data pipelines.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions
- Implement and optimize Snowflake objects including databases, schemas, tables, views, and stages
- Develop and manage Snowflake SQL, stored procedures, tasks, and streams
- Optimize query performance, storage, and compute usage
- Implement data sharing, security roles, and access controls in Snowflake
- Support data modeling for analytical and reporting use cases
- Design and build end-to-end data pipelines on Google Cloud Platform
- Develop ETL/ELT pipelines using BigQuery, Cloud Storage, and Dataflow/Dataproc
- Integrate data from multiple sources (applications, APIs, files, streaming sources)
- Ensure scalability, reliability, and cost optimization of cloud data solutions
- Apply best practices for data governance, security, and compliance on GCP
- Perform data ingestion, transformation, and validation
- Design dimensional and analytical data models for reporting and BI
- Handle structured and semi-structured data (CSV, JSON, Parquet, etc.)
- Implement data quality checks, reconciliation, and monitoring
- Work closely with analytics, reporting, and business teams to understand data requirements
- Support UAT, production deployments, and ongoing enhancements
- Document data pipelines, models, and technical design
- Participate in Agile ceremonies and sprint-based delivery
Requirements:
- Strong hands-on experience with Snowflake
- Strong hands-on experience with Google Cloud Platform (GCP): BigQuery, Cloud Storage, Dataflow/Dataproc
- Advanced SQL (performance tuning, complex queries)
- Experience with ETL/ELT frameworks
- Data modeling experience (dimensional/analytical)
- Experience with version control tools (Git)
- Python (or similar) for data processing and automation
- Experience with orchestration tools (e.g., Airflow / Cloud Composer)
- Experience working with BI tools (Looker, Tableau, Power BI, Qlik)
- Exposure to CI/CD for data pipelines
- Experience with healthcare, financial, or other large-enterprise data platforms