CloudIngest is seeking a Data Engineer / Data Architect to lead the migration of Snowflake datasets to Iceberg tables managed by Databricks Unity Catalog. The role focuses on keeping data accessible across both platforms and supporting advanced analytics and AI-driven workloads.
Responsibilities:
- Lead and execute the migration of Snowflake tables to Databricks Unity Catalog (Iceberg format)
- Assess existing Snowflake data models, pipelines, and dependencies
- Establish dual-access capabilities so migrated tables remain queryable from both Snowflake (via external table access) and Databricks
- Identify Snowflake-specific queries and convert them to Spark SQL-compatible syntax
- Work closely with data platform and datahub teams to ensure seamless onboarding
- Perform data validation, reconciliation, and consistency checks post-migration
- Enhance and optimize data pipelines and query performance in Databricks
- Ensure adherence to data governance, access control, and Unity Catalog best practices
- Troubleshoot and resolve migration-related issues efficiently
- Develop and contribute to automation frameworks and reusable migration components
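As a flavor of the query-conversion work above: Snowflake and Spark SQL differ in a handful of built-in functions (for example, Snowflake's IFF vs. Spark's IF, or TO_VARCHAR vs. CAST ... AS STRING). A minimal, purely illustrative sketch of a rule-based rewriter is below; a real migration would use a proper SQL transpiler rather than regex substitution, and the rules shown are assumptions covering only simple cases.

```python
import re

# Toy sketch only: map a few Snowflake-specific functions to their
# Spark SQL equivalents. Production migrations should use a real SQL
# parser/transpiler; these regex rules handle only trivial forms.
SNOWFLAKE_TO_SPARK = [
    # IFF(cond, a, b) -> IF(cond, a, b)
    (re.compile(r"\bIFF\s*\(", re.IGNORECASE), "IF("),
    # TO_VARCHAR(x) -> CAST(x AS STRING); single, non-nested argument only
    (re.compile(r"\bTO_VARCHAR\s*\(([^)]*)\)", re.IGNORECASE),
     r"CAST(\1 AS STRING)"),
]

def to_spark_sql(query: str) -> str:
    """Apply the naive rewrite rules above to a Snowflake query string."""
    for pattern, replacement in SNOWFLAKE_TO_SPARK:
        query = pattern.sub(replacement, query)
    return query
```

The mapping table is the reusable piece: cataloguing dialect differences once lets the same rules feed both automated conversion and pre-migration query assessment.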
Requirements:
- Strong expertise in Snowflake and Databricks (Spark, Unity Catalog)
- Hands-on experience with Apache Iceberg, Delta Lake, or other open table formats
- Proficiency in SQL and Spark SQL
- Experience in data migration, ETL/ELT processes, and data modeling
- Familiarity with AWS (S3, IAM, networking) or equivalent cloud environments
- Solid understanding of data governance and access management
- Ability to troubleshoot and optimize performance across distributed systems
- Experience with cross-platform data sharing (Snowflake + Databricks)
- Knowledge of REST catalog integration and Iceberg external tables
- Exposure to AI/BI workloads and analytics ecosystems
- Understanding of enterprise data platforms and data mesh architecture