Nasscomm is seeking a Data Engineer for a 3.5-month contract role. The Data Engineer will design, build, test, and promote data pipelines and products for Cambridge’s Snowflake-based data platform, working across a range of data processing and integration tasks.
Responsibilities:
- Build and maintain source-to-target pipelines using Azure Data Factory, ADLS, and Snowflake to support bronze, silver, and gold data layers
- Develop and enhance Snowflake objects and transformation logic, including tables, views, stages, and stored procedures for client, account, compensation, and other priority use cases
- Support API and integration delivery, including API payload updates, APIM-related work, Azure Functions connectivity, and test automation
- Implement and support Kafka, delta, and change data capture (CDC) processing for ongoing client and account data movement and near-real-time integration patterns
- Execute testing, validation, and promotion activities across dev, UAT, and production, including QA support, unit testing, and deployment readiness
Requirements:
- Strong experience with Snowflake, SQL, Azure Data Factory (ADF), ADLS, APIs, and modern data pipeline delivery
- Comfortable acting as a hands-on builder who can move from design to implementation to testing and deployment
- Prior experience in the finance industry