
WNS, part of Capgemini, is an agentic AI-powered leader in intelligent operations and transformation, serving more than 700 clients across 10 industries, including Banking and Financial Services, Healthcare, Insurance, Shipping and Logistics, and Travel and Hospitality. We bring together deep domain excellence – WNS’ core differentiator – with AI-powered platforms and analytics to help businesses innovate, scale, adapt, and build resilience in a world defined by disruption.
Our purpose is clear: to enable lasting business value by designing intelligent, human-led solutions that deliver sustainable outcomes and a differentiated impact. With three global headquarters across four continents, operations in 13 countries, 65 delivery centers and more than 66,000 employees, WNS combines scale, expertise and execution to create meaningful, measurable impact.
Role Summary:
We are looking for a Lead Engineer to provide technical architecture oversight and solution design for our data products. This role ensures that engineering best practices are followed, systems are scalable, and technical debt is minimized. You will guide the engineering team and ensure alignment with the client’s wider data architecture strategy.
Key Responsibilities:
• Define and oversee the technical architecture for data pipelines, integrations, and services.
• Conduct code reviews, enforce coding standards, and ensure adherence to security/compliance guidelines.
• Collaborate with the client’s Architecture teams to align on technology stacks and future-state roadmaps.
• Mentor engineers and troubleshoot complex technical issues.
• Translate product requirements into technical specifications and high-level design documents.
• Design, develop, and optimize ETL/ELT pipelines to move data from source systems to data warehouses/lakes.
• Implement data quality checks and monitoring to ensure data accuracy and integrity.
• Collaborate with Data Analysts and Scientists to understand data requirements and ensure data availability.
• Maintain and improve existing data infrastructure and workflows.
• Troubleshoot pipeline failures and performance bottlenecks.
Required Skills & Experience:
• 6 to 8 years of experience in data engineering or software architecture.
• Deep expertise in cloud platforms (AWS, Azure, or GCP) and modern data stack tools.
• Strong knowledge of data modeling, ETL/ELT processes, and orchestration frameworks.
• Proficiency in Python and SQL.
• Experience with Fivetran and dbt.
• Excellent problem-solving skills and ability to make trade-off decisions between speed and quality.
• Experience with the Snowflake cloud data warehouse.
• Hands-on experience with orchestration tools such as Apache Airflow, Dagster, or Prefect.
• Familiarity with version control (Git) and CI/CD principles.
Education: Graduate or Postgraduate degree.