Cognizant is seeking a Lead Data Engineer who will serve as an engineering lead within the Technology & Data organization. This role involves providing technical direction, mentoring other engineers, and designing scalable data models and pipelines to support business analytics.
Responsibilities:
- Design, test, and maintain both conceptual and analytical data models, including logical, physical, and dimensional (star/snowflake) schemas, to support business analytics and reporting
- Optimize performance through partitioning and indexing
- Ensure quality and consistency via SQL best practices and peer code reviews
- Design, build, and optimize scalable, maintainable, and resilient end-to-end ETL/ELT data pipelines
- Leverage orchestration tools for scheduling, dependency management, and error handling
- Extract and transform data from source systems
- Utilize platforms like dbt to ensure robust data flow, conduct UAT, and manage code releases for reliable system performance
- Implement logging, metrics, and alerting for data pipelines
- Collaborate with analysts, analytics and data engineers, and solution architects to align data solutions with strategic goals
- Communicate effectively across technical and non-technical teams
- Actively participate in agile ceremonies and contribute to sprint goals
- Apply agile principles to respond to change, iterate quickly, and continuously improve team processes and product outcomes
Requirements:
- Deep expertise in both analytics engineering and data engineering
- Ability to provide technical direction and thought leadership across teams
- Experience mentoring and supporting analytics and data engineers
- Experience conducting code reviews and setting high standards for technical quality
- Mastery of SQL
- Skilled in version control systems for collaborative development and code management (e.g., Azure DevOps, GitHub)
- Strong communication and collaboration skills
- Hands-on experience with dbt (Data Build Tool)
- Snowflake
- Fivetran
- Advanced data modelling expertise