New York Life Insurance Company is seeking a highly skilled and versatile data engineer/architect to design, build, and optimize its next-generation actuarial data platform in AWS. The role involves building robust data pipelines, developing automation frameworks, and improving data architecture to support financial reporting and actuarial modeling functions.
Responsibilities:
- Design and implement cloud-native data pipelines in AWS using tools such as Glue, Lambda, Step Functions, Redshift, and S3
- Build and optimize ETL/ELT processes that deliver clean, reliable data to actuarial, finance, and analytics teams
- Develop efficient data models, transformations, and APIs using Python, SQL, and PySpark
- Modernize existing on-premises processes into scalable AWS architecture
- Implement and maintain best practices for data governance, lineage, and performance tuning
- Automate data ingestion, transformation, and validation workflows using Python and AWS services
- Build AI-driven and analytical tools to enhance financial forecasting, validation, and decision support
- Experiment with Agentic AI and intelligent orchestration to streamline complex actuarial data flows
- Continuously improve performance, resilience, and maintainability of data systems
- Partner closely with actuarial and finance teams to understand data requirements and deliver timely, high-quality solutions
- Work with cloud infrastructure and DevOps teams to integrate solutions into CI/CD pipelines (Terraform, CodeBuild, GitHub)
- Document data flows, architecture, and technical processes to enable transparency and reusability
- Support production data operations, troubleshooting, and optimization efforts
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field
- 7+ years of experience in data engineering, architecture, or analytics, ideally in financial or actuarial environments
- Proven hands-on technical expertise building and maintaining complex ETL systems and cloud data platforms
- Expert-level skills with AWS (S3, Redshift, Glue, Lambda, EC2, Step Functions)
- Proficiency in Python, SQL, and ETL pipeline design
- Experience with data warehousing, dimensional modeling, and performance tuning
- Hands-on experience with Terraform, Docker, and CI/CD integration
- Strong analytical and problem-solving mindset
- Proactive and self-directed; takes ownership of technical outcomes
- Excellent communication and documentation skills
- Collaborative and curious, eager to learn business context and continuously improve
- Passion for innovation, automation, and high-quality engineering practices
- Familiarity with actuarial or financial data systems (e.g., AXIS, Prophet) is a plus
- Exposure to AI/ML-based automation or data quality frameworks preferred
- AWS Certified Data Analytics – Specialty or Solutions Architect – Associate certification