Lead the design, development, testing, and deployment of scalable data pipelines using Ab Initio and Python
Drive migration efforts from legacy Ab Initio workflows to Python-based data processing services
Collaborate with stakeholders to understand data needs, translating them into efficient, well-documented solutions
Ensure data quality, security, and performance across platforms and processes
Mentor team members on data engineering techniques, tools, and emerging best practices
Requirements
7+ years of professional experience in data engineering or related roles, with a record of impactful projects
Proven hands-on expertise with Ab Initio and Python in large-scale data environments
Experience migrating workflows from legacy systems to modern pipelines is strongly preferred
Bachelor's degree in Computer Science, Data Engineering, or a related engineering field; Master's degree preferred
Relevant certifications in Python, cloud platforms, or data tools (e.g., AWS Certified Data Analytics, GCP Professional Data Engineer) are advantageous