Cardinal Health is a global healthcare services and products company headquartered in Dublin, Ohio. The company is seeking a Senior Data Engineer to lead the design, development, and support of enterprise-grade data and analytics platforms, ensuring they meet evolving business needs while remaining secure and scalable.
Responsibilities:
- Lead the technical design, development, and maintenance of data platforms supporting enterprise analytics and reporting
- Define and enforce standards for database design, data movement, performance optimization, and platform architecture
- Analyze, re-architect, and migrate legacy database objects to cloud platforms (e.g., GCP BigQuery, Databricks)
- Partner with business stakeholders to translate analytic requirements into scalable, dependable technical solutions
- Ensure technical specifications align with enterprise architecture standards and business objectives
- Collaborate with Cloud, Shared Services, offshore teams, and external vendors to deliver high-quality solutions
- Monitor platform health, identify risks, and proactively communicate status, issues, and remediation plans to leadership
- Ensure compliance with data security, governance, and audit requirements
- Develop and promote best practices for data engineering, including naming conventions, scripting, coding standards, and CI/CD practices
- Mentor and provide technical guidance to a team of 1–5 engineers
- Continuously identify opportunities for platform, process, and performance improvements
Requirements:
- Advanced experience with enterprise data platforms such as GCP BigQuery, AWS Redshift, and Databricks
- Strong expertise in database design, data modeling, and performance optimization
- Expert-level SQL skills, including developing database objects such as stored procedures and views, and applying partitioning, indexing, and query optimization
- Experience building semantic layers for analytics consumption using tools such as AtScale and/or LookML
- Proven experience working on large-scale, business-critical data platforms
- Experience leading or mentoring small development teams
- Hands-on experience with DevSecOps and CI/CD practices, including tools such as GitHub, Airflow, and pipeline automation
- Strong understanding of Agile development methodologies
- Excellent written and verbal communication skills, with the ability to communicate complex technical concepts to non-technical stakeholders
- Hands-on experience with Python scripting for data engineering and automation
- Experience with enterprise scheduling tools such as Airflow, UC4, AutoSys, or Control-M
- Exposure to AI/ML-driven data platforms, including feature engineering and data pipelines supporting analytics or machine learning workloads
- Experience with GitHub Copilot, Large Language Models (LLMs), and Generative AI (GenAI) concepts, use cases, or integrations (e.g., leveraging data platforms for AI-driven insights or automation)
- Familiarity with cloud-native AI/ML services and modern data ecosystem patterns
- Prior experience working with external system integrators or managed service providers