The Lead Data Engineer designs, builds, and maintains data processing architectures and solutions enabling the efficient conversion of structured and unstructured data to insights at enterprise scale
Ingests data from internal and external sources utilizing cloud native platforms and software development best practices and patterns
Develops software tools that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data into insights and actions that enable Humana to better serve our members
This lead will design and develop applications using data engineering-related technologies
Act as a primary point of contact for the team; build POCs and quick prototypes with the latest industry-standard software libraries and scale them for the platform
Design, code, test, debug and document programs using Agile development practices
Provides the strategy and design for projects associated with their technology domain, including upgrades and deployments
Develops new documentation and departmental technical procedures
Makes decisions on moderately complex to complex issues regarding the technical approach for project components; work is performed without direction
Exercises considerable latitude in determining objectives and approaches to assignments
Troubleshoots business and production issues by gathering information (for example, issue, impact, criticality, possible root cause);
engaging support teams to assist in the resolution of issues; formulating an action plan; performing actions as designated in the plan; interpreting the results to determine further action; performing root cause analysis to prevent future occurrence of issues; and completing online documentation
Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
Requirements
12+ years of data engineering experience, with hands-on experience in Data Lakes, Event Streaming, Data Fabric, and DevOps
8+ years of experience designing software applications using best practices and a design-first approach
8+ years of experience with Python, Azure Databricks, PySpark, and Snowflake
Experience with event-based streaming using Kafka in formats such as JSON or Avro
Experience designing and developing cloud solutions using AWS, Azure, or Google Cloud is a significant plus
Experience leading engineering teams in end-to-end delivery
Proficiency in federated data architectures (e.g., Data Mesh) and centralized models
Strong experience with metadata management, data lineage, governance, discovery, and quality
Experience with database technologies such as Oracle or SQL Server
Demonstrates a strong commitment to team success by proactively contributing and taking initiative to ensure the delivery of high-quality work
Experience analyzing existing applications and making modifications for new features while maintaining other existing functionalities
Experience working with GitHub and/or GitLab, SonarQube, JUnit, designing unit test cases using TDD and BDD methodologies
Experience using CI/CD pipelines and code quality tools
Test-driven mindset using frameworks such as JUnit, SoapUI, Selenium, and contract testing tools
Expert knowledge of Agile methodologies
Experience with Docker and Kubernetes
Tech Stack
AWS
Azure
Cloud
Docker
JUnit
Kafka
Kubernetes
Oracle
Python
Selenium
SQL
Benefits
medical, dental and vision benefits
401(k) retirement savings plan
time off (including paid time off, company and personal holidays, volunteer time off, paid parental and caregiver leave)