Optum is a global leader in health care innovation, developing solutions to help people lead healthier lives. The Principal Data Engineer will be responsible for designing, building, and optimizing scalable data solutions within the health plan ecosystem, advancing data modernization and improving platform efficiency.
Responsibilities:
- Partner in the design, development, and communication of technology patterns and standards needed to manage Surest healthcare data
- Provide technical leadership, guidance, and training to a team of data engineers, both individually and in group settings
- Implement data processes and pipelines that incorporate product requirements along with data storage, lifecycle, accessibility, performance, and security standards
- Solve complex issues in data development software
- Support Agile development practices, including refining user stories and collaborating with product owners and stakeholders
- Develop large-scale, flexible data pipelines that efficiently process healthcare data supporting all aspects of a health plan
- Work with peers across Surest to ensure alignment on data capabilities and software development lifecycle (SDLC)
Requirements:
- Undergraduate degree in engineering, mathematics, computer science, software development, or related technical field, or equivalent experience
- 10+ years of solid SQL or NoSQL query experience
- 5+ years designing and implementing efficient, end-to-end, flexible data pipelines
- 5+ years of experience with both batch and near-real-time data architectures
- 3+ years of PySpark and Python data development experience
- Experience implementing parallel processing data strategies
- Experience integrating strategies for data pipeline introspection at scale
- Experience working with Data Lake or Delta Lake architecture
- Solid analytical thinking, creativity, and willingness to be hands-on
- Demonstrated drive to improve development efficiency, data quality, and pipeline performance
- Excellent problem-solving, documentation, teamwork, and communication skills
- Superior attention to detail
- Experience working in Agile Development Framework
Preferred Qualifications:
- Post-graduate degree in engineering, mathematics, computer science, software development, or other technical field
- Experience working with healthcare data
- Experience developing and operating Databricks jobs as PySpark data pipelines
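To give candidates a concrete sense of the pipeline work described above, here is a minimal Python sketch of a batch step that validates incoming records and quarantines bad rows for data-quality review. This is a hypothetical illustration, not Surest code: the `Claim` record and its fields are invented for the example, and real health plan pipelines add far stricter security, lifecycle, and performance handling.

```python
from dataclasses import dataclass

# Hypothetical, simplified claims record. Real healthcare data has many
# more fields and must meet strict security and accessibility standards.
@dataclass
class Claim:
    claim_id: str
    member_id: str
    amount_cents: int

def run_batch(records: list[dict]) -> tuple[list[Claim], list[dict]]:
    """Validate raw rows, returning (valid claims, rejected rows)."""
    valid: list[Claim] = []
    rejected: list[dict] = []
    for row in records:
        # Basic data-quality checks: required IDs present, non-negative amount.
        if (
            row.get("claim_id")
            and row.get("member_id")
            and isinstance(row.get("amount_cents"), int)
            and row["amount_cents"] >= 0
        ):
            valid.append(Claim(row["claim_id"], row["member_id"], row["amount_cents"]))
        else:
            rejected.append(row)  # quarantined for downstream review
    return valid, rejected
```

In practice this kind of validate-and-quarantine step would run as a distributed PySpark transformation inside a Databricks job rather than a plain Python loop, but the data-quality pattern is the same.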