Optum Tech, a global leader in health care innovation, is seeking a Principal Data Engineer to design, build, and scale enterprise data platforms and pipelines. This role focuses on cloud-based data architecture and hands-on development to support large-scale data and analytics initiatives.
Responsibilities:
- Lead the design and implementation of cloud-based data architectures and data solutions
- Build, optimize, and maintain scalable data pipelines on GCP or Azure
- Provide hands-on technical leadership in data platform and solution architecture
- Work across cloud and on-premises environments to support enterprise data needs
- Collaborate with engineering and analytics teams to ensure data reliability, performance, and scalability
- Guide junior engineers through technical mentorship and solution design reviews
- Leverage enterprise-approved AI tools to streamline workflows, automate tasks, and drive continuous improvement
- Ensure data quality, observability, lineage, and documentation (e.g., Great Expectations/dbt tests, Cloud Monitoring/Logging, Data Catalog)
- Implement security, IAM, and compliance controls; manage access, encryption, and PII handling
- Partner with Analytics/BI, Data Science, and Product teams to deliver trustworthy datasets and metrics
- Troubleshoot and resolve data pipeline incidents; drive root-cause analysis and preventive improvements
- Contribute to data platform roadmaps, standards, and best practices
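To give a flavor of the data-quality responsibility above, here is a minimal, library-free Python sketch of the kind of batch validation a pipeline might run before publishing a dataset. The record fields, thresholds, and `check_quality` helper are illustrative assumptions; real pipelines at this level would typically use a framework such as Great Expectations or dbt tests, as noted in the bullet list.

```python
# Minimal sketch of a pre-publish data-quality gate for a pipeline batch.
# Field names and thresholds are hypothetical examples, not a real schema.

def check_quality(rows, required_fields, min_rows=1):
    """Return a list of human-readable failures for a batch of records."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    for i, row in enumerate(rows):
        for field in required_fields:
            # Treat None and empty string as missing values.
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
    return failures

batch = [
    {"member_id": "A1", "claim_amount": 120.0},
    {"member_id": "", "claim_amount": 75.5},
]
print(check_quality(batch, required_fields=["member_id", "claim_amount"]))
# → ["row 1: missing required field 'member_id'"]
```

A check like this would run as a pipeline step, failing the load (or quarantining the batch) when the returned list is non-empty.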
Requirements:
- 8+ years of experience in data engineering or related roles
- Demonstrated experience designing and building data pipelines in enterprise environments
- Solid hands-on experience with cloud platforms (GCP preferred; AWS or Azure acceptable)
- Experience with on-premises and cloud-based data architectures
- Hands-on experience developing and optimizing data models (Databricks, Snowflake, or similar platforms) and ELT/ETL workflows to support analytics and ML
- Solid SQL and Python development skills for data processing and pipeline development; hands-on experience with Cloud Storage
- Experience implementing streaming and micro-batch solutions for near-real-time use cases
- Demonstrated experience building CI/CD workflows for data pipelines and infrastructure-as-code using Terraform
- Solid understanding of data warehousing concepts, dimensional modeling, and ELT patterns
- Experience with batch and streaming data processing at scale
- Experience supporting large-scale analytics or data platforms
- Familiarity with multi-cloud or hybrid architectures
- Experience working in enterprise-scale environments
- Reside in Minnesota
- All employees working remotely will be required to adhere to UnitedHealth Group's Telecommuter Policy
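As an illustration of the micro-batch requirement above, the sketch below buffers a stream of events and emits them in fixed-size batches, the basic pattern behind near-real-time loads. The event values, batch size, and `micro_batches` generator are hypothetical, shown only to indicate the kind of work involved.

```python
# Minimal micro-batch sketch: buffer streaming events and flush them in
# fixed-size batches (the final batch may be short). Event payloads and
# batch size are illustrative placeholders.

def micro_batches(events, batch_size=3):
    """Yield lists of events grouped into fixed-size micro-batches."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush any trailing partial batch

stream = ["e1", "e2", "e3", "e4", "e5"]
print(list(micro_batches(stream)))
# → [['e1', 'e2', 'e3'], ['e4', 'e5']]
```

In production, each yielded batch would be handed to a sink (e.g., a warehouse load job) on a short interval rather than printed.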