Booz Allen Hamilton, a leading consulting firm, is seeking a Health Services Data Engineer to help clients leverage data for impactful missions. In this role, you will build advanced technology solutions and lead data engineering activities, focusing on organizing and analyzing health data to support critical projects.
Responsibilities:
- Help our clients find answers in their data to impact important missions—from fraud detection to cancer research to national intelligence
- Build advanced technology solutions and lead data engineering activities on some of the most mission-driven projects in the industry
- Guide data engineering activities by overseeing the development and deployment of pipelines that organize and make disparate data meaningful
- Serve as a technical lead on analytic tasks in the federal health space
- Conduct analytic and engineering work as an individual contributor
- Organize analytic work for small- to medium-sized tasks and teams
- Help teams solve complex coding or analytic issues
- Mentor junior analysts and engineers
- Support the development and maintenance of scalable data pipelines that supply data in forms needed for business analysis
- Identify opportunities to leverage leading-edge analytic and engineering principles, theories, and concepts to help clients solve new and existing problems
- Collaborate with various team members, including quantitative and qualitative analysts, policy experts, developers, communications specialists, and project managers
Requirements:
- 6+ years of experience performing data management with health data in SQL and Python, including using Databricks or Snowflake and big data tools or frameworks, such as Spark
- Experience analyzing health insurance claims data in the Chronic Conditions Warehouse (CCW), Integrated Data Repository (IDR), or Centralized Data Repository (CDR)
- Experience with data validation, cleansing, and enrichment techniques to improve the accuracy and completeness of data
- Experience designing, implementing, and optimizing end-to-end data pipelines for ingesting, processing, and transforming large volumes of data
- Experience designing and developing ETL workflows using tools, such as Apache Spark or AWS Glue, and monitoring and troubleshooting ETL processes to identify and resolve issues in a timely manner
- Knowledge of healthcare data such as Provider Enrollment, Chain, and Ownership System (PECOS), Master Beneficiary Summary File (MBSF), Standard Analytic Files (SAF), or Prescription Drug Event (PDE) data
- Ability to design and maintain data models, schemas, and database structures to support operations and to build and maintain data integrations with internal and external data sources and APIs
- Ability to communicate complex methodologies clearly and present findings and recommendations to both technical and non-technical audiences
- Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
- Bachelor's degree in Epidemiology, Economics, Statistics, or Computer Science
- Experience with the federal procurement and proposal process
- Experience managing tasks for others
- Experience with Agile engineering practices
- Knowledge of quantitative methods for the detection of fraud, improper payments, or other payment anomalies
- Knowledge of healthcare pay-for-performance based payment adjustments
- Knowledge of quality measure calculation
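To illustrate the kind of data validation and cleansing work the role involves, here is a minimal, hypothetical sketch in standard-library Python. The field names (`claim_id`, `beneficiary_id`, `service_date`, `paid_amount`) are illustrative only and are not drawn from any actual CCW, IDR, or CDR schema.

```python
from datetime import date

# Hypothetical required fields for a claim record -- illustrative only,
# not an actual CCW/IDR/CDR schema.
REQUIRED_FIELDS = {"claim_id", "beneficiary_id", "service_date", "paid_amount"}

def validate_claim(record: dict) -> list[str]:
    """Return a list of validation errors for one claim record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    amount = record.get("paid_amount")
    if amount is not None and (not isinstance(amount, (int, float)) or amount < 0):
        errors.append("paid_amount must be a non-negative number")
    svc = record.get("service_date")
    if isinstance(svc, str):
        try:
            date.fromisoformat(svc)  # rejects malformed or impossible dates
        except ValueError:
            errors.append("service_date is not a valid ISO 8601 date (YYYY-MM-DD)")
    return errors

def cleanse(record: dict) -> dict:
    """Normalize string identifiers: trim whitespace, uppercase."""
    cleaned = dict(record)
    for key in ("claim_id", "beneficiary_id"):
        if isinstance(cleaned.get(key), str):
            cleaned[key] = cleaned[key].strip().upper()
    return cleaned

# Example: cleanse a raw record, then flag its invalid service date.
record = cleanse({"claim_id": " abc123 ", "beneficiary_id": "b001",
                  "service_date": "2023-02-30", "paid_amount": 125.0})
print(validate_claim(record))
```

In practice, checks like these would run inside a distributed framework such as Spark rather than over single dictionaries, but the pattern (validate, cleanse, report) is the same.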