Unite Us is a company focused on improving connections to healthcare and social services across communities. It is seeking a Senior Data Engineer to build data warehouses, data lakes, and pipelines, ensuring data integrity and accessibility for stakeholders.
Responsibilities:
- Implement data architecture and infrastructure that align with business objectives, collaborating closely with Application Engineers and Product Managers to ensure the technical infrastructure robustly supports client requirements
- Create ETL and data pipeline solutions for efficiently loading data into the warehouse, and test them to ensure reliability and optimal performance
- Collect, validate, and provide high-quality data, ensuring data integrity
- Champion data democratization efforts, making data accessible to relevant stakeholders
- Guide the team on technical best practices and contribute substantially to the architecture of our systems
- Support operational work, such as onboarding new customers to our data products, and participate in the team's on-call rotation
- Engage with cross-functional teams, including Solutions Delivery, Business Intelligence, Predictive Analytics, and Enterprise Services, to address and support any data-related technical issues or requirements
Requirements:
- 6-8 years of experience working with data warehouses, data lakes, and ETL pipelines
- Proven experience with building optimized data pipelines using Snowflake
- Expertise in orchestrating data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows
- Experience designing ETL/ELT data pipelines using data transformation tools like dbt
- Advanced SQL knowledge, with experience authoring and optimizing complex queries, and strong familiarity with Snowflake or relational databases such as Redshift, Postgres, etc
- Strong proficiency in Python for building, optimizing, and maintaining data pipelines and services
- Experience with CI/CD pipelines to automate testing, deployment, and release of data engineering and analytics workflows, using tools such as GitHub Actions, Jenkins, etc
- Exposure to AWS and proficiency in cloud services such as ECS, S3, RDS, etc
- Experience with tools like Kubernetes, Terraform, Docker, Kafka
- Experience in developing applications for large enterprise clients
- Previous engagement with healthcare and/or social determinants of health data products
- Experience designing, building, and maintaining data pipelines for healthcare claims and clinical data, ensuring data quality, scalability, and compliance
- A dedicated focus on building high-performance systems
- Exposure to building data quality frameworks
- Experience using, and building solutions to support, reporting and end-user data tools (Tableau, ThoughtSpot, etc)
- Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data engineering issues and system failures
- Excellent communication skills, with the ability to communicate technical information to non-technical stakeholders and collaborate effectively with cross-functional teams
- The ability to envision and build scalable solutions that meet the diverse needs of enterprise clients with dedicated data teams