The Data Engineer will design, implement, and maintain data pipelines, using platforms such as Snowflake, Fivetran, dbt Cloud, and Airflow to ensure efficient data flow and transformation.
They will be responsible for building and optimizing scalable, reliable data architectures that support the organization’s analytical and business intelligence needs.
The engineer will collaborate with cross-functional teams to prioritize data integration, ensure high-quality data delivery, and address data validation and governance requirements.
Their primary focus will be improving data ingestion by building and refining extract, transform, and load (ETL) workflows, increasing data accessibility and usability.
In this role, the Data Engineer will partner with stakeholders to define essential data requirements, implement data transformation logic, and ensure data models are efficient and optimized for reporting and analytics.
They will ensure data is available in real-time or near-real-time, as business needs require, monitor pipeline performance, and troubleshoot any issues to maintain data flow reliability.
The individual will be skilled in using data visualization and monitoring tools to ensure data accuracy and facilitate performance tracking.
They will create and manage automated data workflows using tools such as Airflow, Fivetran, and dbt Cloud, ensuring data is processed and delivered efficiently to Snowflake for further analysis.
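The extract-transform-load workflow described above can be sketched in miniature. This is an illustrative sketch only: the record layout, cleaning rules, and in-memory "warehouse" are assumptions standing in for what a real pipeline would do with Fivetran, dbt, and Snowflake.

```python
# Minimal ETL sketch. All field names and cleaning rules here are
# hypothetical examples, not taken from any actual pipeline.

def extract() -> list[dict]:
    # Stand-in for pulling raw rows from a source system (e.g. via Fivetran).
    return [
        {"patient_id": " 001 ", "visits": "3"},
        {"patient_id": "002", "visits": None},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Basic validation/cleaning: trim identifiers, default missing counts to 0.
    return [
        {
            "patient_id": row["patient_id"].strip(),
            "visits": int(row["visits"] or 0),
        }
        for row in rows
    ]

def load(rows: list[dict], warehouse: list[dict]) -> None:
    # Stand-in for a Snowflake COPY/INSERT; appends to an in-memory "table".
    warehouse.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(warehouse)
# [{'patient_id': '001', 'visits': 3}, {'patient_id': '002', 'visits': 0}]
```

In practice each stage would be a separate task in an orchestrator such as Airflow, so failures in extraction, transformation, or loading can be monitored and retried independently.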
Requirements
Bachelor’s degree in Clinical Informatics, Business, Accounting, Health Care Management, or a related field required
Minimum of 5-7 years of experience in data analytics, business intelligence, ETL/ELT tools, data governance, reporting, informatics, or a related area