Intrado is dedicated to saving lives and protecting communities, helping them prepare for, respond to, and recover from critical events. The company is seeking an exceptional Data Engineer to build robust data pipelines that power its internal business analytics, ensuring that raw data from multiple systems is consistently ingested, cleaned, and made ready for analysis.
Responsibilities:
- Build and maintain Azure Data Factory pipelines to ingest data from multiple sources
- Write Python code in Databricks to clean raw data and promote it to the silver layer, handling deduplication, type casting, and validation (a minimal sketch of this step follows the list)
- Monitor daily jobs and troubleshoot failures, acting as the first line of defense in keeping pipelines stable
- Implement automated checks to verify that data arriving in the lake matches the source systems
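As a concrete illustration of the bronze-to-silver cleaning work described above, here is a minimal PySpark sketch of the kind of transformation that runs in Databricks. All table names, columns, business keys, and validation rules below are hypothetical placeholders, not Intrado's actual pipeline:

```python
# Minimal bronze-to-silver cleaning sketch (PySpark / Databricks).
# Table names, columns, keys, and rules are hypothetical illustrations.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Read raw records from a hypothetical bronze table.
bronze = spark.read.table("bronze.orders")

silver = (
    bronze
    # Deduplication: keep the latest record per presumed business key.
    .withColumn(
        "rn",
        F.row_number().over(
            Window.partitionBy("order_id").orderBy(F.col("ingested_at").desc())
        ),
    )
    .filter(F.col("rn") == 1)
    .drop("rn")
    # Type casting: enforce expected types on key columns.
    .withColumn("order_total", F.col("order_total").cast("decimal(18,2)"))
    .withColumn("order_date", F.to_date("order_date"))
    # Validation: drop rows missing required fields.
    .filter(F.col("order_id").isNotNull() & F.col("order_date").isNotNull())
)

# Write the cleaned data to the silver layer as a Delta table.
silver.write.mode("overwrite").saveAsTable("silver.orders")
```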
Requirements:
- 5+ years of experience in Data Engineering, specifically focused on building and maintaining ETL/ELT pipelines for large-scale operational and financial data in a cloud environment
- Proficiency in building and optimizing data pipelines using Azure Data Factory and Databricks
- Strong proficiency in SQL for data analysis and Python for scripting and transformation
- Experience implementing automated data quality checks (e.g., schema validation, null checks), with a proactive approach to identifying pipeline failures and implementing fixes to prevent recurrence (a sketch of such checks follows this list)
- Experience working with data schemas and APIs from common enterprise platforms such as Microsoft Dynamics 365 F&O, Salesforce, and ServiceNow
- Demonstrated experience using large language models (LLMs) to streamline data engineering workflows and improve development efficiency
- Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or a closely related technical field
- Prior experience working in a technology company or SaaS environment
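To make the data quality requirement concrete, below is a minimal PySpark sketch of the kinds of automated checks involved: schema validation, null checks, and a row-count reconciliation against the source. The expected schema, table names, and the use of the bronze layer as a stand-in for the source system's count are illustrative assumptions, not the actual checks used at Intrado:

```python
# Sketch of automated data quality checks: schema validation, null checks,
# and a row-count reconciliation. All names here are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, StringType, DateType, DecimalType
)

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("silver.orders")

# Schema validation: fail fast if the table drifts from the expected contract.
expected = StructType([
    StructField("order_id", StringType(), False),
    StructField("order_date", DateType(), True),
    StructField("order_total", DecimalType(18, 2), True),
])
assert df.schema == expected, f"Schema drift detected: {df.schema}"

# Null check on a required column.
null_count = df.filter(df["order_id"].isNull()).count()
assert null_count == 0, f"{null_count} rows missing order_id"

# Reconciliation: compare the lake's row count against the source.
# In production this count would come from the source system's API or a
# control table; the bronze layer is used here as a simple stand-in.
source_count = spark.read.table("bronze.orders").select("order_id").distinct().count()
lake_count = df.count()
assert lake_count == source_count, (
    f"Row-count mismatch: lake={lake_count}, source={source_count}"
)
```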