Algoworks is seeking an experienced ELT / EDI Lead Engineer to lead the design and implementation of data pipelines for a supply chain transaction analytics platform. In this hands-on leadership role, you will guide a team of data engineers while actively contributing to ELT pipeline development and ensuring that EDI-driven transaction data is standardized and interpreted consistently.
Responsibilities:
- Lead ingestion and normalization of EDI/X12 transaction data (850, 855, 856, 810, 997) across multiple source systems
- Define consistent interpretation of transaction lifecycle states (received, processed, failed, delayed, acknowledged, etc.)
- Standardize data across different trading partners with varying schemas and formats
- Work closely with business stakeholders to define supply chain KPIs (transaction success rates, processing delays, error patterns)
- Design and implement ELT pipelines using Snowflake, Azure data services, or similar platforms
- Define and enforce Bronze (raw), Silver (cleaned), and Gold (analytics-ready) data layers
- Develop transformation logic for structured and semi-structured data (XML, JSON, EDI payloads)
- Ensure pipelines are scalable, reliable, and optimized for performance
- Guide data engineers on best practices for pipeline development and data modeling
- Use AI tools such as Cursor and GitHub Copilot to accelerate SQL development, transformation logic, and pipeline design
- Leverage LLM-based tools to analyze EDI schemas, summarize structure differences, and assist in normalization design
- Use Snowflake Cortex (where applicable) for query optimization, classification (e.g., error grouping), and performance improvements
- Apply AI tools to rapidly prototype transformation logic and refine through manual validation
- Ensure all AI-generated outputs are thoroughly reviewed for correctness, especially in business-critical transaction logic
- Define data validation rules for transaction completeness, accuracy, and consistency
- Ensure alignment between source data and reporting outputs through reconciliation logic
- Establish semantic consistency across KPIs and reporting layers
- Identify and resolve issues related to schema inconsistencies, missing data, and transformation errors
- Provide hands-on technical leadership to a team of data engineers
- Review code, transformation logic, and pipeline implementations
- Collaborate with QA, BI, and DevOps teams to ensure end-to-end data quality and delivery
- Act as a key technical point of contact for stakeholders and delivery leadership
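By way of illustration, one of the lifecycle-interpretation tasks above — mapping X12 997 acknowledgment codes onto normalized lifecycle states — might be sketched as follows. The state labels and default behavior are illustrative assumptions, not part of the role description; the actual state vocabulary would come from the KPI definitions agreed with business stakeholders.

```python
# X12 997 AK5-01 transaction-set acknowledgment codes (per the X12 standard):
# A = Accepted, E = Accepted but errors were noted, M = Rejected (message
# authentication failed), R = Rejected, W = Rejected (assurance failed),
# X = Rejected (content could not be analyzed/decrypted).
AK5_TO_STATE = {
    "A": "acknowledged",
    "E": "acknowledged",  # accepted; errors surface in downstream error KPIs
    "M": "failed",
    "R": "failed",
    "W": "failed",
    "X": "failed",
}

def lifecycle_state(ak501_code: str) -> str:
    """Map a 997 AK5-01 code onto a normalized lifecycle state.

    Unknown or missing codes fall back to "received" (assumed default:
    the transaction was ingested but not yet acknowledged).
    """
    return AK5_TO_STATE.get(ak501_code.strip().upper(), "received")
```

Keeping this mapping in one place, rather than re-deriving it per trading partner, is what makes "consistent interpretation of transaction lifecycle states" enforceable across pipelines.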
Requirements:
- 8+ years of experience in data engineering, with strong focus on ELT/ETL pipelines
- Deep expertise in EDI/X12 transactions (850, 855, 856, 810, 997) and transaction lifecycles
- Strong hands-on experience with Snowflake, Azure SQL, or similar data warehouse platforms
- Advanced SQL skills for complex transformations and performance optimization
- Experience working with semi-structured data (XML, JSON, EDI formats)
- Strong understanding of data modeling and medallion architecture (Bronze/Silver/Gold)
- Hands-on experience using AI tools such as:
  - Cursor (SQL and transformation development)
  - GitHub Copilot (code generation and optimization)
  - LLM-based tools (schema analysis, documentation, and design support)
- Ability to use AI tools for:
  - Accelerating SQL development and transformation logic
  - Analyzing complex EDI schemas and identifying patterns
  - Generating and refining pipeline logic
- Strong ability to validate and refine AI-generated outputs, especially for business-critical transaction data
- Experience working in environments where AI is used to improve productivity while maintaining strict quality standards
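To give a concrete flavor of the validation and reconciliation work described above, a completeness rule such as "every purchase order (850) must receive an acknowledgment (855) within its SLA window" might be prototyped as below. The record shape, field names, and 24-hour SLA are illustrative assumptions only.

```python
from datetime import datetime, timedelta

# Assumed acknowledgment SLA; in practice this would be a per-partner setting.
ACK_SLA = timedelta(hours=24)

def unacknowledged_pos(transactions: list[dict], now: datetime) -> list[str]:
    """Return PO numbers that have an 850 but no matching 855 past the SLA.

    Each transaction is assumed to be a dict with "po_number", "set_code"
    (the X12 transaction set code as a string), and "ts" (a datetime).
    """
    order_ts: dict[str, datetime] = {}  # po_number -> 850 timestamp
    acknowledged: set[str] = set()
    for t in transactions:
        if t["set_code"] == "850":
            order_ts[t["po_number"]] = t["ts"]
        elif t["set_code"] == "855":
            acknowledged.add(t["po_number"])
    return sorted(
        po for po, ts in order_ts.items()
        if po not in acknowledged and now - ts > ACK_SLA
    )
```

Rules like this one feed both the data-quality gates between Silver and Gold layers and the processing-delay KPIs surfaced to stakeholders.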