Core & Main is a leader in advancing reliable infrastructure with local service nationwide. They are seeking a Senior Data Engineer to design, develop, test, and maintain data solutions and pipelines, serving as a subject matter expert in data engineering and cloud-based platforms.
Responsibilities:
- Collaborate with stakeholders to analyze data solution requirements, identify gaps, and assess feasibility
- Take responsibility for the design, documentation, and implementation of technical solutions that meet business and functional requirements
- Develop, implement, and maintain data pipelines, data warehouses, and data lake solutions
- Ensure data systems possess sufficient controls and meet compliance standards
- Perform unit testing prior to moving code/configuration to the QA process
- Evaluate and research upgrades, patches, and new functionality
- Troubleshoot and resolve defects in data solutions
- Contribute to the development and definition of test plans and scripts for performance, regression, and user acceptance testing; support QA activities as required
- Build and maintain data models, data mappings, transformation rules, workflows, data extractions and imports, interfaces, and object models
- Ensure data solutions comply with security protocols and data governance standards
- Share expertise with team members and participate in peer reviews to uphold technical standards
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5+ years of hands-on development experience in SQL and/or Python for data warehouse management, data integration, and data lake management
- Deep working knowledge of SQL development, using T-SQL to design, implement, and optimize complex database objects such as tables, views, stored procedures, indexes, and functions
- Experience working with Azure data architecture, including a solid understanding of tools for building data pipelines on cloud-based data platforms, such as the Delta Lakehouse medallion architecture and data warehousing solutions
- Exposure to modern Spark-based data platforms like Databricks or Microsoft Fabric for data engineering tasks, including leveraging their capabilities for scalable data processing, analytics, and machine learning workflows in a cloud-based environment
- Understanding of ELT vs ETL and how to build efficient data pipelines with modern Change Data Capture processes
- Hands-on experience with CI/CD pipelines in Azure DevOps and understanding of Agile development methodologies
- Familiarity with common data mapping and transformation techniques for Dynamics 365 Data Entities and Data Management Framework for the Finance and Operations modules
- Familiarity with Power BI and its integration with Microsoft Fabric for end-to-end analytics
- Strong communication skills with the ability to translate complex technical concepts into business-friendly language
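To illustrate the kind of pipeline work the requirements describe, here is a minimal sketch of a watermark-based incremental load, one simple Change Data Capture pattern used in ELT pipelines. It uses Python's built-in sqlite3 for self-containment; the table and column names (bronze_orders, silver_orders, etl_watermark) are invented for the example, not taken from any Core & Main system.

```python
import sqlite3

def incremental_load(conn: sqlite3.Connection) -> int:
    """Copy only rows newer than the stored watermark from the raw
    (bronze) table into the cleaned (silver) table, then advance
    the watermark. Returns the number of rows loaded."""
    cur = conn.cursor()
    # Read the high-water mark left by the previous run.
    (watermark,) = cur.execute(
        "SELECT last_loaded_id FROM etl_watermark"
    ).fetchone()
    # Extract and lightly transform only the new rows.
    rows = cur.execute(
        "SELECT id, UPPER(TRIM(customer)), amount "
        "FROM bronze_orders WHERE id > ?",
        (watermark,),
    ).fetchall()
    cur.executemany(
        "INSERT INTO silver_orders (id, customer, amount) VALUES (?, ?, ?)",
        rows,
    )
    if rows:
        # Advance the watermark so reruns skip already-loaded rows.
        cur.execute(
            "UPDATE etl_watermark SET last_loaded_id = ?",
            (max(r[0] for r in rows),),
        )
    conn.commit()
    return len(rows)

# Toy database to exercise the pipeline.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE bronze_orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
    CREATE TABLE silver_orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
    CREATE TABLE etl_watermark (last_loaded_id INTEGER);
    INSERT INTO etl_watermark VALUES (0);
    INSERT INTO bronze_orders VALUES (1, ' acme ', 10.0), (2, 'globex', 20.0);
    """
)
print(incremental_load(conn))  # first run loads both existing rows
conn.execute("INSERT INTO bronze_orders VALUES (3, 'initech', 5.0)")
print(incremental_load(conn))  # second run loads only the new row
```

The same watermark idea scales up directly: in a Databricks or Microsoft Fabric lakehouse the bronze/silver tables become Delta tables and the transformation runs in Spark, but the incremental-load logic is unchanged.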