Supplier.io is the market leader in supplier intelligence, trusted by over half of the Fortune 100 to power smarter, more responsible sourcing decisions. As a Senior Data Engineer, you will play a critical role in scaling and modernizing our data platform, driving the execution of our long-term data strategy and ensuring our data systems support business growth.
Responsibilities:
- Lead the design, development, and evolution of Supplier.io data architecture, ensuring scalability, reliability, performance and security
- Execute and influence a cohesive data strategy that aligns with company objectives and supports analytics, reporting, and downstream product use cases
- Design, build, and maintain robust, cloud-native data pipelines and data warehouses, with Snowflake as a core component of the platform
- Own complex data modeling initiatives, including dimensional and analytical models that support business intelligence and advanced analytics
- Evaluate, introduce, and integrate new tools, technologies, and methodologies, and align legacy systems with them, to improve data ingestion, transformation, storage, and processing
- Drive continuous improvement by optimizing data pipelines, query performance, reliability, observability, and cost efficiency
- Partner with Infrastructure, Product, Analytics, and Engineering teams to ensure data systems meet best practices, security standards, and business needs
- Provide technical leadership and mentorship to other data engineers, contributing to code reviews, design discussions, and engineering standards
- Create and maintain comprehensive technical documentation, including architecture diagrams, data flow maps, runbooks, security considerations, and operations procedures
- Participate in agile planning and execution, contributing to estimation, prioritization, and delivery of high-impact data initiatives
- Troubleshoot and resolve complex, cross-system data issues and incidents
- Perform other duties as assigned
Requirements:
- Bachelor's degree in Computer Science, Management Information Systems, Engineering, Data Science, or a related field
- 7+ years of progressive experience in data engineering with demonstrated ownership of complex data systems in production environments. At least 2 years in a senior or lead capacity preferred
- Advanced experience designing, building, and supporting Snowflake-based solutions at scale
- Proven experience architecting and implementing large-scale data solutions (e.g., Azure, AWS, GCP, or multi-cloud environments)
- Strong expertise in data modeling, ETL/ELT processes, and modern data warehousing principles
- Advanced proficiency with Snowflake, Python, and SQL for building reliable, testable data pipelines
- Strong experience with big data technologies, orchestration frameworks, and data pipeline tools (e.g., Airflow, dbt, Spark, or similar)
- Experience working in an agile development environment and collaborating through ticketing systems such as Jira and Azure DevOps
- Ability to communicate technical concepts clearly to technical and non-technical teams and influence decision-making
- Strong problem-solving skills with the ability to troubleshoot and resolve ambiguous, high-impact issues
- A results-oriented mindset with a demonstrated history of driving process improvements and technical excellence
- Ability to work independently while also serving as a trusted technical partner and mentor to others
- Ability to take vague requirements and turn them into technical roadmaps