Santcore Technologies is seeking a Staff Data Engineer to design, build, and maintain scalable data pipelines and architecture. The role involves collaborating with analytics, product, and engineering teams to develop data models and enforce data quality standards.
Responsibilities:
- Design, build, and maintain scalable ELT pipelines from source to gold layer using dbt and Snowflake
- Ingest and harmonize data from enterprise systems including Oracle Fusion, Salesforce, HighRadius, and Salsify
- Architect source-to-gold data models across ingestion, staging, business logic, and analytics layers
- Apply AI tools throughout the SDLC to improve development efficiency and quality
- Collaborate with stakeholders to translate business requirements into data models
- Enforce data quality, lineage, testing, and documentation standards
- Contribute to data platform architecture and continuous improvement
- Mentor team members through code reviews, pairing, and knowledge sharing
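As a flavor of the staging-layer pipeline work described above, here is a minimal Python sketch of a transform that applies a not-null data-quality gate and deduplicates on a key. All names (`RawOrder`, `stage_orders`) are hypothetical illustrations, not part of any specific Santcore codebase; real pipelines would read from source-system extracts rather than in-memory lists.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical raw record shape; in practice this would come from a
# source-system extract (e.g. Oracle Fusion or Salesforce).
@dataclass(frozen=True)
class RawOrder:
    order_id: Optional[str]
    amount: float

def stage_orders(raw: list) -> list:
    """Staging-layer transform: drop records that fail a not-null check
    on the key, then deduplicate on order_id (last write wins)."""
    seen = {}
    for rec in raw:
        if rec.order_id is None:      # data-quality gate: reject null keys
            continue
        seen[rec.order_id] = rec      # dedupe: keep the latest record per key
    return list(seen.values())

rows = [RawOrder("A1", 10.0), RawOrder(None, 5.0), RawOrder("A1", 12.5)]
staged = stage_orders(rows)           # one surviving record for "A1"
```

In dbt, the same checks would typically live as schema tests (`not_null`, `unique`) rather than imperative code, with the dedupe expressed in a staging model.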
Requirements:
- Must be eligible to convert to a permanent position (no H-1B sponsorship available)
- Expert-level proficiency in Python for pipeline development, data transformation, and orchestration
- Advanced experience with dbt for data modeling, testing, and documentation
- Deep expertise in Snowflake including performance tuning, clustering, time travel, and data sharing
- Strong understanding of source-to-gold data architecture, including medallion or data mesh patterns
- Experience working with enterprise source systems such as Oracle Fusion, Salesforce, HighRadius, and Salsify
- Strong experience with AI-assisted development tools including GitHub Copilot, OpenAI Codex, and Claude
- Ability to demonstrate measurable productivity gains from AI-augmented development workflows
- Strong problem-solving and analytical skills
- Experience in retail environments, including Order Management Systems (OMS)
- Experience in multichannel commerce or omnichannel data integration
- Experience in Consumer Packaged Goods (CPG) domain
- Experience working with POS, inventory, fulfillment pipelines, or retail data flows
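The source-to-gold (medallion) layering mentioned in the requirements can be sketched as a chain of bronze → silver → gold steps. The example below is a plain-Python illustration with hypothetical SKU/quantity data, not a dbt implementation: bronze holds raw records, silver casts types and drops invalid rows, and gold produces a business-level aggregate.

```python
# Hypothetical medallion flow: bronze (raw) -> silver (cleaned) -> gold (aggregated).
bronze = [
    {"sku": "A", "qty": "3"},
    {"sku": "A", "qty": "2"},
    {"sku": "B", "qty": "x"},   # malformed quantity, dropped at the silver layer
]

def to_silver(rows):
    """Silver layer: cast types and drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"sku": r["sku"], "qty": int(r["qty"])})
        except ValueError:
            continue   # a real pipeline would quarantine or log bad rows
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (total units per SKU)."""
    agg = {}
    for r in rows:
        agg[r["sku"]] = agg.get(r["sku"], 0) + r["qty"]
    return agg

gold = to_gold(to_silver(bronze))   # {"A": 5}
```

In a dbt/Snowflake stack, each layer would be its own model (staging, intermediate, mart), with lineage and tests attached per layer.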