
Role: Snowflake Developer
Location: Chicago, IL, or Tempe, AZ (hybrid)
Duration: 9 Months
Local candidates only. W2 only.
Required Qualifications:
4+ years of experience in data engineering or related roles
Proven track record of delivering production-ready data solutions at scale
Experience with version control systems (Git) and collaborative development practices
3+ years of hands-on experience with Snowflake data platform including advanced SQL, stored procedures, and performance optimization
Strong experience with data ingestion patterns including bulk loading, micro-batching, and streaming data processing
Proficiency with AWS services such as S3, Lambda, and CloudWatch
Experience with Azure data services including Data Factory, Event Hubs, Blob Storage, and Azure Functions
Solid Python programming skills for data processing, API integrations, and automation scripts
Experience with data modeling concepts and dimensional modeling techniques
Understanding of data security, governance, and compliance best practices
Responsibilities:
Design, develop, and maintain scalable data pipelines using Snowflake as the core data warehouse platform
Build and optimize data ingestion processes for both batch file-based loads and real-time streaming data from various sources
Implement data transformation logic using Snowflake SQL, stored procedures, and Python integration
Collaborate with data architects and analysts to understand business requirements and translate them into technical solutions
Monitor and troubleshoot data pipeline performance, ensuring high availability and data quality
Develop and maintain documentation for data processes, data models, and system architecture
Work closely with DevOps teams to implement CI/CD practices for data pipeline deployments