About this role
CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES serves some of the leading Fortune 500 companies across Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking. These are long-term relationships of more than 10 years, nurtured not only by our commitment to timely delivery of quality services but also by our investments and innovations in our clients' technology roadmaps. As an organization, we are in a phase of rapid growth, with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified, committed individuals to take on an exceptional role and support our accelerated growth.
You can learn more about us at: http://www.cesltd.com/
About the Role: Design, develop, and maintain scalable data pipelines for an enterprise HCM data platform serving K-12 education clients. You will build end-to-end data solutions using modern cloud technologies, implement medallion architecture (Bronze/Silver/Gold), and ensure data quality and governance across the platform.
Key Responsibilities:
Design and implement data ingestion pipelines using CDC (Change Data Capture) and CT (Change Tracking) from SQL Server sources
Build and maintain medallion architecture (Bronze → Silver → Gold) data layers in Snowflake
Develop data transformation models using tools like dbt, AWS Glue, or Snowflake native features (stored procedures, tasks)
Implement SCD Type 1 and Type 2 patterns for historical data tracking
Design and implement Row-Level Security (RLS) for multi-tenant data access
Create and optimize Snowflake objects including tables, views, streams, tasks, and stored procedures
Design data models supporting complex reporting requirements for absence management, workforce analytics, and substitute tracking
Ensure the Gold layer is optimized for BI tool consumption (Power BI, Tableau, or other reporting tools)
Establish data quality checks, monitoring, and alerting mechanisms
Document technical designs, data dictionaries, and data lineage
Mentor junior team members and conduct code reviews
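To illustrate one of the responsibilities above: the SCD Type 2 pattern preserves history by closing out a changed dimension row and appending a new current version. A minimal Python sketch of that merge logic follows, using in-memory dicts and hypothetical column names (emp_id, dept, valid_from/valid_to, is_current); in the actual platform this would be expressed as a Snowflake MERGE or a dbt snapshot rather than application code.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today):
    """Apply SCD Type 2: close out changed rows and append new versions.

    dimension: list of dicts carrying 'valid_from', 'valid_to', 'is_current'
    incoming:  list of dicts keyed by `key` with the tracked attribute
    (Illustrative shape only; not the platform's actual schema.)
    """
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        existing = current.get(rec[key])
        # Only act when the key is new or the tracked attribute changed
        if existing is None or existing[tracked] != rec[tracked]:
            if existing is not None:
                # Close out the old version to preserve history
                existing["valid_to"] = today
                existing["is_current"] = False
            dimension.append({
                key: rec[key],
                tracked: rec[tracked],
                "valid_from": today,
                "valid_to": None,
                "is_current": True,
            })
    return dimension

# Hypothetical employee dimension: one current row, then a department change
dim = [{"emp_id": 1, "dept": "Math", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"emp_id": 1, "dept": "Science"}],
                 key="emp_id", tracked="dept", today=date(2025, 1, 1))
```

After the call, the dimension holds two rows for emp_id 1: the closed-out "Math" version and a new current "Science" version, which is exactly the history a Type 2 dimension is meant to retain.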
Required Skills & Qualifications:
Snowflake: 2+ years hands-on experience with data warehousing, performance tuning, secure views, streams/tasks
SQL: Advanced SQL skills including complex joins, window functions, CTEs, and query optimization
Data Transformation: Experience with any transformation framework (dbt, AWS Glue, Snowflake stored procedures, or similar)
Data Modeling: Strong experience with dimensional modeling, star/snowflake schemas, and SCD implementations
ETL/ELT Pipelines: Experience with data pipeline design patterns, Change Tracking, and incremental loading strategies
SQL Server: Experience with SQL Server as source system including CDC and Change Tracking
Python: Proficiency in Python for data processing and automation scripts
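As a sketch of the incremental-loading pattern listed above: Change Tracking lets a pipeline pull only rows whose change version exceeds the last synced watermark. The snippet below assumes a hypothetical source shape of (change_version, row) tuples, loosely modeled on what a CHANGETABLE query returning SYS_CHANGE_VERSION might yield; it is an illustration of the watermark logic, not the platform's actual ingestion code.

```python
def incremental_load(source_rows, last_version):
    """Return rows changed since last_version plus the new watermark.

    source_rows: iterable of (change_version, row) tuples (hypothetical
    shape for a Change-Tracking query result).
    """
    # Keep only rows newer than the previously synced version
    new_rows = [row for version, row in source_rows if version > last_version]
    # Advance the watermark to the highest version seen (or keep it as-is)
    new_version = max((v for v, _ in source_rows), default=last_version)
    return new_rows, new_version

rows = [(10, {"id": 1}), (12, {"id": 2}), (15, {"id": 3})]
changed, watermark = incremental_load(rows, last_version=11)
# changed holds the rows at versions 12 and 15; watermark advances to 15
```

Persisting the returned watermark between runs is what makes the load incremental: each sync resumes from the last committed version instead of rescanning the source table.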
Nice to Have:
CDC-based ingestion tools (Snowflake OpenFlow, Fivetran, Airbyte, AWS DMS)
SPCS (Snowpark Container Services) for containerized workloads
BI Tools: Understanding of how BI tools (Power BI, Tableau, QuickSight) consume data via DirectQuery/Live Connection
Git: Version control and CI/CD pipelines for data projects
Orchestration tools (Airflow, Dagster, AWS Step Functions) for pipeline scheduling
AWS Services: S3, Glue, Lambda, PrivateLink, VPC connectivity
Snowflake certifications (SnowPro Core, SnowPro Advanced Data Engineer)