Job Description – Data Engineer (Snowflake, DBT, AWS)
Experience
5–7 years of experience in Data Engineering
Location: Chennai
Key skills: Snowflake, DBT, Fivetran, SnapLogic, Python; AWS is a must
Must-Have Skills
AWS (Mandatory): Hands-on experience with services like S3, Glue, Lambda, IAM, CloudWatch
Snowflake: Strong experience in data warehousing, query optimization, and performance tuning
DBT: Experience in building models, transformations, and tests
SQL: Strong knowledge of writing complex queries and data transformations
Fivetran & SnapLogic: Experience in building and managing data ingestion pipelines
Python: Scripting experience (basic to advanced) for data processing and automation
Good to Have
Experience with Airflow or a similar orchestration tool
Knowledge of data modeling (star schema, data marts)
Exposure to BI tools like Power BI or Tableau
Understanding of Agile methodology
Role Overview
You will work on building and maintaining data pipelines and transformation workflows using AWS, Snowflake, DBT, Fivetran, and SnapLogic to support analytics and reporting needs.
Key Responsibilities
Build and maintain data pipelines using Fivetran and SnapLogic
Develop data transformations in Snowflake using DBT
Write and optimize SQL queries
Use Python for automation and data processing
Work with AWS services for data storage and processing
Monitor pipelines, fix issues, and improve performance
Collaborate with team members and stakeholders to understand requirements