Delivers competent data infrastructure development with limited coaching and guidance
Pipeline Design and Development
Architects and builds scalable data pipelines using modern ETL (Extract, Transform, Load) tools and frameworks such as dbt (data build tool), Apache Airflow, or similar.
Automates data ingestion from a variety of sources, including databases, APIs, and third-party services, as sketched below.
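A minimal sketch of the kind of pipeline this role builds, using Airflow's TaskFlow API; the endpoint URL, schedule, and load step are hypothetical placeholders, not details of this position's actual stack.

```python
# Minimal Airflow DAG sketch: ingest records from a third-party API daily.
# The endpoint and load target are hypothetical.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def api_ingestion_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a hypothetical third-party API.
        resp = requests.get("https://api.example.com/v1/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(records: list[dict]) -> None:
        # A real pipeline would COPY these into Snowflake/Redshift/BigQuery;
        # here we only report the batch size.
        print(f"Loaded {len(records)} records")

    load(extract())


api_ingestion_pipeline()
```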
Data Storage and Management
Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery.
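A minimal sketch of the warehouse-loading side of this work, using the snowflake-connector-python package; the account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal sketch: ingest staged JSON files into a Snowflake table.
# All connection parameters and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # hypothetical
    user="etl_user",         # hypothetical
    password="...",          # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO ingests files from a named stage into a raw table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @orders_stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    print(cur.fetchall())  # per-file load status
finally:
    conn.close()
```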
Requirements
PySpark: Intermediate (4-6 years; see the sketch after this list)
Python: Intermediate
Databricks: Intermediate (4-6 years)
Snowflake: Intermediate (4-6 years)
Data Warehousing / Data Engineering
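A minimal PySpark sketch of the intermediate-level transformation work these requirements imply; the source path, column names, and target table are hypothetical.

```python
# Minimal PySpark sketch: roll raw orders up into a daily revenue table.
# The S3 path, columns, and target table are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.count("*").alias("order_count"),
    )
)

daily_revenue.write.mode("overwrite").saveAsTable("analytics.daily_revenue")
```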
Tech Stack
Airflow
Amazon Redshift
Apache
BigQuery
ETL
PySpark
Python
Benefits
Competitive salary and performance-based bonuses
Comprehensive benefits package
Career development and training opportunities
Flexible work arrangements (remote and/or office-based)
Dynamic and inclusive work culture within a globally renowned group