EXL is seeking a Data Engineer specializing in GCP. The role involves building data pipelines, analyzing large datasets, and collaborating with stakeholders on migration plans.
Responsibilities:
- Design and build data pipelines and workflows on GCP services such as BigQuery, Airflow, Dataflow, and Dataproc
- Extract, analyze, and validate large datasets from multiple data sources
- Write Python code and optimized SQL/HQL queries
- Build migration plans in collaboration with stakeholders
Requirements:
- 5+ years of total experience in data engineering and analytics
- 2 years of experience with GCP services such as BigQuery, Airflow (DAG development), Dataflow, and Dataproc
- 2 years of experience in data extraction and building data pipeline workflows on big data platforms, with a solid grasp of data engineering concepts
- Experience analyzing large datasets from multiple data sources and validating data
- Experience writing code in Python
- Knowledge of SQL/HQL for writing optimized queries
- Ability to build a migration plan in collaboration with stakeholders
- Strong analytical and problem-solving skills
- Nice to have: knowledge of Hadoop ecosystem components such as HDFS, Spark, Hive, and Sqoop