Develop and implement solutions for managing, processing, and analyzing large volumes of data.
These solutions will both feed Data Science and Artificial Intelligence models and support direct data analysis by business users.
To achieve this, you will work closely with Data Science and Analytics teams and with business users, supporting them in their analyses.
Solutions involve handling large datasets, integrating diverse data sources, and/or processing high-velocity data, with the aim of creating competitive advantages for the Company through intensive use of data.
Requirements
Experience with Data Modeling;
Experience with programming languages focused on data (Python, R, Scala);
Experience with SQL databases;
Experience developing and maintaining ETL/ELT processes and data pipelines;
Knowledge of data architecture and data processing in cloud environments;
Knowledge of Data Infrastructure;
Familiarity with Agile methodologies;
Knowledge of data administration and governance;
Ability to work independently and as part of a team.
KNOWLEDGE THAT MAKES A DIFFERENCE:
Previous experience in Data Engineering, Big Data management (with massively parallel processing), and interdisciplinary research;
Knowledge of dimensional modeling, BI, and Data Warehouse concepts;
Desirable: specific experience with a cloud provider (GCP and/or AWS and/or Azure);
Experience with the Power BI visualization tool;
Experience with Spark and Airflow.
Tech Stack
Airflow
AWS
Azure
Cloud
Google Cloud Platform
Python
Scala
Spark
SQL
Benefits
Profit Sharing
Food Allowance
Meal Allowance
Health Insurance
Dental Insurance
Gympass
Private Pension
Home Office Allowance
Allya
Unlimited access to a wide range of courses from our Localiza University