Combine data from multiple sources such as APIs, databases, and third-party systems;
Design and execute ETL/ELT workflows using tools such as Cloud Composer (Airflow) and Dataflow;
Develop robust and efficient data pipelines, implement processing logic, and perform data transformations;
Optimize BigQuery queries and allocate resources effectively to minimize costs and boost performance;
Maintain data quality, availability, and security by adhering to industry best practices.
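The pipeline responsibilities above can be sketched as a minimal extract/transform/load flow in plain Python. This is only an illustration: the record shape, field names, and sources are assumptions, not details from this posting, and a production version would typically run as an Airflow DAG with a real BigQuery load job rather than an in-memory list.

```python
from dataclasses import dataclass

# Hypothetical record type; field names are illustrative only.
@dataclass
class Order:
    order_id: int
    amount_cents: int
    source: str

def extract(api_rows, db_rows):
    """Combine rows pulled from an API and a database into one stream."""
    yield from (Order(r["id"], r["amount"], "api") for r in api_rows)
    yield from (Order(r["id"], r["amount"], "db") for r in db_rows)

def transform(orders):
    """Deduplicate by order_id, keeping the first occurrence seen."""
    seen = set()
    for order in orders:
        if order.order_id not in seen:
            seen.add(order.order_id)
            yield order

def load(orders):
    """Stand-in for a warehouse write (e.g. a BigQuery load job)."""
    return list(orders)

# Toy inputs standing in for an API response and a database query result.
api_rows = [{"id": 1, "amount": 500}, {"id": 2, "amount": 700}]
db_rows = [{"id": 2, "amount": 700}, {"id": 3, "amount": 900}]
loaded = load(transform(extract(api_rows, db_rows)))
```

Structuring the steps as small composable generators keeps each stage independently testable, which mirrors how task boundaries are usually drawn in an orchestrated pipeline.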
Requirements
Demonstrated experience as a Data or Software Engineer in a senior role
Prior experience with dbt and Python is advantageous
Familiarity with CI/CD systems like GitLab is required
Experience with any public cloud provider
Knowledge of Google Cloud and Cloud Data Warehousing is highly beneficial
Specifically, experience with BigQuery or Snowflake and their integration with dbt is preferred, and proficiency with Terraform is strongly desired.
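As a hedged illustration of the Terraform proficiency mentioned above, a minimal fragment provisioning a BigQuery dataset with the Google provider might look like the following; the project ID, dataset ID, location, and labels are all placeholder values, not details from this role.

```hcl
# Illustrative only: project, dataset_id, and location are placeholders.
resource "google_bigquery_dataset" "analytics" {
  project    = "my-gcp-project"
  dataset_id = "analytics"
  location   = "EU"

  labels = {
    managed_by = "terraform"
  }
}
```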
Tech Stack
Airflow
BigQuery
Google Cloud
ETL
Python
Terraform
Benefits
Diverse and technically challenging projects
Training budget for professional and personal growth
Flexible working hours and a hybrid workplace model