You'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough to work at the size and scope of the company.
You will be tasked with creating custom-built pipelines on the GCP stack.
You will be part of teams that implement vendor-sourced enterprise software: configuring it, customizing it, and integrating it with other internal systems.
Requirements
5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets.
Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among other areas.
Build modular, reusable pipeline code and ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources; a minimal sketch follows this list.
Work closely with analysts and business process owners to translate business requirements into technical solutions.
Coding experience in scripting and programming languages (shell scripting, Python, SQL).
Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space: BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Cloud SDK, Cloud Pub/Sub, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Bigtable, and (good to have) Dataproc.
Maintain the highest standards of development practice, including technical design, solution development, systems configuration, test documentation and execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
Understanding of CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK; a sketch of triggering a Cloud Build run from Python follows below.
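
To make the ingestion-framework requirement above concrete, here is a minimal sketch in Python, assuming the google-cloud-bigquery client library is installed and credentials are configured. The IngestionSource interface, bucket path, and table ID are hypothetical, for illustration only.

# A minimal sketch of a modular ingestion framework: each source is a
# pluggable implementation, and a single ingest() function loads any of
# them into BigQuery. Names below are hypothetical.
from abc import ABC, abstractmethod
from google.cloud import bigquery


class IngestionSource(ABC):
    """Hypothetical interface: one implementation per source."""

    @abstractmethod
    def uri(self) -> str:
        """Return the GCS URI (or pattern) of the files to load."""

    @abstractmethod
    def job_config(self) -> bigquery.LoadJobConfig:
        """Return the load configuration for this source's format."""


class CsvGcsSource(IngestionSource):
    """Example source: CSV files staged in a GCS bucket."""

    def __init__(self, uri: str):
        self._uri = uri

    def uri(self) -> str:
        return self._uri

    def job_config(self) -> bigquery.LoadJobConfig:
        return bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
        )


def ingest(source: IngestionSource, table_id: str) -> None:
    """Load one source into a BigQuery table and wait for completion."""
    client = bigquery.Client()
    job = client.load_table_from_uri(
        source.uri(), table_id, job_config=source.job_config()
    )
    job.result()  # block until the load job finishes


if __name__ == "__main__":
    # Hypothetical bucket and table names, for illustration only.
    ingest(
        CsvGcsSource("gs://example-bucket/exports/orders-*.csv"),
        "example-project.analytics.orders",
    )

Adding a new source (JSON in GCS, a Cloud SQL export, a Pub/Sub-fed staging table) then means writing one more IngestionSource implementation rather than a new pipeline.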
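As a rough illustration of the CI/CD requirement, here is a sketch of triggering a Cloud Build run from Python, assuming the google-cloud-build client library. The project ID, staging bucket, and build steps are hypothetical stand-ins for whatever a real pipeline would run.

# A minimal sketch: submit a two-step Cloud Build (run tests, then
# deploy Composer DAGs with the gcloud builder). All names hypothetical.
from google.cloud.devtools import cloudbuild_v1


def run_pipeline_build(project_id: str) -> None:
    client = cloudbuild_v1.CloudBuildClient()

    build = cloudbuild_v1.Build(
        # Source tarball staged in GCS (hypothetical bucket/object);
        # a GitHub trigger would normally supply this instead.
        source=cloudbuild_v1.Source(
            storage_source=cloudbuild_v1.StorageSource(
                bucket="example-build-staging",
                object_="source.tgz",
            )
        ),
        steps=[
            # Run the test suite in a Python builder image.
            cloudbuild_v1.BuildStep(
                name="python:3.11",
                entrypoint="bash",
                args=["-c", "pip install -r requirements.txt && pytest"],
            ),
            # Deploy DAGs with the gcloud builder (Google Cloud SDK).
            cloudbuild_v1.BuildStep(
                name="gcr.io/cloud-builders/gcloud",
                args=[
                    "storage", "cp", "dags/*.py",
                    "gs://example-composer-bucket/dags/",
                ],
            ),
        ],
    )

    operation = client.create_build(project_id=project_id, build=build)
    result = operation.result()  # block until the build completes
    print(f"Build {result.id} finished with status {result.status.name}")


if __name__ == "__main__":
    run_pipeline_build("example-project")  # hypothetical project ID

In day-to-day use the same steps would live in a cloudbuild.yaml in the repository and run automatically from a GitHub trigger; the programmatic form above just shows the moving parts.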