Understands and applies principles of data strategy to business problems
Extracts data from identified databases
Creates data pipelines and transforms data to a relevant structure
Performs initial data quality checks on extracted data
Analyzes complex data elements, systems and relationships for data models
Writes code to develop required solutions and application features
Creates test cases to review the proposed solution design
Translates business problems to data solutions and provides recommendations to business stakeholders
Establishes and documents data governance projects
Requirements
6+ years of experience in Data Engineering
Well versed in Hadoop, Hive, Spark (using Scala), Kubernetes, cloud, API, and data lake concepts
Proven track record coding with at least one programming language (e.g., Java, Python)
Experienced with cloud computing platforms (e.g., GCP, Azure)
Skilled in data modelling and data migration protocols
Experience with Kafka Connect, Druid, BigQuery, and Looker is an added advantage
Experience with integration tools such as Automic and Airflow
Bachelor's degree in Computer Science and 3 years' experience in software engineering or a related field; or 5 years' experience in software engineering or a related field; or Master's degree in Computer Science and 1 year's experience in software engineering or a related field
2 years' experience in data engineering, database engineering, business intelligence, or business analytics