Techgene Solutions is a company focused on innovative technology solutions. It is seeking a Data/Java Developer who will be responsible for developing and maintaining data pipelines, using core Java and Big Data frameworks to analyze and transform data.
Responsibilities:
- Develop and maintain data pipelines in core Java, applying OOP principles, collections, multithreading, data structures, and exception handling
- Write unit tests with JUnit and manage builds and source control with Maven, GitHub, and the IntelliJ IDEA IDE
- Process large datasets with Big Data frameworks such as Apache Spark (preferred) or Hadoop
- Build solutions on Azure Databricks with Delta Lake, Spark Core, Azure Data Factory (ADF), and Unity Catalog
- Ingest and transform data using PySpark in Azure Databricks
- Analyze data, identify patterns, and produce data visualizations
- Design, build, and maintain robust Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines, applying sound data warehousing and database management practices
- Work with SQL and NoSQL databases (MySQL and Azure Cosmos DB preferred)
- Apply object-oriented design patterns and general software architecture principles
- Deliver workloads on cloud platforms such as Microsoft Azure (preferred) or AWS
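As a flavor of the pipeline work described above, here is a minimal core-Java sketch of an extract-transform-load step over in-memory data. It is illustrative only: the class name, sample records, and aggregation are hypothetical stand-ins, not part of the actual role.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch only: a tiny in-memory "extract, transform, load" step
// in core Java. The class name, fields, and sample data are hypothetical.
public class MiniEtl {
    // Extract: raw CSV-like lines (a real pipeline would read these from a source system)
    static final List<String> RAW = List.of("alice,3", "bob,5", "alice,2");

    // Transform + load: parse each line and sum counts per user into a map
    public static Map<String, Integer> run(List<String> lines) {
        return lines.stream()
                .map(line -> line.split(","))
                .collect(Collectors.toMap(
                        parts -> parts[0],                   // key: user name
                        parts -> Integer.parseInt(parts[1]), // value: count
                        Integer::sum));                      // merge duplicate keys by summing
    }
}
```

Calling `MiniEtl.run(MiniEtl.RAW)` aggregates the duplicate "alice" rows, yielding 5 for alice and 5 for bob; a production pipeline would apply the same shape of logic via Spark datasets rather than Java streams.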
Requirements:
- High proficiency in core Java, including OOP, collections, multithreading, data structures, and exception handling
- Proficiency with the JUnit testing framework, the Maven build tool, GitHub version control, and the IntelliJ IDEA IDE
- Knowledge of Azure Databricks, Delta Lake, Spark Core, Azure Data Factory (ADF), and Unity Catalog
- Experience analyzing data, finding data patterns, and visualizing data
- Ability to ingest and transform data using PySpark in Azure Databricks
- Understanding of key data warehousing and ETL/ELT processes, such as data pipelines and database management
- Ability to design, build, and maintain robust Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines
- Knowledge of object-oriented programming (OOP) concepts, design patterns, and general software architecture
- Proficiency with Big Data frameworks such as Apache Spark (preferred) or Hadoop
- Proficiency with SQL and NoSQL databases (MySQL and Azure Cosmos DB preferred)
- Knowledge of cloud platforms such as Microsoft Azure (preferred) or AWS
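The multithreading and exception-handling requirements above can be sketched with the standard-library `ExecutorService`. This is an illustrative example, not an interview answer key; the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative sketch only: parallel processing of work items with a fixed
// thread pool, plus the exception handling the role calls for. Names here
// are hypothetical, not part of the job description.
public class ParallelSquares {
    public static List<Integer> squares(List<Integer> inputs) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // One Callable per input; invokeAll blocks until all tasks finish
            List<Callable<Integer>> tasks = inputs.stream()
                    .map(n -> (Callable<Integer>) () -> n * n)
                    .toList();
            List<Integer> out = new ArrayList<>();
            for (Future<Integer> f : pool.invokeAll(tasks)) {
                out.add(f.get()); // ExecutionException here surfaces task failures
            }
            return out;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupt flag
            throw new IllegalStateException("interrupted while computing", e);
        } catch (ExecutionException e) {
            throw new IllegalStateException("parallel task failed", e);
        } finally {
            pool.shutdown(); // always release the pool's threads, even on error
        }
    }
}
```

Because `invokeAll` returns futures in the same order as the submitted tasks, the result list lines up with the input list even though the tasks run concurrently.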