Lorven Technologies Inc. is seeking a GCP Data Integration Engineer for a long-term project. The role involves designing and implementing data integration workflows, developing scalable ETL/ELT pipelines, and ensuring data quality and consistency.
Responsibilities:
- Design and implement data integration workflows between internal and external systems, including APIs, databases, SaaS applications, and cloud platforms
- Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data using tools like Informatica, Talend, SSIS, Apache NiFi, or custom Python/SQL scripts
- Ensure high data quality, accuracy, and consistency during data ingestion and transformation
- Implement data validation, cleansing, deduplication, and monitoring mechanisms
Requirements:
- Bachelor's degree in Computer Information Systems, Information Technology, Information Systems, Computer Science, or equivalent professional experience
- 4–8+ years of experience in data integration or data engineering, with strong ETL and SQL skills
- Strong experience with integration tools such as Informatica, Talend, MuleSoft, SSIS, or Boomi
- Proficiency in SQL, Python, and scripting for data manipulation and automation
- Experience with cloud data platforms, particularly GCP, and services such as Google Cloud Dataflow
- Familiarity with REST/SOAP APIs, JSON, XML, and flat file integrations
- Good experience with message queues or data streaming platforms (Kafka, RabbitMQ, Kinesis)
- Good understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery)
- Good knowledge of data security, privacy, and compliance best practices (HIPAA, GDPR, etc.)
- Demonstrated ability to work effectively in cross-functional groups and generate results
- Prior experience in industries like healthcare, fintech, or e-commerce is a plus
- Experience with state-of-the-art technology, architecture, design concepts, open-source operating systems, database systems, computer networking, and security