PALNAR is seeking a detail-oriented and technically skilled Data Integration Engineer to design, develop, and manage robust data integration solutions. The ideal candidate will play a key role in enabling seamless data flow between systems to support business intelligence, analytics, and operational needs.
Responsibilities:
- Design and implement data integration workflows between internal and external systems, including APIs, databases, SaaS applications, and cloud platforms
- Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data using tools like Informatica, Talend, SSIS, Apache NiFi, or custom Python/SQL scripts
- Build and manage real-time and batch data pipelines using technologies such as Kafka and Spark Streaming
- Ensure high data quality, accuracy, and consistency during data ingestion and transformation
- Implement data validation, cleansing, deduplication, and monitoring mechanisms
- Contribute to metadata management, data lineage, and data catalog initiatives
- Collaborate with data engineers, business analysts, data scientists, and application teams to understand integration needs and deliver effective solutions
- Troubleshoot and resolve data integration and pipeline issues in a timely manner
- Provide documentation and knowledge transfer for developed solutions
- Support data movement across hybrid environments (on-prem, cloud, third-party systems)
- Work with DevOps or platform teams to ensure scalability, security, and performance of data integration infrastructure
Requirements:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
- 4–8 years of experience in data integration or data engineering, with strong ETL and SQL skills
- Strong experience with integration tools such as Informatica, Talend, MuleSoft, SSIS, or Boomi
- Proficient in SQL, Python, and scripting for data manipulation and automation
- Experience with cloud data platforms, particularly GCP, and services such as Google Cloud Dataflow
- Familiarity with REST/SOAP APIs, JSON, XML, and flat file integrations
- Experience with message queues or data streaming platforms (Kafka, RabbitMQ, Kinesis)
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery)
- Knowledge of data security, privacy, and compliance best practices (HIPAA, GDPR, etc.)
- Prior experience in industries like healthcare, fintech, or e-commerce is a plus