Data Engineer – Streaming Platform at Voodoo | JobVerse
Data Engineer – Streaming Platform
Voodoo
France
Full Time
3 hours ago
No Sponsorship
Apply Now
Key skills
Apache
AWS
Cloud
Distributed Systems
Docker
Google Cloud Platform
Java
Kafka
Kubernetes
Pulsar
Python
Scala
Spark
Terraform
Data Engineering
GCP
Google Cloud
Helm
Kinesis
Pub/Sub
Event Streaming
CI/CD
About this role
Role Overview
Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks.
Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
Ensure data pipelines provide high-throughput, low-latency, fault-tolerant processing in real time.
Write clean, well-documented code in Java, Scala, or Python for distributed systems.
Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.
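To give candidates a concrete feel for the pipeline work above, here is a minimal, framework-agnostic sketch of the kind of windowed aggregation a streaming job performs over ad events (bid requests, impressions, clicks). It is illustrative only; in production this logic would run inside Flink or Spark Structured Streaming rather than plain Python, and the event shape shown is assumed, not taken from our systems.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per type inside fixed (tumbling) time windows.

    `events` is an iterable of (epoch_seconds, event_type) pairs.
    Returns {window_start_seconds: {event_type: count}}.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, event_type in events:
        # Align each event to the start of its window.
        window_start = int(ts // window_secs) * window_secs
        counts[window_start][event_type] += 1
    return {w: dict(c) for w, c in counts.items()}

events = [
    (0, "bid_request"), (15, "impression"), (42, "click"),
    (61, "bid_request"), (75, "bid_request"),
]
print(tumbling_window_counts(events))
# {0: {'bid_request': 1, 'impression': 1, 'click': 1}, 60: {'bid_request': 2}}
```

A real deployment would additionally handle late and out-of-order events via watermarks, which the frameworks named above provide out of the box.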
Requirements
3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.
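The schema-management requirement is about evolving event formats without breaking existing consumers. The sketch below mimics Avro-style backward compatibility in plain Python: a reader with a newer schema consumes a record written before a field existed by falling back to that field's default. The schema and field names are hypothetical examples, not our actual event definitions.

```python
def decode_with_schema(record, schema):
    """Decode a raw event dict against a schema with per-field defaults.

    `schema` maps field name -> default value; `...` (Ellipsis) marks a
    required field with no default. Raises KeyError if a required field
    is absent from the record.
    """
    out = {}
    for field, default in schema.items():
        if field in record:
            out[field] = record[field]
        elif default is not ...:
            out[field] = default  # backward-compatible fill-in
        else:
            raise KeyError(f"missing required field: {field}")
    return out

# v2 of the schema added an optional `country` field with a default.
SCHEMA_V2 = {"user_id": ..., "event": ..., "country": "unknown"}
old_record = {"user_id": 7, "event": "click"}  # written under v1
print(decode_with_schema(old_record, SCHEMA_V2))
# {'user_id': 7, 'event': 'click', 'country': 'unknown'}
```

With Avro or Protobuf, the same guarantee comes from the schema registry and generated code; the rule illustrated here (new fields must carry defaults) is what keeps old producers and new consumers compatible.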
Benefits
Competitive salary based on experience
Comprehensive relocation package (including visa support)
Swile lunch vouchers
Gymlib (100% borne by Voodoo)
Premium healthcare coverage (SideCare) for your family, 100% borne by Voodoo
Child day care facilities (Les Petits Chaperons rouges)
Wellness activities in our Paris office
Unlimited vacation policy
Remote days