Design, build, and maintain event streaming pipelines that ingest data from client systems, internal services, and third-party sources into the data platform
Develop and operate analytical databases and data models optimized for high-volume event data queries and low-latency access
Write production Elixir and Python services for event processing, transformation, and routing
Integrate legacy event pipelines with modern streaming infrastructure, designing migration paths that minimize risk and disruption to downstream consumers
Build and maintain monitoring, alerting, and observability tooling for event data systems, ensuring pipeline health, data freshness, and SLA compliance
Define and enforce event schemas, data contracts, and quality standards in partnership with producing and consuming teams
Collaborate with the data platform, product engineering, and analytics teams to understand data needs and deliver reliable event data products
Participate in system design reviews and help establish best practices for the Events Data team
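To illustrate the kind of event-processing and data-contract work described above, here is a minimal Python sketch of validating an event against a simple contract and routing failures to a dead-letter topic. The field names, contract, and topic naming are hypothetical, not part of this role's actual stack:

```python
# Hypothetical minimal data contract for a clickstream-style event.
REQUIRED_FIELDS = {"event_id": str, "event_type": str, "occurred_at": str}

def validate_event(event: dict) -> list[str]:
    """Return a list of contract violations; an empty list means the event passes."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"wrong type for {field}: expected {expected_type.__name__}")
    return errors

def route_event(event: dict) -> str:
    """Route valid events to a per-type topic; send invalid events to a dead-letter topic."""
    if validate_event(event):
        return "events.dead_letter"
    return f"events.{event['event_type']}"
```

In a real pipeline the same check would typically run inside a stream processor or broker consumer, with violations surfaced to the alerting tooling mentioned above.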
Requirements
6+ years of professional experience in data engineering or backend/systems engineering, with significant focus on event-driven and streaming data systems
Strong proficiency in Elixir and/or Python as a primary programming language for building application connectors, data services, and pipeline components
Advanced SQL skills for data modeling, query optimization, and analytical workloads
Hands-on experience with columnar/OLAP (Online Analytical Processing) databases at production scale
Experience with stream processing frameworks and message brokers such as Apache Flink, Kafka, Pulsar, or Kinesis; Flink experience is a strong plus
Demonstrated ability to integrate and migrate systems, bridging legacy and modern architectures
Proven track record of operationalizing data pipelines, including building monitoring, alerting, SLA dashboards, and runbooks for production systems
Experience designing and operating data systems on AWS; GCP experience is a plus
Strong collaboration and communication skills; comfortable leading design discussions, writing technical specs, and working across team boundaries
Experience with Infrastructure-as-Code (IaC) tools such as Terraform, CloudFormation, or similar
Experience with retail events data such as clickstream, purchase events, or product interaction data is a plus
Tech Stack
Apache Flink
AWS
Elixir
Google Cloud Platform
Kafka
Pulsar
Python
SQL
Terraform
Benefits
Full range of medical, financial, and/or other benefits