Design and evolve metering and analytics infrastructure spanning real-time analytics, long-term analysis, data transfer, governance, and retention policies
Collaborate with teams across the organization to ensure metering and analytics are correct and complete for their domains
Monitor and manage datasets of widely varying cardinality
Ensure data reliability through delivery guarantees, dead letter queues, validation, and anomaly detection
Design and enforce schema evolution strategies so the infrastructure can change without breaking downstream consumers
Optimize ClickHouse and blob storage for query performance and reliability
Requirements
Strong experience designing and operating data pipelines and distributed systems in production across dozens of global regions
Extensive experience with the Go programming language
Deep experience with columnar/analytical databases (ClickHouse, BigQuery, or similar)
Strong understanding of data correctness, delivery semantics, and schema compatibility
Prior experience building data-intensive SaaS applications with web-based analytics dashboards
Nice to have: Experience with streaming platforms (Kafka, Pulsar), Kubernetes, OpenTelemetry, protobuf/Avro schema registries, or usage-based billing/metering systems