ClickHouse is a fast-growing, privately held cloud company recognized on the 2025 Forbes Cloud 100 list. The role involves building data-intensive systems and ensuring the reliability of database integrations at petabyte scale, while collaborating with customers and internal teams to drive product innovation.
Responsibilities:
- Build data-intensive systems: design and develop high-throughput integrations with databases (Postgres, MySQL, MongoDB), data lakes (Iceberg, Delta Lake), and data warehouses (BigQuery, Snowflake)
- Handle edge cases in real-world production scenarios: unconventional database setups, data type internals, database upgrades/failovers, large transactions, etc.
- Design integration solutions to enable users to fully harness ClickHouse's performance and throughput
- Debug complex issues in production using runtime diagnostics (e.g. pprof, Parca) and observability tooling (metrics, logging, tracing)
- Build and improve infrastructure and tools to increase system reliability, reduce incident response time, and simplify/automate operations
- Write clear documentation, both public and internal
- Participate in on-call rotation
- Work directly with customers to understand integration requirements and discover gaps in the existing product
- Collaborate cross-functionally with internal teams to ensure operational efficiency
- Lead technical discussions and influence product roadmaps
Requirements:
- 5+ years of industry experience building data-intensive software solutions
- Proficient in Go, or experienced in systems programming with a willingness to ramp up quickly in Go
- Cloud-native experience deploying and operating services on at least one major cloud platform (AWS/GCP/Azure)
- Practical experience with Kubernetes
- Strong problem-solving and production debugging skills
- Clear communication, both written (design docs, code reviews) and verbal (technical discussions, customer calls, incident response)
- Experience with database replication technologies (CDC, logical replication)
- Experience with durable execution frameworks (e.g. Temporal)
- Experience with data formats and protocols (Avro, Parquet, Protobuf)
- Experience with modern data processing frameworks (e.g. Kafka, Spark, Flink)
- Experience maintaining or contributing to open-source software