Design, build, and operate our foundational data-heavy services: storage (cloud data warehouse, data lake), orchestration (Airflow), batch processing (Spark, SQL), streaming services (Kafka), query federation and caching, time-series databases, graph databases, and real-time event aggregation stores.
Build and maintain data integration & process SDKs for use by internal services and product teams throughout Coinbase.
Design and build self-service applications to empower our users to manage and troubleshoot their own data pipelines running on our platforms.
Design and build services for end-to-end data security and data observability: managing access controls across multiple storage and access layers, tracking data quality, cataloging datasets and their lineage, usage auditing.
Convert functional requests from data analysts, ML engineers, and security & compliance teams into reusable, scalable patterns, and assemble data microservices into data platforms for critical business verticals and user cohorts.
Requirements
You have at least 5 years of experience in software engineering.
You have strong Python, Go, or Java backend development skills.
You have general experience working with data systems or data pipelines.
You are familiar with design patterns such as scale-out, caching, key/value, and columnar.
You leverage SQL, Python, Airflow, and BI expertise to analyze data for operational insights.
You responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, continuously learn as these tools evolve, and apply human-in-the-loop practices to deliver business-ready outputs and drive measurable improvements in efficiency, cost, and quality.
Nice to have: crypto-forward experience, including familiarity with onchain activity such as interacting with Ethereum addresses, using ENS, and engaging with dApps or blockchain-based services.