Coinbase is on a mission to increase economic freedom in the world and is seeking a Senior Analytics Engineer to support its Finance Analytics team. This role involves building scalable data solutions that empower stakeholders to make data-driven decisions while ensuring the integrity and reliability of financial data.
Responsibilities:
- Quickly build subject matter expertise in a specific business area and data domain
- Understand data flows from creation through ingestion, transformation, and delivery
- Step into a new line of business and work with Engineering and Product partners to deliver its first data pipelines and insights
- Partner with engineering teams to close data gaps for downstream data users
- Take initiative and accountability for fixing issues anywhere in the stack
- Perform reconciliation-style validation across sources (internal systems and/or external statements/vendors), identifying discrepancies and driving fixes with upstream owners
- Interface with stakeholders on data and product teams to deliver the most commercial value from data (directly or indirectly)
- Build curated data models that streamline ledger verification and accounting workflows, helping finance teams accelerate time-to-close for new product launches
- Leverage deep understanding of the reconciliation engine alongside statistical and data expertise to propose engineering improvements that drive faster execution and higher match accuracy
- Work with PMs to tie together new x-PG and x-Product data into one holistic framework to optimize key financing product business metrics
- Collaborate cross-functionally with Finance/Accounting to translate requirements into durable data models, and with upstream engineering teams to improve source data contracts
- Use a variety of frameworks and paradigms to identify the best-fit tools to deliver value
- Develop new abstractions (e.g. UDFs, Python packages, dashboards) to support scalable data workflows/infra
- Stand up a framework for debugging AI skills/data apps internally, enabling non-technical stakeholders to quickly add value
- Use established tools with mastery (e.g. Google Sheets, SQL) to quickly deliver impact when speed is the top priority
- Implement strong data quality guarantees (tests, monitoring, alerting, SLAs) and partner with stakeholders to define "done" for financial correctness
- Improve reliability and operational excellence for critical pipelines (incident response, retro/action items, performance tuning, cost optimization)
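To make the reconciliation-style validation described above concrete for candidates: the core task is comparing per-account figures across two sources and flagging discrepancies for upstream owners. A minimal sketch in Python follows; all names, data, and the zero tolerance are illustrative assumptions, not Coinbase internals.

```python
from decimal import Decimal

def reconcile(internal: dict, external: dict,
              tolerance: Decimal = Decimal("0.00")) -> list:
    """Compare per-account balances from two sources.

    Returns a list of (account, internal_value, external_value, reason)
    tuples for any account missing from one side or differing by more
    than `tolerance`.
    """
    discrepancies = []
    for account in sorted(set(internal) | set(external)):
        left = internal.get(account)
        right = external.get(account)
        if left is None or right is None:
            discrepancies.append((account, left, right, "missing"))
        elif abs(left - right) > tolerance:
            discrepancies.append((account, left, right, "mismatch"))
    return discrepancies

# Hypothetical data: one matched account, one mismatch,
# one account absent from the external statement.
ledger = {"acct-1": Decimal("100.00"),
          "acct-2": Decimal("50.00"),
          "acct-3": Decimal("7.25")}
statement = {"acct-1": Decimal("100.00"),
             "acct-2": Decimal("49.90")}
issues = reconcile(ledger, statement)
```

In practice this logic would run inside a pipeline (e.g. a dbt test or an Airflow task) with monitoring and alerting on the discrepancy count, but the shape of the check is the same.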
Requirements:
- Strong understanding of best practices for designing modular and reusable data models (e.g., star schemas, snowflake schemas)
- Experience designing curated datasets for analytics/reporting with clear definitions and change management
- Proficiency in advanced SQL techniques for data transformation, querying, and optimization
- Expertise in scripting and automation, with experience in Object-Oriented Programming (OOP) and building scalable frameworks
- Strong ability to translate technical concepts into business value for cross-functional stakeholders
- Proven ability to manage projects and communicate effectively across teams
- Strong cross-functional communication skills and ability to work effectively with Finance/Accounting partners and navigate ambiguity
- Experience building, maintaining, and optimizing ETL/ELT pipelines, using modern tools like dbt, Airflow, or similar
- Experience orchestrating data workflows with Airflow (DAG design, scheduling patterns, backfills, operational ownership)
- Proficiency in building polished dashboards using tools like Looker, Tableau, Superset, or Python visualization libraries (Matplotlib, Plotly)
- Familiarity with version control (GitHub), CI/CD, and modern development workflows
- Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks
- Hands-on experience with Snowflake and/or Databricks in production environments
- Track record of building for correctness and reliability: data quality frameworks, monitoring/alerting, incident response, and stakeholder-facing SLAs
- Ability to understand and address business challenges through analytics engineering
- Familiarity with statistics and probability
- Expertise in prompt engineering and design for LLMs (e.g., GPT), including creating, refining, and optimizing prompts to improve response accuracy, relevance, and performance for internal tools and use cases
- Demonstrated ability to responsibly use generative AI tools and copilots (e.g., LibreChat, Gemini, Glean) in daily workflows, to continuously learn as tools evolve, and to apply human-in-the-loop practices that deliver business-ready outputs and drive measurable improvements in efficiency, cost, and quality
- Experience with financial reconciliation, controllership/accounting reporting, audit/SOX-style controls, or regulated environments
- Familiarity with ledger/event-based financial models and concepts like double-entry accounting
- Experience with streaming/event-driven systems (e.g., Kafka/Kinesis) and/or near-real-time data validation patterns
- Experience with table replication/synchronization patterns between lakehouse and warehouse environments
- Fintech/crypto domain experience
- Presence in New York City preferred
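For candidates less familiar with the ledger concepts above: double-entry accounting records every transaction as a set of postings whose debits and credits net to zero. A minimal, hypothetical sketch of that invariant (account names and amounts are made up for illustration):

```python
from decimal import Decimal

def entry_is_balanced(postings: list) -> bool:
    """In double-entry accounting, a journal entry is a list of
    (account, amount) postings; debits (positive) and credits
    (negative) must sum to exactly zero."""
    return sum((amount for _, amount in postings), Decimal("0")) == 0

# A balanced transfer of 25.00 out of cash and into trading.
transfer = [("cash", Decimal("-25.00")), ("trading", Decimal("25.00"))]

# An unbalanced entry that a ledger-verification check should reject.
bad_entry = [("cash", Decimal("-25.00")), ("trading", Decimal("24.00"))]
```

Curated data models for ledger verification typically enforce this kind of invariant as a data quality test over every journal entry.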