Build and own dbt models for the financial subledger platform (staging → intermediate → canonical facts/balances/marts → semantic), including naming conventions, macros, and reusable patterns.
Implement strong data quality and controls in dbt: unit, relationship, and assertion tests; freshness and anomaly checks; and automated reconciliations that support close and audit readiness.
Embed AI-assisted reconciliation capabilities into the platform (within appropriate security/controls guardrails): automate variance triage, suggest likely root causes, and generate human-reviewable reconciliation narratives and evidence artifacts.
Own end-to-end subledger data products (e.g., event/fact layers, balance rollforwards, reconciliation outputs) with traceability from source events through transformations to reporting outputs.
Partner with Accounting/Financial Reporting to translate requirements into clear model specifications (definitions, posting logic assumptions, tie-out rules) and ship them as durable dbt assets.
Drive production operational ownership: monitoring/alerting, incident response, root-cause fixes, and release hygiene for the pipelines and models you own.
Collaborate with upstream engineering teams to define inputs and improve source data quality via contracts and change management.
Coach and develop one Analytics Engineer (based in Poland) via code review, pairing, scoped ownership, and clear technical direction.
Requirements
Deep, hands-on analytics engineering experience: you have built and maintained production dbt projects (models, tests, macros, documentation, CI) and can own them end-to-end.
Strong SQL and data modeling skills; experience building canonical datasets that support finance reporting, reconciliations, or other correctness-critical outcomes.
Experience operating data pipelines in a warehouse environment (Snowflake preferred): performance tuning, cost awareness, and reliability practices.
Strong programming skills (Python preferred) for building data tooling and automation (e.g., control checks, reconciliation workflows, utilities, CLIs, and integrations).
Deep Snowflake technical expertise: architecture patterns (micro-partitioning/clustering, query optimization), security/governance (RBAC, masking policies), and operational excellence (monitoring, cost management, reliability).
Strong ownership mindset and the ability to lead ambiguous work through influence across Finance, Accounting, and Engineering stakeholders.
Nice to have
Experience with accounting concepts (subledgers, journal entries/posting logic, balance rollforwards, tie-outs, close processes).
Experience designing SOX-friendly controls and producing repeatable evidence from data pipelines.
Experience applying AI/LLMs to data quality or reconciliation workflows (e.g., anomaly explanation, automated investigation summaries), with strong governance and human-in-the-loop review.
AWS experience: S3/IAM, data platform primitives, and cloud architecture concepts that support secure, reliable data products.
Tech Stack
AWS
Cloud
Python
SQL
Benefits
Health care coverage
Affirm covers all premiums for all levels of coverage for you and your dependents
Flexible Spending Wallets
Generous stipends for Technology, Food, various Lifestyle needs, and family-forming expenses
Time off
Competitive vacation and holiday schedules that let you rest and recharge
ESPP
An employee stock purchase plan enabling you to buy shares of Affirm at a discount