Aurora Payments is a payments infrastructure company that empowers businesses through its ARISE platform. The Senior Data Engineer will own the build-out of a unified data platform: ingestion, data modeling, and data quality across multiple payment processors.
Responsibilities:
- Extend an existing Dataform project in BigQuery (staging models, data quality checks)
- Build reconciliation proofs against production data
- Stand up ingestion for a new transaction-level source (GCS to BigQuery, PCI scrubbing required)
- Own transaction-level lineage across multiple payment processors
- Build automated ingestion for processor files and APIs
- Design the journaling and audit layer (point-in-time recalculation, not just snapshots)
- Deliver the data contract consumed by the merchant-facing dashboard
- Build a processor-agnostic onboarding framework (we acquire companies; their data has to flow in)
- Evolve into tech lead as the team grows
Requirements:
- Data modeling depth. The top differentiator for this role. The domain has tricky grain — merchant-level aggregates that hide fee-category detail, mid-month adjustment conventions that silently break date filters, classification errors that affect 61% of rows. You should be able to look at a model and say 'this grain is wrong' before it ships. Dimensional modeling, slowly changing dimensions, knowing when to denormalize. Demonstrated in production, not in theory
- You ship. First deployed artifact within 30 days. We have a Dataform project scaffolded, BigQuery sources defined, a first data quality assessment complete, and data waiting. The work has started. Comfort with incomplete specs — documentation is being created in parallel with the build. Some answers only come from querying the data. Default to building, not waiting
- Advanced SQL + Dataform + BigQuery. SQL is the daily language: window functions, CTEs, cross-grain aggregation. Experience with Dataform and BigQuery, or comparable technologies, required. You understand materialization strategies, incremental models, and testing frameworks
- AI-native workflow. You work daily with Claude Code, Cursor, Copilot, or equivalent. Aurora leans heavily on AI (mostly Claude); we use it to write, test, debug, and explore. This is a two-person team where both members operate at 3-5x through AI leverage. AI fundamentally shapes how we work across the stack. A traditional workflow at traditional speed is a mismatch
- Financial data or payments domain experience (residuals, interchange, fee structures, commission calculations)
- Pipeline construction from raw files or APIs (schema drift, late-arriving data, file format inconsistencies)
- PCI-DSS awareness — you know why you hash before you load
- Python for light scripting
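To make the grain pitfall in the data-modeling requirement concrete, here is a minimal Python sketch (illustrative data and column names, not Aurora's actual schema) of the classic fan-out: joining a merchant-grain table onto fee-category-grain rows silently double-counts volume.

```python
# Fee-category grain: one row per (merchant, fee_category)
fees = [
    {"merchant_id": "m1", "fee_category": "interchange", "fee": 30.0},
    {"merchant_id": "m1", "fee_category": "assessment",  "fee": 5.0},
]
# Merchant grain: one row per merchant
volume = [{"merchant_id": "m1", "gross_volume": 1000.0}]

# Wrong grain: joining merchant-level volume onto fee-category rows
# repeats gross_volume once per fee category.
joined = [
    {**f, "gross_volume": v["gross_volume"]}
    for f in fees
    for v in volume
    if f["merchant_id"] == v["merchant_id"]
]
naive_volume = sum(r["gross_volume"] for r in joined)  # 2000.0: double-counted

# Correct: roll fees up to merchant grain first, then join 1:1.
fee_by_merchant = {}
for f in fees:
    key = f["merchant_id"]
    fee_by_merchant[key] = fee_by_merchant.get(key, 0.0) + f["fee"]

correct = [
    {"merchant_id": v["merchant_id"],
     "gross_volume": v["gross_volume"],
     "total_fees": fee_by_merchant.get(v["merchant_id"], 0.0)}
    for v in volume
]
```

Spotting this kind of mismatch before it ships ("this grain is wrong") is exactly what the role's top differentiator refers to.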
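On the PCI point ("you know why you hash before you load"), a minimal sketch of scrubbing a raw processor record before it lands in BigQuery. Field names and the secret-handling shortcut are hypothetical; a keyed HMAC is used rather than a bare hash so tokens cannot be reversed by brute-forcing the small PAN keyspace, while identical PANs still map to identical tokens, preserving joins and transaction-level lineage.

```python
import hashlib
import hmac

# Hypothetical per-environment secret; in practice, load from a secret manager,
# never hardcode.
PEPPER = b"example-secret-do-not-hardcode"

def tokenize_pan(pan: str) -> str:
    """Irreversibly tokenize a card number with HMAC-SHA256.

    Deterministic per PAN, so downstream models can still join and
    deduplicate on the token without ever seeing the raw value.
    """
    return hmac.new(PEPPER, pan.encode(), hashlib.sha256).hexdigest()

def scrub_row(row: dict) -> dict:
    """Scrub one raw processor record in flight: drop the PAN, keep a
    join-safe token and the non-sensitive last four digits."""
    pan = row.pop("card_number")       # raw PAN never reaches the warehouse
    row["card_token"] = tokenize_pan(pan)
    row["card_last4"] = pan[-4:]
    return row
```

A scrub step like this would sit between the GCS file read and the BigQuery load, so unmasked cardholder data is never persisted in the warehouse.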