People Architects is proud to partner with a high-growth SaaS company to recruit a Data Operations Engineer: a critical, hands-on role at the intersection of data, engineering, and operations. The role owns customer data onboarding and operational data workflows, turning messy customer data into reliable systems the engineering team can build on.
Responsibilities:
- Lead customer data onboarding, including mapping, cleansing, transforming, and importing data from competitor platforms, spreadsheets, and ad-hoc sources
- Build and maintain repeatable ingestion processes and scripts using Python, SQLAlchemy, and Postgres
- Partner with Customer Success Managers to define data requirements and onboarding timelines
- Translate messy, inconsistent customer data into clean internal schemas with accuracy and consistency
- Maintain a library of reusable migration utilities, validation scripts, and automation tools
- Own internal and external reporting requests requiring SQL or data extraction
- Perform one-time data cleanups, corrections, and backfills directly in the SaaS database
- Investigate data anomalies and support engineering with root-cause analysis
- Improve and maintain ETL pipelines to reduce manual engineering work
- Build lightweight automations to streamline recurring operational workflows
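To give a flavor of the day-to-day work, here is a minimal sketch of the kind of reusable migration utility described above: mapping a messy spreadsheet export onto a clean internal schema with Python and SQLAlchemy. The table, column names, and sample data are illustrative assumptions, and an in-memory SQLite engine stands in for the production Postgres database.

```python
import csv
import io

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Contact(Base):
    """Hypothetical internal schema the messy source data is mapped onto."""
    __tablename__ = "contacts"
    id = Column(Integer, primary_key=True)
    email = Column(String, nullable=False, unique=True)
    full_name = Column(String, nullable=False)

# Messy export standing in for a spreadsheet from a competitor platform:
# inconsistent headers, stray whitespace, a blank email, and a duplicate.
RAW_CSV = """Email ,Full Name
ALICE@EXAMPLE.COM , Alice Smith
bob@example.com,Bob Jones
,Missing Email
alice@example.com,Alice Smith (dupe)
"""

def clean_rows(raw: str):
    """Map, cleanse, and validate rows before import; yield clean dicts."""
    reader = csv.DictReader(io.StringIO(raw))
    # Normalize header names: "Email " -> "email", "Full Name" -> "full_name".
    reader.fieldnames = [f.strip().lower().replace(" ", "_") for f in reader.fieldnames]
    seen = set()
    for row in reader:
        email = (row.get("email") or "").strip().lower()
        name = (row.get("full_name") or "").strip()
        if not email or email in seen:  # validation: drop blanks and duplicates
            continue
        seen.add(email)
        yield {"email": email, "full_name": name}

engine = create_engine("sqlite:///:memory:")  # stand-in for Postgres
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add_all(Contact(**r) for r in clean_rows(RAW_CSV))
    session.commit()
    imported = session.query(Contact).count()

print(imported)  # 2 of the 4 source rows survive validation
```

In practice each mapping like this would live in the shared library of migration utilities, so the next onboarding reuses the cleansing and validation logic rather than rewriting it.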
Requirements:
- Strong SQL skills (Postgres preferred)
- Comfort working with large, messy Excel, Google Sheets, and CSV datasets
- Python proficiency (SQLAlchemy strongly preferred)
- Experience designing data transformations, mappings, and validations
- Solid understanding of ETL principles, automation, and scripting
- Ability to interpret data models and navigate relational schemas
- High attention to detail and a strong data quality mindset
- Clear communicator with both technical and non-technical partners
- Experience with Python-based migration or ETL frameworks
- Familiarity with SaaS data structures, multi-tenant databases, or systems like CRM, ATS, or LMS platforms
- Experience building reusable internal tools for data operations
- Exposure to Git and basic DevOps workflows
- Comfort troubleshooting and working in production-like environments
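As a concrete example of the anomaly-investigation and SQL skills listed above, the sketch below runs a data-quality check for orphaned rows (a common multi-tenant SaaS anomaly) via SQLAlchemy. The schema and data are invented for illustration, and an in-memory SQLite engine again stands in for Postgres.

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///:memory:")  # stand-in for the production database

# Hypothetical multi-tenant schema: users belong to accounts via account_id.
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text(
        "CREATE TABLE users (id INTEGER PRIMARY KEY, account_id INTEGER, email TEXT)"
    ))
    conn.execute(text("INSERT INTO accounts VALUES (1, 'Acme')"))
    conn.execute(text(
        "INSERT INTO users VALUES (1, 1, 'a@acme.test'), (2, 99, 'orphan@nowhere.test')"
    ))

# Anomaly check: users whose account_id points at an account that does not exist.
ORPHAN_SQL = text("""
    SELECT u.id, u.email
    FROM users u
    LEFT JOIN accounts a ON a.id = u.account_id
    WHERE a.id IS NULL
""")

with engine.connect() as conn:
    orphans = [tuple(row) for row in conn.execute(ORPHAN_SQL)]

print(orphans)  # flags user 2, whose account_id (99) matches no account
```

Checks like this can be collected into a validation script and run after each migration or backfill, turning one-off investigations into repeatable tooling.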