Establish and own a layered automated testing and validation framework across the data platform.
Ensure platform reliability, correctness, and scalability through stronger development-time quality controls.
Create the testing and validation foundation needed to reduce manual QA burden, improve release confidence, and mitigate business risk associated with public-facing and AI-enabled data consumption.
Focus on establishing a reliability baseline across core data products and public-facing integrations.
Share ongoing ownership of the data platform and drive its continuous improvement.
Design, implement, and maintain an automated and extensible testing framework for the data platform, including:
Unit testing for data transformations
Integration testing across datasets, models, and platform components
Regression testing for output consistency
Schema and data contract validation
Invariant testing for core business rules
End-to-end pipeline testing
Performance and scalability testing
AI prompt regression testing and AI evaluation testing
Telemetry validation for AI and data systems
Establish development-time safeguards to validate business logic and expected behavior prior to release.
Contribute to quality standards, release criteria, and reliability practices for core data products and public-facing integrations.
Partner with data engineering, analytics engineering, product, and AI/application teams to embed testing into development workflows.
Identify structural quality gaps and recommend scalable solutions.
Contribute to platform resilience planning, including disaster recovery and failover readiness.
Document testing standards, practices, and reusable frameworks for cross-team adoption.
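As a minimal sketch of the development-time safeguards described above — a unit test that validates a SQL transformation and checks its output schema against a data contract — the example below assumes an in-memory SQLite database; the table, column, and function names (orders, daily_revenue, build_daily_revenue) are hypothetical, not drawn from any specific platform:

```python
# Sketch of a unit test for a SQL transformation with a schema
# (data-contract) check. Assumes an in-memory SQLite database;
# all table and column names are hypothetical.
import sqlite3

EXPECTED_COLUMNS = ["order_date", "total_revenue"]  # the "data contract"

def build_daily_revenue(conn):
    # The transformation under test: aggregate order amounts by day.
    conn.execute("""
        CREATE TABLE daily_revenue AS
        SELECT order_date, SUM(amount) AS total_revenue
        FROM orders
        GROUP BY order_date
    """)

def test_daily_revenue():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_date TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)],
    )
    build_daily_revenue(conn)

    # Schema / data-contract validation: column names must match exactly.
    cols = [row[1] for row in conn.execute("PRAGMA table_info(daily_revenue)")]
    assert cols == EXPECTED_COLUMNS

    # Output validation: the aggregation logic behaves as expected.
    rows = dict(conn.execute("SELECT order_date, total_revenue FROM daily_revenue"))
    assert rows == {"2024-01-01": 15.0, "2024-01-02": 7.5}

test_daily_revenue()
```

In practice the same pattern runs in CI against the warehouse's SQL dialect, so a contract or logic regression fails the build before release rather than surfacing downstream.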
Requirements
5+ years of experience in Data Engineering, DevOps, MLOps, ML Engineering, Analytics Engineering, and/or Site Reliability Engineering.
Experience designing or implementing automated testing frameworks for data pipelines, ETL/ELT systems, analytics platforms, or similar environments.
Strong SQL skills and experience validating complex transformation logic.
Experience with modern data platforms, orchestration tools, and data quality practices.
Demonstrated ability to translate business rules into automated validations and testing coverage.
Experience partnering with cross-functional stakeholders to improve engineering quality and release confidence.
Experience with semantic-layer architectures and downstream BI or application consumption.
Experience supporting public-facing or customer-facing data products.
Familiarity with AI/LLM-enabled systems and related quality practices, including prompt regression testing or evaluation frameworks.
Experience with observability or monitoring tools for data systems.
Experience defining platform standards or leading quality initiatives across teams.
Familiarity with disaster recovery, resilience, and failover planning for data platforms.
Enthusiasm to contribute to Stand Together's vision and principled approach to solving problems, and a commitment to stewarding our culture, which champions values including transformation and innovation, entrepreneurialism, humility, and respect.
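To illustrate the "translate business rules into automated validations" requirement above, here is a minimal sketch of an invariant check, again assuming an in-memory SQLite database; the rule and all names (invoices, check_discount_invariant) are hypothetical examples:

```python
# Sketch of turning a business rule into an automated validation.
# Assumes an in-memory SQLite database; the rule and table name
# (invoices) are hypothetical.
import sqlite3

# Business rule: a discount must be non-negative and may never
# exceed the list price.
INVARIANT_SQL = """
    SELECT id, list_price, discount
    FROM invoices
    WHERE discount > list_price OR discount < 0
"""

def check_discount_invariant(conn):
    # Returns the rows that violate the rule; an empty list means
    # the invariant holds for the whole table.
    return list(conn.execute(INVARIANT_SQL))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER, list_price REAL, discount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?, ?)",
    [(1, 100.0, 10.0), (2, 50.0, 0.0)],
)
violations = check_discount_invariant(conn)
assert violations == []  # release gate: any violating row fails the check
```

The same query-for-violations pattern scales to a catalog of rules evaluated on every pipeline run, with non-empty results blocking release or paging the owning team.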
Tech Stack
ETL
SQL
Benefits
Competitive benefits: Enjoy a 6% 401(k) match with immediate vesting, flexible time off, comprehensive health and dental plans, plus wellness and mental health support through Peloton and Talkspace.
A meaningful career: Join a passionate community of over 1,300 employees dedicated to improving lives and driving innovative solutions to complex social challenges.
Commitment to growth: Thrive in a non-hierarchical environment that empowers employees to discover, develop, and apply their unique talents.
Competitive compensation: Our approach rewards the value you create through competitive salaries and bonus opportunities.