TEKsystems is a leading provider of business and technology services, and they are seeking a Data Quality Engineering and Operations Manager to lead the design, delivery, and operation of enterprise data quality capabilities. This role involves overseeing data quality engineers, managing data quality operations, and ensuring data accuracy and trust across various platforms.
Responsibilities:
- Oversee Data Quality Engineers and own the data quality operating model built on Monte Carlo
- Define, deploy, and continuously improve:
  - Data quality monitoring at source and across lineage
  - Scoring aligned to key corporate metrics
  - Routine data quality scorecards
- Establish and manage issue management processes:
  - Centralized issue tracking
  - SLAs for remediation
- Partner with analytics, engineering, and business stakeholders to:
  - Prioritize quality monitoring
  - Address data defects impacting business outcomes
- Own the enterprise data quality strategy, roadmap, and backlog aligned to data governance objectives and business priorities
- Define success metrics for data quality, including coverage, incident reduction, SLA performance, analytics trust, and AI impact, supported by well-documented and enforceable policies, standards, and procedures
- Drive adoption and value realization of data quality policies and standards through Monte Carlo, ensuring the platform is used consistently and effectively across domains
- Translate business, governance, analytics, and AI requirements into actionable data quality rules, thresholds, and monitoring
- Configure and operationalize Monte Carlo to monitor data freshness, volume, distribution, schema changes, and anomalies
- Ensure data quality controls are implemented across:
  - Source and operational datasets
  - Curated analytics and semantic layers
  - AI training, feature, and inference pipelines
- Own day-to-day data quality operations, including alert triage, root cause analysis, and remediation coordination
- Establish and operationalize data quality standards for:
  - Critical data elements (CDEs) used in decision-making
  - Management and regulatory reporting datasets
  - Enterprise metrics, KPIs, and dashboards
- Use Monte Carlo observability signals to proactively identify upstream issues impacting reports and analytics
- Improve trust and adoption of analytics through transparent quality metrics and reporting
- Establish and operationalize data quality standards for AI and ML use cases, including:
  - Training and validation data completeness and representativeness
  - Label accuracy and consistency
  - Schema, volume, and distribution drift detection
  - Bias, outlier, and feature stability monitoring
- Partner with data science teams to identify AI-critical datasets and features
- Use Monte Carlo monitoring and anomaly detection to identify data issues that could impact model performance or reliability
- Manage and mentor Data Quality Engineers responsible for rule development, monitoring, and issue analysis
- Collaborate with Data Engineering, Analytics, Data Science, Privacy, and Business Data Owners
- Communicate data quality health, trends, and risks to governance and executive stakeholders
Requirements:
- 7+ years of experience in data, analytics, or data management roles with a strong focus on data quality
- 3+ years in a people-leadership role supporting data or analytics platforms
- Hands-on experience implementing or operating Monte Carlo or similar data observability platforms
- Strong understanding of data quality dimensions across operational, analytical, and AI use cases
- Experience working with modern data platforms (cloud data warehouses/lakehouses, ETL/ELT pipelines, BI tools)
- Data Governance
- Data Quality
- Data Analytics
- Data Quality Management
- Data Quality Frameworks
- Monte Carlo
- Databricks
- Data Observability
- AI/ML
- Experience working within a formal Data Governance organization
- Familiarity with data observability, anomaly detection, and data drift concepts
- Experience supporting AI/ML or advanced analytics use cases
- Background in regulated industries