Design, build, and maintain data pipelines and architectures to support data ingestion, transformation, integration, storage, and dissemination.
Apply software engineering and ETL principles to ensure data accuracy, quality, consistency, and scalability.
Integrate COTS and customer-developed tools into existing data frameworks to meet operational and analytical requirements.
Collaborate with DataOps teams to prepare, automate, and optimize data workflows for real-time analytics.
Implement and enforce data security policies, including data encryption and access controls.
Monitor data quality and implement proactive alerting on data pipelines.
Develop and maintain documentation for data pipelines, models, and governance policies.
Provide Tier-2 and Tier-3 support for enterprise data products and services.
Conduct root cause analysis for recurring issues and implement corrective actions to prevent recurrence.
Develop and maintain training materials and a centralized knowledge repository for data operations.
Proactively communicate with customers regarding known issues and new features.
Collect customer feedback to identify areas for improvement and enhance customer satisfaction.
Foster a collaborative team environment that encourages innovation and continuous improvement.
Ensure compliance with all applicable regulations related to data privacy and security.
Requirements
Active Top Secret (TS) clearance with SCI eligibility
Bachelor’s degree in Computer Science, Data Science, Engineering, Information Systems, or a related technical discipline and 8–12 years of relevant experience, OR a Master’s degree in a related field and 6–10 years of relevant experience
Minimum of 8 years of experience in data engineering or related roles
Proven experience in designing and implementing data pipelines and architectures
Experience with data quality monitoring tools and processes
Excellent communication and interpersonal skills
Experience developing and maintaining enterprise-scale data pipelines in cloud environments (AWS, Azure, or GCP)
Experience implementing ETL/ELT processes, data orchestration frameworks, and data integration techniques
Experience working with structured and unstructured data sources, including APIs, streaming platforms, and relational/non-relational databases
Experience integrating data pipelines into DevSecOps CI/CD environments