Create, sustain, and troubleshoot complex operational data flows, including data storage, data transport, data management, data security, data compliance, and knowledge store management.
Work with the mission customer to perform exploratory data analysis on raw data, cleaning, enriching, transforming, and converting it into the required formats.
Devise methods to improve existing operational data flow processing, distribution, and reliability.
Requirements
Minimum of five (5) years' experience as a DevOps/Systems Engineer on programs and contracts of similar scope, type, and complexity is required
Bachelor's Degree in Systems Engineering, Computer Science, Information Systems, Engineering Science, Engineering Management, or a related discipline
Five (5) years of additional Systems Engineering experience may be substituted for a Bachelor's degree.
Must be authorized to work in the US
Active TS/SCI with appropriate-level polygraph required
Experience with AWS (S3, VPCs & Networking, EC2, ECS/EKS)
Experience with containerization (Docker, Kubernetes, Registries)
Experience with IaC (Terraform/CloudFormation)
Experience with CI/CD (Jenkins/GitLab/GitHub Actions)
Experience creating, managing, and troubleshooting complex operational data flows
Experience with Corporate data flow processes and tools
Experience with Corporate data security and compliance procedures and policies
Experience with the Atlassian Tool Suite (JIRA, Confluence, Bitbucket)