EPAM Georgia is a team of innovators united by a passion for technology, seeking a Senior Data DevOps Engineer to support the development and operationalization of an Enterprise Data Platform for a leading oil and gas company. The role focuses on implementing and optimizing an on-premises data platform, integrating advanced technologies for data ingestion, processing, and analytics while emphasizing automation and operational excellence.
Responsibilities:
- Install and configure platform components, ensuring seamless integration with the EDP stack
- Set up and manage RBAC (Role-Based Access Control) to enforce security best practices
- Design, implement, and maintain CI/CD pipelines (e.g., GitLab CI) for automated build, test, and deployment of platform components and data workflows
- Integrate Infrastructure as Code (IaC) tools (e.g., Terraform) into pipelines for repeatable, auditable deployments
- Deploy and manage logging and monitoring solutions using the LGTM stack (Loki, Grafana, Tempo, Mimir)
- Build and configure a centralized management console for the EDP
- Establish and support multi-tenancy for secure independent environments across teams and locations
- Automate deployment pipelines for data ingestion, transformation, and querying frameworks
- Manage infrastructure for scalability, high availability, and reliability using Kubernetes and Red Hat Enterprise Linux
- Implement and enforce security policies with HashiCorp Vault and Open Policy Agent (OPA)
- Proactively monitor, troubleshoot, and optimize platform components for high performance
- Collaborate with Data Engineering and Platform teams to streamline releases and promote continuous delivery
- Work closely with the customer's technical team to align goals and resolve issues
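To make the CI/CD and IaC responsibilities above concrete: a minimal GitLab CI pipeline that validates, plans, and applies Terraform could look like the sketch below. All names here (stage names, image tag, artifact path) are illustrative placeholders, not details of the actual EDP setup:

```yaml
# Hypothetical .gitlab-ci.yml sketch: Terraform validate/plan/apply stages.
stages:
  - validate
  - plan
  - apply

default:
  image: hashicorp/terraform:1.7   # assumed image tag

terraform-validate:
  stage: validate
  script:
    - terraform init -backend=false
    - terraform validate

terraform-plan:
  stage: plan
  script:
    - terraform init
    - terraform plan -out=planfile
  artifacts:
    paths:
      - planfile          # hand the saved plan to the apply stage

terraform-apply:
  stage: apply
  script:
    - terraform init
    - terraform apply -auto-approve planfile
  when: manual            # keep production applies gated behind a manual step
  environment: production
```

Applying a saved plan file (rather than re-planning at apply time) is one common way to keep deployments auditable: the exact change set reviewed at the plan stage is what gets applied.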
Requirements:
- Proven experience as a Data DevOps Engineer or similar role in enterprise data platforms
- Hands-on expertise with Kubernetes and Red Hat Enterprise Linux for infrastructure management
- Strong background in CI/CD pipeline design and automation (preferably GitLab CI)
- Proficiency with Infrastructure as Code tools (e.g., Terraform)
- Experience implementing RBAC security policies and secrets management (HashiCorp Vault, OPA)
- Familiarity with logging and monitoring stacks (LGTM or similar)
- Solid understanding of multi-tenancy and centralized management in data platforms
- Excellent troubleshooting and problem-solving skills
- High reliability, self-sufficiency, and the ability to work independently
- Strong communication skills for effective collaboration with technical teams and stakeholders
- Experience with Apache Kafka, Apache Spark (including Spark Streaming), MinIO, Apache Iceberg, PostgreSQL, and Trino
- Prior involvement in building or optimizing on-premises enterprise data platforms
- Knowledge of distributed systems and big data architectures
- Familiarity with S3-compatible object storage solutions
- Experience supporting multi-location or multi-team environments
- Certifications in Kubernetes, Red Hat, or relevant DevOps technologies