First Citizens Bank is seeking an experienced DevOps Engineer to design, build, and maintain CI/CD pipelines and infrastructure automation for its data engineering platform. The role sits within cloud operations and focuses on enabling data engineers to deploy reliably and rapidly across AWS and Azure environments.
Responsibilities:
- Design and implement robust CI/CD pipelines using Azure DevOps or GitLab; automate build, test, and deployment processes for data applications, dbt Cloud jobs, and infrastructure changes
- Build deployment orchestration for multi-environment (dev, qa, uat, production) workflows with approval gates, rollback mechanisms, and artifact management
- Implement GitOps practices for infrastructure and application deployments; maintain version control and audit trails for all changes
- Optimize pipeline performance, reduce deployment times, and enable fast feedback loops for rapid iteration
- Design and manage Snowflake, AWS, and Azure infrastructure using Terraform; ensure modularity, reusability, and consistency across environments
- Provision and manage cloud resources (compute, storage, networking, identity) across AWS and Azure
- Implement tagging strategies and resource governance; maintain Terraform state management and implement remote state backends
- Support multi-cloud architecture patterns and ensure portability between AWS and Azure where applicable
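By way of illustration for the Terraform duties above, here is a minimal Python sketch of a per-environment plan/apply wrapper. The module path and the workspace-per-environment convention are assumptions for the example, not requirements from this posting, and the terraform CLI is assumed to be on PATH:

```python
"""Minimal sketch: wrapping per-environment Terraform runs from Python."""
import subprocess

MODULE_DIR = "infra/snowflake"  # hypothetical root module path


def plan_and_apply(environment: str) -> None:
    def tf(*args: str) -> None:
        # Run a terraform subcommand inside the module directory; fail fast on error.
        subprocess.run(["terraform", *args], cwd=MODULE_DIR, check=True)

    tf("init", "-input=false")                 # configures the remote state backend
    tf("workspace", "select", environment)     # assumes one existing workspace per env
    tf("plan", "-input=false", "-out=tfplan")  # saved plan doubles as an audit artifact
    tf("apply", "-input=false", "tfplan")      # apply exactly what was planned


if __name__ == "__main__":
    plan_and_apply("qa")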
- Deploy and manage Ansible playbooks for configuration management, patching, and infrastructure orchestration across cloud environments
- Utilize Puppet for infrastructure configuration, state management, and compliance enforcement; maintain Puppet modules and manifests for reproducible environments
- Automate VM provisioning, OS hardening, and application stack deployment; reduce manual configuration and ensure environment consistency
- Build automation for scaling, failover, and disaster recovery procedures
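For the configuration-management duties above, a minimal sketch of driving an Ansible playbook from Python, dry-running with --check before applying. The playbook and inventory paths are hypothetical placeholders, and the ansible-playbook CLI is assumed to be installed:

```python
"""Minimal sketch: check-then-apply execution of an Ansible playbook."""
import subprocess
import sys

PLAYBOOK = "site.yml"           # hypothetical playbook (e.g., OS hardening)
INVENTORY = "inventories/qa"    # hypothetical inventory for the qa environment


def run_playbook(check_first: bool = True) -> None:
    base = ["ansible-playbook", "-i", INVENTORY, PLAYBOOK]
    if check_first:
        # --check reports what would change without changing anything.
        subprocess.run(base + ["--check"], check=True)
    # Apply for real only if the dry run exited cleanly.
    subprocess.run(base, check=True)


if __name__ == "__main__":
    try:
        run_playbook()
    except subprocess.CalledProcessError as exc:
        sys.exit(exc.returncode)
```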
- Automate Snowflake provisioning, warehouse sizing, and cluster management via Terraform; integrate Snowflake with CI/CD pipelines
- Implement Infrastructure as Code patterns for Snowflake roles, permissions, databases, and schema management
- Build automated deployment workflows for dbt Cloud jobs and Snowflake objects; integrate version control with Snowflake changes
- Monitor Snowflake resource utilization, costs, and performance; implement auto-suspend/auto-resume policies and scaling strategies
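As one illustration of the Snowflake cost controls above, a minimal sketch that enforces an auto-suspend/auto-resume policy across all warehouses via snowflake-connector-python. The credentials, the SYSADMIN role, and the 60-second threshold are assumptions; SHOW WAREHOUSES and ALTER WAREHOUSE are standard Snowflake SQL:

```python
"""Minimal sketch: enforce auto-suspend/auto-resume on every warehouse."""
import os
import snowflake.connector

AUTO_SUSPEND_SECONDS = 60  # assumed policy: suspend after one idle minute

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder env vars
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SYSADMIN",                           # assumed admin role
)
cur = conn.cursor()
try:
    cur.execute("SHOW WAREHOUSES")
    names = [row[0] for row in cur.fetchall()]  # first column is the warehouse name
    for name in names:
        cur.execute(
            f'ALTER WAREHOUSE "{name}" SET '
            f"AUTO_SUSPEND = {AUTO_SUSPEND_SECONDS} AUTO_RESUME = TRUE"
        )
finally:
    cur.close()
    conn.close()
```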
- Develop Python scripts and tools for infrastructure automation, cloud operations, and deployment workflows
- Build custom integrations between CI/CD systems, cloud platforms, and Snowflake; create monitoring and alerting automation
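For the CI/CD-to-dbt integration work above, a minimal sketch that triggers a dbt Cloud job through its public v2 REST API from a pipeline stage. The account ID, job ID, and token are hypothetical values read from the environment:

```python
"""Minimal sketch: trigger a dbt Cloud job run from a CI/CD stage."""
import os
import requests

ACCOUNT_ID = os.environ["DBT_CLOUD_ACCOUNT_ID"]  # placeholder
JOB_ID = os.environ["DBT_CLOUD_JOB_ID"]          # placeholder
TOKEN = os.environ["DBT_CLOUD_API_TOKEN"]        # placeholder service token

url = f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/"
resp = requests.post(
    url,
    headers={"Authorization": f"Token {TOKEN}"},
    json={"cause": "Triggered by CI/CD pipeline"},  # the API requires a cause
    timeout=30,
)
resp.raise_for_status()
run_id = resp.json()["data"]["id"]
print(f"dbt Cloud run started: {run_id}")
```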
- Integrate monitoring and logging solutions (Splunk, Dynatrace, CloudWatch, Azure Monitor) into CI/CD and infrastructure stacks
- Build automated alerting for infrastructure health, deployment failures, and performance degradation
- Implement centralized logging for applications, infrastructure, and cloud audit trails; maintain log retention and compliance requirements
- Create dashboards and metrics for infrastructure utilization, deployment frequency, and change failure rates
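As a sketch of the deployment-frequency and change-failure-rate metrics above, one approach is publishing custom CloudWatch datapoints with boto3; the namespace and dimension values are assumptions, and in practice a pipeline stage would call this after each deployment attempt:

```python
"""Minimal sketch: emit deployment metrics to CloudWatch via boto3."""
import boto3

cloudwatch = boto3.client("cloudwatch")  # uses ambient AWS credentials


def record_deployment(environment: str, succeeded: bool) -> None:
    """One datapoint per deployment; failures feed a change-failure-rate dashboard."""
    cloudwatch.put_metric_data(
        Namespace="DataPlatform/Deployments",  # assumed namespace
        MetricData=[
            {
                "MetricName": "DeploymentCount",
                "Dimensions": [{"Name": "Environment", "Value": environment}],
                "Value": 1,
                "Unit": "Count",
            },
            {
                "MetricName": "DeploymentFailures",
                "Dimensions": [{"Name": "Environment", "Value": environment}],
                "Value": 0 if succeeded else 1,
                "Unit": "Count",
            },
        ],
    )


if __name__ == "__main__":
    record_deployment("qa", succeeded=True)
```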
- Support deployment of data processing jobs, Airflow DAGs, and dbt Cloud transformations through automated pipelines
- Implement blue-green or canary deployment patterns for zero-downtime updates to data applications
- Build artifact management workflows (Docker images, Python packages, dbt artifacts); integrate with Artifactory or cloud registries
- Collaborate with data engineers on deployment best practices and production readiness reviews
- Design backup and disaster recovery strategies for data infrastructure; automate backup provisioning and testing
- Implement infrastructure redundancy and failover automation using AWS/Azure native services
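For the backup-testing automation above, a minimal boto3 sketch that takes an RDS snapshot and blocks until it is available. The instance identifier is a hypothetical placeholder, and a full DR test would also restore the snapshot into a scratch instance and validate it:

```python
"""Minimal sketch: take and wait on an RDS snapshot for a DR test."""
from datetime import datetime, timezone
import boto3

rds = boto3.client("rds")
DB_INSTANCE = "data-platform-prod"  # hypothetical instance identifier


def take_snapshot() -> str:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    snapshot_id = f"{DB_INSTANCE}-dr-test-{stamp}"
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snapshot_id,
        DBInstanceIdentifier=DB_INSTANCE,
    )
    # Block until the snapshot is available so the DR test can proceed.
    waiter = rds.get_waiter("db_snapshot_available")
    waiter.wait(DBSnapshotIdentifier=snapshot_id)
    return snapshot_id


if __name__ == "__main__":
    print(f"Snapshot ready: {take_snapshot()}")
```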
- Maintain comprehensive documentation for infrastructure architecture, CI/CD workflows, and operational procedures
- Create runbooks and troubleshooting guides for common issues; document infrastructure changes and design decisions
- Establish DevOps best practices and standards; share knowledge through documentation, lunch-and-learns, and mentoring
Requirements:
- Bachelor's degree and 4 years of experience in data engineering, big data technologies, or cloud platforms, OR high school diploma or GED and 8 years of experience in data engineering, big data technologies, or cloud platforms
- CI/CD tools: Azure DevOps Pipelines or GitLab CI/CD (hands-on pipeline development)
- Infrastructure as Code: Terraform (AWS and Azure providers) — production-grade experience
- Configuration Management: Ansible and/or Puppet — ability to write playbooks/manifests and manage infrastructure state
- Cloud platforms: AWS (EC2, S3, RDS, VPC, IAM, Lambda, Glue, Lake Formation) and Azure (VMs, App Services, Blob Storage, Cosmos DB, networking)
- Python programming: scripting, automation, API integration, and tooling development
- Snowflake: operational knowledge of warehouse management, cost optimization, and cloud integration
- Git/GitLab/GitHub: version control, branching strategies, and repository management
- Linux/Unix system administration and command-line proficiency
- Networking fundamentals: VPCs, subnets, security groups, DNS, load balancing
- Scripting languages: Bash, Python, or similar for automation
- 5+ years in DevOps, Platform Engineering, or Infrastructure Engineering
- 3+ years hands-on with Terraform and Infrastructure as Code
- 3+ years with CI/CD tools (Jenkins, GitLab CI, Azure DevOps, or similar)
- 2+ years with configuration management tools (Ansible, Puppet, or similar)
- 2+ years supporting cloud platforms (AWS and/or Azure in production)
- 1+ years with Python automation and scripting
- Experience supporting or integrating with Snowflake or modern data warehouses
- Experience in banking or financial services is a plus
- Must hold one or more certifications in relevant technology fields