Rackspace Technology is a multicloud solutions expert that delivers end-to-end solutions for its customers. It is seeking a Senior Big Data Engineer to design and develop scalable batch processing systems and manage complex data workflows using technologies such as Hadoop, Oozie, and GCP tools.
Responsibilities:
- Design and develop scalable batch processing systems using technologies like Hadoop, Oozie, Pig, Hive, MapReduce, and HBase, with hands-on coding in Java or Python (Java is a must)
- Lead Jira Epics and drive them to completion
- Write clean, efficient, and production-ready code with a strong focus on data structures and algorithmic problem-solving applied to real-world data engineering tasks
- Develop, manage, and optimize complex data workflows within the Apache Hadoop ecosystem, with a strong focus on Oozie orchestration and job scheduling
- Leverage Google Cloud Platform (GCP) tools such as Dataproc, GCS, and Composer to build scalable and cloud-native big data solutions
- Implement DevOps and automation best practices, including CI/CD pipelines, infrastructure as code (IaC), and performance tuning across distributed systems
- Collaborate with cross-functional teams to ensure data pipeline reliability, code quality, and operational excellence in a remote-first environment
Requirements:
- Bachelor's degree in Computer Science, Software Engineering, or a related field of study
- Experience with managed cloud services and a solid understanding of cloud-based batch processing systems are critical
- Ability to lead Jira Epics is a must
- Proficiency in Oozie, Airflow, MapReduce, and Java is a must
- Strong programming skills in Java (particularly with Spark), Python, Pig, and SQL
- Expertise in public cloud services, particularly in GCP
- Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce
- Familiarity with Bigtable and Redis
- Experienced in applying infrastructure and DevOps principles in daily work, including the use of continuous integration and continuous deployment (CI/CD) tooling and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes
- Proven experience in engineering batch processing systems at scale
- 5+ years of experience in customer-facing software/technology or consulting
- 5+ years of experience with 'on-premises to cloud' migrations or IT transformations
- 5+ years of experience building and operating solutions on GCP