Federal Express Corporation is seeking a Data Engineer Advisor who will play a pivotal role in driving engineering innovation within Dataworks. The role involves collaborating across multi-disciplinary teams to design, build, and maintain scalable data pipelines, while also providing technical mentorship and optimizing data platform performance.
Responsibilities:
- Design, build, and deploy highly scalable, fault-tolerant batch and real-time streaming data pipelines using core GCP services such as Dataflow, Dataproc, and Pub/Sub, along with Confluent Kafka
- Develop complex ELT/ETL workflows and implement efficient physical data models in BigQuery, focusing on query optimization, partitioning, and clustering for high-performance analytics
- Serve as a technical authority on the team, guiding junior engineers, conducting rigorous code reviews, and establishing standard methodologies for robust data platform development
- Implement Infrastructure as Code (IaC) and robust CI/CD pipelines to automate deployments, orchestrating complex data workflows with Cloud Composer (Airflow) and Terraform
- Partner closely with Data Architects, IT stakeholders, and business teams to translate complex architectural designs and use cases into actionable, production-ready engineering tasks
- Proactively monitor data platform health, troubleshoot performance bottlenecks, resolve data ingestion failures, and optimize GCP resource utilization to ensure maximum efficiency and cost-effectiveness
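To illustrate the BigQuery partitioning and clustering work described above, here is a minimal sketch of composing a time-partitioned, clustered table DDL. The dataset, table, and column names are hypothetical, not actual FedEx schemas:

```python
# Hypothetical illustration of BigQuery partitioning and clustering:
# compose a CREATE TABLE statement that partitions by day and clusters
# on low-cardinality filter columns to reduce bytes scanned per query.

def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery DDL string with daily partitioning and clustering."""
    return (
        f"CREATE TABLE `{table}`\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}\n"
        "AS SELECT * FROM source_events"  # hypothetical source table
    )

ddl = partitioned_table_ddl(
    "analytics.shipment_events",      # hypothetical dataset.table
    "event_ts",                       # TIMESTAMP column to partition on
    ["origin_hub", "service_level"],  # common filter columns for clustering
)
print(ddl)
```

Partitioning prunes whole date partitions at query time, while clustering sorts data within each partition so filters on the clustered columns scan fewer blocks.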
Requirements:
- Bachelor's Degree in Information Systems, Computer Science, or a quantitative discipline such as Mathematics or Engineering and/or equivalent formal training or work experience
- Five to seven (5-7) years of equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development, and/or predictive analytics
- Extensive knowledge of data engineering and machine learning frameworks, including the design, development, and implementation of highly complex systems and data pipelines
- Extensive knowledge of Information Systems, including the design, development, and implementation of large batch or online transaction-based systems
- Strong understanding of the transportation industry, competitors, and evolving technologies
- Experience providing leadership in a general planning or consulting setting
- Experience as a leader or a senior member of multi-function project teams
- Strong oral and written communication skills
- Expert-level proficiency in building scalable, fault-tolerant batch and streaming pipelines using BigQuery, Dataflow, Dataproc, Confluent Kafka, and Pub/Sub
- Deep technical expertise in Python, SQL, and Cloud Composer (Airflow) for complex data transformation, ELT/ETL, and workflow automation
- Strong hands-on experience with Terraform, CI/CD pipelines, and platform performance tuning to deploy and manage high-availability cloud infrastructure
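Cloud Composer (Airflow) schedules work by resolving the dependency graph a DAG declares. As a simplified, stdlib-only illustration of that idea (not the Airflow API), the hypothetical pipeline tasks below are ordered so every task runs after its upstream dependencies:

```python
# Simplified sketch of DAG-based orchestration using only the standard
# library. Task names are hypothetical; in Cloud Composer the same
# dependencies would be declared between Airflow operators.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (its upstream tasks).
deps = {
    "extract_orders": set(),
    "extract_shipments": set(),
    "transform_join": {"extract_orders", "extract_shipments"},
    "load_bigquery": {"transform_join"},
    "data_quality_check": {"load_bigquery"},
}

# static_order() yields a valid execution order: upstream tasks first.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Airflow applies the same principle at scale, adding scheduling, retries, and backfills on top of the dependency resolution shown here.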