The Role
Bridge the gap between robust data architecture and cutting-edge intelligence.
Own the end-to-end lifecycle of data-driven AI applications.
Design, build, and maintain scalable data pipelines within Google Cloud Platform (GCP).
Design and deploy autonomous AI agents and orchestration frameworks to automate complex workflows.
Leverage the full Google suite—including Vertex AI, Agent Builder, and BigQuery—to create production-ready RAG systems.
Develop and optimise vector databases and traditional data models (SQL/NoSQL).
Partner with technical and non-technical teams to identify data-rich automation opportunities.
Take accountability for the performance, data integrity, and safety of AI deployments.
Requirements
Proven experience building production-grade data solutions on GCP, with deep knowledge of BigQuery, Cloud Run, and Cloud Functions.
Hands-on experience building AI applications with Vertex AI (Model Garden, Studio) and integrating Gemini or other LLMs.
Proficiency in developing AI agents using Vertex AI Agent Builder, LangChain, or similar frameworks to manage complex logic flows.
Expert-level Python skills, specifically for data engineering, API development, and LLM integration.
Experience with RAG architectures, managing vector databases (e.g., Vertex AI Search, Pinecone, or pgvector), and handling structured vs. unstructured data flows.
A "DataOps" mindset—strong experience with Git, CI/CD, and version control for both code and data/model lineage.
A relevant Google Cloud certification (Professional Data Engineer or Professional Machine Learning Engineer) would be highly beneficial.
Tech Stack
BigQuery
Google Cloud Platform
NoSQL
Python
SQL
Benefits
25 days' annual leave
Birthday day off and Wellbeing leave
Opportunity to extend your stay if travelling for a race event
Health Cash Plan and access to the Aviva Digital Workplace app