Role Overview
Job Purpose
The Systems and Data Engineer is responsible for the hands-on development, maintenance, and evolution of the Group’s internal integration infrastructure, home-built micro-services, and AI tooling. The role sits at the intersection of the Systems and Data teams, playing a critical part in ensuring that data flows reliably between platforms and that the business can move quickly and confidently, supported by modern AI tooling. This is a builder role — the person in this position will be writing code, shipping integrations, and directly contributing to the Group’s AI adoption strategy.
Key Facts
- The Group is scaling rapidly, with growing demand for integrations, automation, and AI-native workflows across every business function.
- Our home-built Enterprise Service Bus (ESB) is a strategic asset that requires ongoing development and maintenance as new systems are onboarded through organic growth and M&A activity.
- AI tooling — including custom MCP connectors and internal agents — is live, evolving infrastructure that needs hands-on contributors now.
- Reliable, high-quality data flow between systems is foundational to every commercial and operational function across the Group.
- This role will grow with the team — there is a clear path from Junior to Mid-level to Senior as the platform matures.
Role Responsibilities
Integration Development & ESB
- Design, build, and maintain integrations between internal and third-party systems using the Group’s home-built Enterprise Service Bus.
- Own the reliability and quality of integration pipelines, proactively monitoring for failures, data inconsistencies, or performance degradation.
- Collaborate with the Systems team to scope and deliver new integration requirements as the Group onboards additional platforms.
- Document integration patterns, field mappings, and data flows to ensure maintainability and knowledge continuity.
Micro-Service Development & Maintenance
- Maintain and extend home-built micro-services, including connectors to Salesforce and source-of-truth CRM systems.
- Ensure micro-services are resilient, well-tested, and version-controlled, with clear deployment and rollback processes.
- Contribute to the evolution of internal platforms and supporting databases, working with the broader team to improve architecture over time.
- Identify and resolve bugs, performance issues, and technical debt across the service landscape.
AI Tooling & MCP Development
- Design and build custom Model Context Protocol (MCP) connectors that give AI agents structured, secure access to internal systems and data sources.
- Work closely with the Head of Group Systems & Data to identify high-value automation and AI use cases across both teams.
- Contribute to prompt engineering for internal AI workflows, ensuring outputs are accurate, reliable, and useful for business users.
- Stay current with the rapidly evolving AI tooling landscape (Claude, Cursor, agent frameworks) and bring relevant innovations back to the team.
Cross-Team Collaboration
- Work fluidly between the Systems and Data teams, adapting priorities in line with sprint commitments and business needs.
- Translate requirements from non-technical stakeholders into well-scoped engineering tasks.
- Contribute to peer code review, shared coding standards, and team knowledge-sharing practices.
Requirements
Essential
- Proficiency in Python — able to write clean, maintainable, production-grade code independently.
- Practical experience building or consuming REST APIs and integration patterns (webhooks, polling, event-driven).
- Familiarity with Salesforce or similar CRM platforms at an integration/API level.
- Experience with or strong interest in prompt engineering and working with LLMs (Claude, GPT, or similar).
- Hands-on experience with Claude and/or Cursor as development tools — comfort using AI to accelerate engineering work.
- Understanding of relational databases and SQL.
- Good instincts around error handling, logging, and building for reliability.
Desirable
- Experience building MCP servers or working with the Model Context Protocol.
- Exposure to ESB patterns, message queuing, or event-driven architectures.
- Familiarity with micro-service architecture and containerisation (Docker).
- Experience with source control best practices (Git, branching strategies, PR workflows).
- Any exposure to Snowflake, BigQuery, or similar data warehouse tooling.
Behaviours
- Builder mentality — not satisfied with “good enough”, always looking to improve or extend what exists.
- Curious and self-directed — comfortable picking up new tools and technologies without needing hand-holding.
- Detail-oriented — takes pride in clean, well-documented code and reliable pipelines.
- Collaborative — works well across teams and communicates clearly about blockers, progress, and trade-offs.
- High agency — takes ownership of problems and follows them through without needing to be chased.
Tech Stack
- BigQuery
- Docker
- Python
- SQL
Benefits
**Why work for us**
We place huge importance on caring for and developing our people. If you join us, you can expect a good work-life balance, along with the training and support you need to succeed in your role and continue to progress. We are a socially conscious company, but one that also likes to have fun. We offer a generous holiday allowance, flexible hours, the option to buy and sell holiday, enhanced maternity pay, free breakfast, fruit, and drinks, regular socials, and much more.