Architect and maintain cutting-edge data systems that power analytics, AI, and operational decision-making
Take ownership of end-to-end data lifecycles, designing pipelines, models, and architectures that support real-time insights and machine learning at scale
Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases
Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI
Manage ingestion, storage, processing, and delivery of structured and unstructured data
Continuously tune infrastructure for high concurrency, low latency, and cost efficiency
Ingest telemetry, API, and application data in real time to power dashboards and AI-driven tools
Provision datasets for ML/AI workloads, integrating with SageMaker, Snowflake ML, and MLOps best practices
Ensure robust data governance, compliance (GDPR, SOC 2), and enterprise-grade security
Work closely with Product, Engineering, DevOps, and Analytics teams to align data solutions with business goals
Requirements
Significant experience in technology roles, with 5+ years in data engineering on real-time, scalable cloud platforms (AWS and Snowflake preferred)
Experience in SaaS/product companies managing large-scale IoT, telemetry, or digital datasets is highly desirable