Dwelly is a UK-based, AI-enabled lettings and property management platform that is rapidly growing through acquisitions. They are seeking a Data Engineer to build their data engineering function, focusing on designing and maintaining production data pipelines, optimizing database performance, and ensuring compliance with data protection regulations.
Responsibilities:
- Design and maintain a unified data architecture: database schemas, data models, and micro-architecture solutions to ensure scalability and reliability
- Optimize database performance at all levels: indexing, partitioning, clustering, and tuning configuration parameters
- Ensure full compliance with the UK GDPR, the Data Protection Act 2018, and other relevant regulations: data masking, consent management, retention policies, and privacy impact assessments
- Optimize queries, schemas, and indexes where needed
- Set up basic data quality checks
- Support GDPR and UK data protection requirements, including:
  - Data masking
  - Access control
  - Retention policies
- Turn data notebooks and calculation logic into reliable, production-ready pipelines, ensuring scalability, reliability, and reproducibility
Requirements:
- Write clean, readable, maintainable code
- Have real experience supporting data pipelines in production
- Have worked with a data warehouse (BigQuery or similar)
- Have strong experience in GCP
- Understand orchestration, monitoring, and performance tuning
- Make practical engineering decisions independently
- Strong communication skills and fluency in English
- Startup mentality: resilience, adaptability, and ability to thrive in a fast-paced environment
- Customer-centric mindset: focus on delivering value to end-users or clients
- Strong problem-solving skills: ability to approach challenges logically and propose practical solutions
- Experience with AWS or Azure
- Experience with message queues or distributed systems
- Basic CI/CD experience for data pipelines