Lead the design and evolution of highly scalable, fault-tolerant ETL/ELT pipelines.
Drive the strategy for real-time messaging and stream processing using Kafka and RabbitMQ to ensure sub-second data availability.
Act as the subject matter expert for ClickHouse, optimising complex schema designs, indexing strategies, and query performance for large-scale financial datasets.
Oversee the deployment of data services within cloud environments, implementing advanced security protocols and data governance standards essential for the finance industry.
Collaborate with senior leadership to align data strategy with business objectives.
Mentor data engineers through code reviews and technical guidance.
Implement advanced monitoring and automated recovery systems to ensure the integrity and quality of high-stakes financial data.
Requirements
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Proven experience in data engineering, with a strong background in designing and implementing ETL processes within cloud environments.
Experience in the finance or trading technology sector, with a track record of handling real-time market or transactional data.
Strong programming skills in Python, with experience in developing robust, maintainable, and scalable data processing pipelines.
Extensive SQL expertise, including writing and optimising complex queries against large datasets.
Excellent problem-solving skills and the ability to collaborate effectively within a team.
Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
Tech Stack
ClickHouse
Cloud
ETL
Kafka
Python
RabbitMQ
SQL
Benefits
Hybrid working arrangement
Opportunities for career growth, including regional exposure
Complimentary snacks and beverages available in the office pantry