Design, build, and operate highly scalable, low-latency data platforms
Create modern data lakehouse architectures, real-time data pipelines, and analytics-ready data models that support trading, risk, reporting, and regulatory requirements
Collaborate closely with trading desks, product teams, AI/analytics teams, and architects to deliver reliable, secure, and high-performance data solutions
Requirements
7–12+ years of experience in data engineering, backend engineering, or distributed systems
Strong programming expertise in Python, Scala, and/or Java
Advanced SQL skills and hands-on experience with distributed data processing frameworks such as Apache Spark or Apache Flink
Extensive experience with streaming platforms including Kafka, Kinesis, or Pulsar
Hands-on experience with data lake and data warehouse technologies such as Databricks, Snowflake, Amazon Redshift, or similar platforms
Proven experience building real-time or near-real-time data pipelines
Strong understanding of data modeling, distributed systems, scalability, and performance optimization
Experience in Wealth Management or Capital Markets trading systems
Familiarity with OMS/EMS platforms such as Charles River Development (CRD), Aladdin, FIS, or similar systems
Strong knowledge of market data across equities, fixed income, derivatives, and other asset classes
Understanding of the trade lifecycle, including order capture, execution, allocation, clearing, and post-trade processing
Experience with cloud-native data platforms on AWS, Azure, or GCP
Exposure to real-time analytics, risk management systems, and regulatory reporting platforms