Tential Solutions is seeking a high-caliber Data Engineer to support a large-scale post-merger integration between two tier-1 financial institutions. The role involves designing and implementing data pipelines, managing data extraction from legacy systems, and optimizing data architectures to ensure a seamless integration.
Responsibilities:
- Design and implement robust, low-latency streaming pipelines using Apache Flink and Kafka to handle high-volume banking transactions
- Manage data extraction from legacy systems using write-ahead log (WAL)-based Change Data Capture (CDC) in Postgres to ensure data consistency during the migration
- Build and optimize Lakehouse architectures using Databricks and Snowflake, leveraging Delta Lake for ACID transactions and data versioning
- Refine ETL/ELT processes to meet strict financial regulatory and performance benchmarks
- Work alongside Big 4 partners and client stakeholders to map data lineage and resolve integration blockers in a fast-paced merger environment
Requirements:
- Minimum 5 years of professional experience in Data Engineering, ideally within Financial Services or Fintech
- Expert-level proficiency in Apache Flink, Kafka, and Databricks
- Strong hands-on experience with Postgres (CDC) and Snowflake data warehousing
- Deep understanding of Delta Lake and Lakehouse design principles
- Proficiency in Python, Scala, or Java for streaming applications
- Understanding of banking data domains (e.g., retail banking, payments, or risk management) is highly preferred
- Ability to hit the ground running in a high-pressure environment with an immediate start
- Strong ability to document technical workflows and present findings to both technical and non-technical leadership