Designing and building next-generation data product engineering patterns on modern cloud platforms including Snowflake and Databricks.
Developing reusable engineering assets such as frameworks, build kits, CI/CD templates, and performance optimization approaches.
Partnering with Enablement and Execution teams to operationalize and scale data engineering patterns across delivery teams.
Evaluating, testing, and experimenting with emerging data and AI tools, platforms, and services.
Participating in technical proofs of concept, comparing alternative solutions, and making data-driven recommendations for platform and tool rationalization.
Documenting project outcomes, transition plans, adoption guides, and solution usage scripts to support enterprise rollout.
Supporting platform modernization efforts through hands-on development, tuning, and optimization.
Collaborating with data product owners, architects, and platform teams to align engineering solutions with enterprise data strategy.
Requirements
Bachelor’s Degree in a quantitative field such as computer science, data science, mathematics, or statistics.
5 to 7 years of statistical and/or analytical experience.
Typically, 6+ years of experience in data engineering, analytics engineering, or platform engineering roles.
Demonstrated experience building and supporting data solutions in a cloud environment.
Proven track record of designing reusable components or standards adopted by multiple teams.
Experience working in regulated or large-scale enterprise environments preferred.
Strong organizational skills with the ability to work on multiple initiatives concurrently.
Deep understanding of banking and financial institutions concepts.
Knowledge of banking regulation and requirements for regulatory reporting.
Strong analytical, organizational, and problem-solving skills.
Hands-on experience with programming languages such as Python and SQL.
Proficiency with big data technologies including Hadoop, Hive, and Spark.
Expertise in visual analytics tools such as Power BI, Tableau, or equivalent platforms.
Experience with Power Platform tools such as Power Automate and Power Apps.
Proven track record in automating and optimizing ETL processes at scale.
Hands-on experience with cloud platforms (e.g., Azure, AWS, GCP) and cloud-native data services.
Excellent written and verbal communication skills for documenting technical processes and engaging with cross-functional teams.
Tech Stack
AWS
Azure
Cloud
ETL
Google Cloud Platform
Hadoop
Python
Spark
SQL
Tableau
Benefits
Healthcare (medical, dental, vision)
Basic term and optional term life insurance
Short-term and long-term disability
Pregnancy disability and parental leave
401(k) and employer-funded retirement plan
Paid vacation (from two to five weeks depending on salary grade and tenure)
Up to 11 paid holiday opportunities
Adoption assistance
Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year unless otherwise provided by law