Partners with Regions Technology partners to design, build, and maintain the data structures and systems supporting Data and Analytics and Data Product use cases
Builds data pipelines to collect, arrange, and store data in Regions’ big data environments
Builds robust, testable programs for moving, transforming, and loading data using cloud-based big data tools such as AWS Glue, Step Functions, Snowflake, Kafka, and SageMaker
Builds applications that integrate with AI, including knowledge base management and document processing
Designs and implements event-based architectures for streaming data
Coordinates design and development with Data Products Partners, Data Scientists, Data Management, Data Modelers, and other technical partners to construct strategic and tactical data stores
Ensures data is prepared, arranged, and ready for each defined business use case
Designs and deploys frameworks and microservices to serve data assets to data consumers
Collaborates and aligns with technical and non-technical stakeholders to translate customer needs into data design requirements, and works to deliver world-class visualizations and data stories while ensuring data quality and integrity
Provides consultation to all areas of the organization that plan to use data to make decisions
Supports team members in the development of information delivery and aids in the automation of data products
Acts as a trusted advisor and partner to business leaders, assisting in the identification of business needs and data opportunities, understanding key drivers of performance, interpreting business case data drivers, turning data into business value, and helping guide the overall data and analytics strategy
Ensures compliance with risk management programs, rules and regulations, and cybersecurity practices
Identifies opportunities for and supports process improvements
Applies disciplined change management practices
Requirements
Ph.D. and four (4) years of experience in a quantitative/analytical/STEM field
Or Master’s degree and six (6) years of experience in a quantitative/analytical/STEM field or related technical field
Or Bachelor’s degree and eight (8) years of experience in a quantitative/analytical/STEM field or related technical field
Five (5) years of hands-on programming experience with Python/PySpark, Scala, SQL, and Terraform
Five (5) years of working experience with cloud-based big data technologies such as Elastic MapReduce (EMR), AWS Glue, BigQuery, or Snowflake