First Citizens Bank is seeking a Senior Data Engineer to support its enterprise data warehouse and critical business functions. The role involves designing and maintaining the data platform, establishing data integration procedures, and collaborating with global technology teams to deliver business value.
Responsibilities:
- Design, build, and maintain a data platform that supports data integrations for the Enterprise Data Warehouse, Operational Data Store, and Data Marts, with appropriate data access, security, privacy, and governance
- Establish enterprise-scale data integration procedures, data pipelines and frameworks across the data development life cycle. Suggest and implement appropriate technologies to deliver resilient, scalable, and future-proof data solutions
- Create data ingestion pipelines in data warehouses and other large-scale data platforms
- Create scheduled and trigger-based ingestion patterns using scheduling tools
- Create performance-optimized DDLs for row-based or columnar databases such as Oracle, Postgres, or Netezza, per the Logical Data Model
- Tune the performance of complex data pipelines and SQL queries
- Perform impact analysis of proposed changes on existing architecture, capabilities, system priorities, and technology solutions
- Manage deliverables of developers, perform design reviews and coordinate release management activities
- Estimate and provide timelines for project activities. Identify, document, and communicate technical risks, issues, and alternative solutions discovered during the project
- Drive automation, identify inefficiencies, optimize processes and data flows, and recommend improvements
- Use agile engineering practices and various data development technologies to rapidly develop and implement efficient data products
- Collaborate with Product Owners on PI goals, PI planning, requirement clarification, and delivery coordination
- Provide technical support for production incidents and failures
- Work with global technology teams across different time zones (primarily US) to deliver timely business value
Requirements:
- Bachelor's Degree and 4 years of experience in data engineering, big data technologies, and cloud platforms, OR High School Diploma or GED and 8 years of experience in data engineering, big data technologies, and cloud platforms
- Experience designing and building Data Warehouses and Data Lakes. Good knowledge of data warehouse principles and concepts
- Technical expertise working on large-scale Data Warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server
- Experience with public cloud-based data platforms especially Snowflake and AWS
- Expertise in design and development of complex data pipelines
- Experience building solutions with industry-leading ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), or IBM DataStage
- Experience with ELT tools such as dbt, Fivetran, and AWS Glue
- Expert in SQL, with development experience in at least one scripting language (e.g., Python); adept at tracing and resolving data integrity issues
- Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, AWS Redshift, Google BigQuery)
- Expertise in Logical and Physical Data Modeling using Relational or Dimensional Modeling practices, and in high-volume ETL/ELT processes
- Experience tuning data pipelines and database objects for optimal performance
- Experience with GitLab version control and CI/CD processes
- Experience in the financial industry is a plus