Capital Bank, N.A. is a publicly traded company headquartered in Maryland, providing banking services primarily in Maryland, Washington, DC, and Northern Virginia. It is seeking a Senior Data Engineer to manage and optimize enterprise data pipelines and database solutions, ensuring high-performance data platforms and reliable integrations across the bank's operations.
Responsibilities:
- Serve as the subject-matter expert for ETL and database solutions, collaborating with business stakeholders and IT teams to define requirements, gather data, and implement optimized data solutions
- Design, implement, and maintain data systems in Snowflake to ensure data scalability and accessibility
- Implement and manage data lakes and data warehouses, creating pipelines and data models to enable efficient analytics and reporting
- Establish and document strategies for managing data transfer processes, including secure file transfers (SFTP), batch data processing, and real-time streaming
- Build and optimize ETL pipelines for data extraction, transformation, and loading into operational databases or analytical platforms
- Integrate and support data visualization tools such as Power BI, Sisense, Google Looker, Tableau, or similar platforms to enable actionable insights for business stakeholders
- Develop and maintain optimized data models for dashboards and reporting, ensuring compatibility with visualization tools
- Plan, coordinate, and implement database migrations, upgrades, and patches with minimal downtime
- Define and enforce database governance policies, including data integrity, security, and compliance with regulatory requirements
- Analyze and resolve database performance issues by optimizing queries, indexes, and schema designs
- Partner with vendors to evaluate, select, and implement database tools, services, and technologies; stay informed about product roadmaps and industry trends
- Develop disaster recovery and high-availability solutions, including replication, clustering, and failover
Requirements:
- Bachelor's degree or higher in Computer Science, Information Systems, or a related field
- 6+ years of experience in data engineering, ETL, and database management; experience with cloud-based databases and in financial services preferred
- Experience in Database Administration (DBA), managing and optimizing databases for performance and security
- 3+ years of experience designing and building data lakes and data warehouses using platforms like Microsoft Fabric, Snowflake, Amazon Redshift, or Google BigQuery
- 2+ years of experience using data visualization tools like Power BI, Sisense, Google Looker, Tableau, or similar platforms
- Experience with managing data transfers and file processes, including SFTP, secure data pipelines, and real-time or batch data movement
- Experience proactively identifying, monitoring, and resolving data quality issues in accordance with established data quality standards
- Excellent communication skills and the ability to collaborate effectively across teams, including explaining technical concepts to non-technical stakeholders
- Expertise in cloud-based database platforms, such as Azure SQL, Amazon RDS, Google Cloud Spanner, and Snowflake
- Strong knowledge of data lake and data warehouse architectures, including designing efficient schemas, partitioning strategies, and optimizing storage
- Proficiency with data integration tools and technologies, such as Apache Kafka, Apache Spark, Talend, or Informatica
- Hands-on experience building and maintaining ETL pipelines to support large-scale data environments
- Advanced SQL skills and familiarity with programming languages like Python or Java for data manipulation and automation
- Experience with data visualization platforms, including building and optimizing dashboards using Sisense, Power BI, Google Looker, Tableau, or similar tools
- Experience with CI/CD tools (e.g., GitLab, Azure DevOps, Jenkins) and data pipeline monitoring tools (e.g., Airflow, Apache NiFi, Azure Data Factory)
- Strong understanding of database security best practices, including encryption, access controls, and compliance with regulatory standards
- Ability to manage data file transfers and processing workflows effectively
- Experience with database monitoring and performance tuning in cloud and hybrid environments
- Preferred: understanding of data relationships, along with experience developing and optimizing SQL queries, stored procedures, and database functions in both OLTP and OLAP systems
- Strong organizational and problem-solving skills in Agile or fast-paced environments