SmartAsset is an online destination for consumer-focused financial information and advice, and they are seeking a Senior Data Engineer to help build and optimize their data platform. In this role, you'll design and implement high-impact, cloud-native solutions that improve data reliability and scale with the business.
Responsibilities:
- Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology
- Develop and manage robust data integrations with external vendors and organizations (including complex API integrations)
- Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions
- Lead and take ownership of assigned technical projects in a fast-paced environment
- Drive continuous improvement in the quality, security, efficiency, and scalability of our data pipelines and infrastructure
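As an illustrative sketch of the vendor API-integration work described above (the function and parameter names here are hypothetical, not SmartAsset's actual integrations), robust external integrations typically wrap each call in retry logic with exponential backoff:

```python
import time

def fetch_with_retries(fetch, max_attempts=3, base_delay=0.5, sleep=time.sleep):
    """Call `fetch()` until it succeeds, retrying transient failures with
    exponential backoff. `fetch` is any zero-argument callable that returns
    vendor data or raises on a transient error; `sleep` is injectable so the
    backoff can be tested without real delays."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted all attempts; surface the error to the pipeline
            sleep(base_delay * 2 ** (attempt - 1))  # 0.5s, 1s, 2s, ...
```

Injecting the `sleep` callable keeps the backoff policy unit-testable, one of the practices a production integration layer tends to need.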
Requirements:
- 7+ years of hands-on experience in data engineering
- Bachelor of Science degree in Computer Science or equivalent practical experience
- 4+ years of dedicated experience building and maintaining complex ETL/ELT pipelines
- 3+ years of Python development experience, specifically building production-grade data APIs using FastAPI or similar frameworks
- Strong expertise in SQL, including advanced query optimization and performance tuning
- Expert-level proficiency in dbt, including managing models, macros, incremental models, and dynamic tables
- Extensive hands-on experience with Snowflake, including performance optimization, data recovery using Time Travel, and advanced data modeling techniques
- Practical experience with key AWS data services such as Kinesis Data Streams, Firehose, Redshift (including Spectrum), EMR, and Lambda, as well as container orchestration services like ECS and EKS for deploying and managing data applications
- Proven experience designing and building near real-time data processing systems
- Experience with data observability tools for proactive monitoring of data quality, lineage, and pipeline health
- Familiarity with modern data orchestration platforms such as Argo or Airflow
- Hands-on experience with the full software development lifecycle (SDLC), including strong CI/CD practices for data pipelines and proficiency with data testing frameworks
- Demonstrated ability to identify and adopt new technologies that improve data quality and reliability
- Deep understanding of the data lifecycle, emphasizing the importance of high-quality data in applications, machine learning, business analytics, and reporting
- Proven track record of mentoring junior team members
Nice to have:
- Experience with data ingestion tools such as Singer
- Knowledge of analytics and visualization tools such as Tableau, Plotly, and pandas
- Experience in the financial services industry
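Several requirements above (data testing frameworks, data observability, pipeline quality) come down to asserting invariants on data as it moves. A minimal, framework-free sketch of that idea (the field names are hypothetical):

```python
def check_batch(rows, required_fields=("id", "amount")):
    """Validate a batch of records before loading: every row must contain
    each required field with a non-null value. Returns a list of
    (row_index, field) failures; an empty list means the batch passes."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:  # missing key or explicit null
                failures.append((i, field))
    return failures
```

In practice a dedicated framework (e.g. dbt tests or a data observability platform) replaces hand-rolled checks like this, but the underlying contract, fail fast on bad rows before they reach the warehouse, is the same.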