SmartAsset is an online destination for consumer-focused financial information and advice, helping people make smart financial decisions. The Data Engineer will design and implement high-impact, cloud-native solutions that improve data reliability and support data-driven decisions across the organization.
Responsibilities:
- Design, implement, and maintain scalable data engineering solutions following Agile methodology
- Develop and manage reliable data integrations with external vendors and organizations (including API integrations)
- Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-quality data solutions
- Take ownership of assigned components and deliverables within larger technical projects
- Contribute to improving the quality, security, efficiency, and scalability of data pipelines and infrastructure
Requirements:
- Bachelor of Science degree in Computer Science or an equivalent field
- 3-5 years of experience building and maintaining ETL/ELT pipelines
- 2+ years of Python development experience, including building and maintaining production-grade data services (e.g., FastAPI or similar frameworks)
- Strong SQL skills, including performance tuning and query optimization
- Hands-on experience with dbt, including developing models and managing incremental tables
- Practical experience with Snowflake, including data modeling and query optimization
- Experience with AWS data services such as S3, Redshift, Lambda, or similar tools for building and operating data workflows
Preferred Qualifications:
- Experience working with near real-time or streaming data systems
- Familiarity with data observability or monitoring tools
- Exposure to orchestration platforms such as Airflow or Argo
- Experience contributing to CI/CD pipelines and testing frameworks for data workflows
- Understanding of the data lifecycle and the role of high-quality data in analytics and reporting
- Experience with data ingestion tools like Singer is a plus
- Knowledge of analytics tools such as Tableau, Plotly, or Pandas
- Experience in the financial services industry