Astir IT Solutions, Inc. is seeking a highly technical, hands-on Expert Data Engineer to design, build, and operate large-scale data platforms and pipelines. The role demands deep expertise in modern data engineering practices and cloud data platforms; the candidate will serve as a technical leader, driving architecture decisions while remaining involved in hands-on development.
Responsibilities:
- Design, build, and maintain end-to-end ETL/ELT pipelines and scalable data platforms
- Develop robust data integrations and APIs with external vendors and systems
- Build production-grade data APIs using Python (FastAPI or similar frameworks)
- Optimize Snowflake data models, SQL queries, and pipeline performance
- Implement and manage dbt models, macros, and incremental pipelines
- Work closely with data analysts, data scientists, and engineering teams to deliver high-impact data solutions
- Ensure data quality, reliability, scalability, and security across data pipelines
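To give a flavor of the incremental-pipeline work described above, here is a minimal sketch of a watermark-based, idempotent incremental load. It uses Python's stdlib sqlite3 purely for illustration (the table and column names are hypothetical); in this role the equivalent logic would live in Snowflake and dbt incremental models.

```python
import sqlite3

# Illustrative only: hypothetical raw/dim tables standing in for a
# Snowflake source and a dbt incremental model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (id INTEGER, updated_at TEXT, payload TEXT)")
conn.execute("CREATE TABLE dim_events (id INTEGER PRIMARY KEY, updated_at TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [(1, "2024-01-01", "a"), (2, "2024-01-02", "b"), (2, "2024-01-03", "b2")],
)

def incremental_load(conn):
    # Only process rows newer than the current high-water mark,
    # and upsert on the key so reruns are idempotent.
    (watermark,) = conn.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM dim_events"
    ).fetchone()
    conn.execute(
        """
        INSERT INTO dim_events (id, updated_at, payload)
        SELECT id, MAX(updated_at), payload FROM raw_events
        WHERE updated_at > ?
        GROUP BY id
        ON CONFLICT(id) DO UPDATE SET
            updated_at = excluded.updated_at,
            payload = excluded.payload
        """,
        (watermark,),
    )
    conn.commit()

incremental_load(conn)
rows = conn.execute("SELECT id, updated_at FROM dim_events ORDER BY id").fetchall()
print(rows)  # the duplicate id=2 resolves to its latest updated_at
```

The watermark plus key-based upsert is the same pattern dbt's incremental materialization applies with `is_incremental()` and a `unique_key`.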
Requirements:
- 10+ years of hands-on data engineering experience
- Strong Python development experience (data pipelines and APIs)
- Expert SQL development and query optimization
- Extensive Snowflake experience (data modeling, optimization, Time Travel)
- Advanced dbt implementation
- Experience with AWS data services (Kinesis, Firehose, Redshift, EMR, Lambda)
- Experience deploying applications using ECS or EKS
- Experience with real-time or streaming data pipelines
- Familiarity with Airflow or Argo orchestration
- Experience with data observability and monitoring tools
- Experience with CI/CD for data pipelines
- Familiarity with Singer, Tableau, Pandas, or Plotly
- Financial services industry experience is a plus
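As a flavor of the data-quality and observability expectations listed above, here is a minimal sketch of batch-level quality checks in pure Python. The check names and fields are hypothetical; in practice these would be dbt tests or rules in a data-observability tool, not hand-rolled code.

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool

def run_checks(rows):
    """Run simple not-null, uniqueness, and range checks over a batch
    of record dicts (field names are illustrative)."""
    ids = [r.get("id") for r in rows]
    return [
        CheckResult("id_not_null", all(i is not None for i in ids)),
        CheckResult("id_unique", len(ids) == len(set(ids))),
        CheckResult(
            "amount_non_negative",
            all(r.get("amount", 0) >= 0 for r in rows),
        ),
    ]

batch = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
results = run_checks(batch)
print([(c.name, c.passed) for c in results])
```

Failing checks would typically page via the monitoring stack rather than silently passing bad data downstream.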