AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. The Senior Data Engineer role is central to transforming large, diverse datasets into reliable insights that support research and strategic decisions across a global financial platform. You will design and build scalable data architectures and collaborate with data scientists and stakeholders to enhance the data ecosystem.
Responsibilities:
- Design and build scalable Data Lakes, Data Warehouses, and Data Lakehouses
- Design and implement robust ETL/ELT processes at scale using Python and pipeline orchestration tools like Airflow
- Develop ingestion workflows from diverse third-party APIs and data sources
- Manage and optimize file formats such as Parquet, Avro, and ORC for high-performance data retrieval
- Work with AI development tools to support machine learning initiatives and advanced analytics
- Act as a technical consultant to gather requirements, understand business goals, and translate them into technical roadmaps
- Build AWS and on-premises infrastructure using Terraform and other infrastructure-as-code tools
Requirements:
- You must be authorized to work for ANY employer in the US (e.g., Green Card holders, TN visa holders, GC EAD, H4 EAD, U4U with EAD), as we are unable to provide new or transfer existing employment visa sponsorship at this time
- Bachelor's degree in computer science/engineering or other technical field, or equivalent experience
- 5+ years of hands-on Python experience
- 5+ years of experience with data processing and analytics libraries such as Pandas, Polars, PySpark, and DuckDB
- 2+ years of experience with Big Data technologies such as Spark and Snowflake
- Expert-level knowledge of Airflow or similar pipeline orchestration tools
- Deep understanding of Medallion Architecture, columnar file formats, and database technologies including SQL, NoSQL, and Lakehouse architectures
- Proven ability to work with third-party APIs for complex data ingestion
- Proficiency with cloud platforms such as AWS and GCP, as well as Snowflake, including advanced SQL optimization
- Upper-intermediate English level
- Familiarity with the fintech industry and financial data domains
- Documentation skills for data pipelines, architecture designs, and best practices
Nice to have:
- Experience with OpenSearch or Elasticsearch
- Familiarity with AWS SageMaker Studio and Jupyter for data analysis
- Experience with Terraform
- Knowledge of Scala