About this role
AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. We rank among the leaders in areas like application development and AI/ML, and our people-first culture has earned us multiple Best Place to Work awards.
WHY JOIN US
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!
ABOUT THE ROLE
As a Cloud Data Architect, you’ll lead the design of a next-generation AWS data platform from the ground up — building the Medallion Lakehouse architecture that will become the unified data foundation for the entire organization. This is a greenfield opportunity for a senior architect who wants genuine ownership: shaping ingestion patterns, governance frameworks, and scalable data models across S3, Redshift, Glue, dbt, and Airflow.
WHAT YOU WILL DO
- Lead the design and implementation of enterprise data architecture following the Medallion framework;
- Develop data models, pipelines, and schemas for data unification, governance, and analytics;
- Build and optimize AWS-based data infrastructure using core services;
- Implement Lakehouse architecture integrating structured and unstructured data;
- Ensure performance, scalability, and cost efficiency of the data ecosystem;
- Design and oversee data ingestion from multiple on-prem and cloud systems;
- Define integration strategies using batch, streaming, and event-driven patterns;
- Collaborate with governance teams to enforce data security, access control, and quality;
- Work with engineering and business teams to deliver scalable data solutions;
- Support data platform modernization initiatives and roadmap execution;
- Document architecture, standards, and data flows for diverse audiences.
MUST HAVES
- 8+ years of experience in data engineering or architecture;
- 5+ years of experience working with AWS data ecosystems;
- Expertise with AWS core services (S3, Glue, Redshift, Lambda, Athena, Kinesis, EMR);
- Experience with data integration tools (dbt, Step Functions, Airflow, AWS DMS, Kafka, Kinesis);
- Strong knowledge of data modeling (Medallion/Lakehouse, dimensional modeling, Data Vault, Kimball);
- Experience with infrastructure as code (Terraform, CloudFormation);
- Proficiency in Python, SQL, and Spark (PySpark) for ETL;
- Strong understanding of data governance, quality, and security;
- Experience with enterprise-scale architecture and data integration;
- Upper-intermediate English level.
NICE TO HAVES
- Practical experience and certification with Boomi AtomSphere.
PERKS AND BENEFITS
- Professional growth: Mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: USD-based pay with education, fitness, and team activity budgets.
- Exciting projects: Modern solutions with Fortune 500 and top product companies.
- Flextime: Flexible schedule with remote and office options.