Adobe is a company that empowers everyone to create through innovative platforms and tools. It is seeking a full-time Data & AI Engineer to build data integrations using AWS technologies for its Digital Experience enterprise customers.
Responsibilities:
- Collaborate with data architects, enterprise architects, solution consultants, and product engineering teams to capture customer data integration requirements, conceptualize solutions, and build the required technology stack
- Collaborate with the enterprise customer's engineering team to identify data sources, profile and quantify the quality of those sources, develop tools to prepare data, and build pipelines that integrate customer and third-party data sources with Adobe solutions
- Develop new features and improve existing data integrations with customers' data ecosystems
- Encourage team to think out-of-the-box and overcome engineering obstacles while incorporating new innovative design principles
- Work with Project Managers to scope, bill, and forecast time for customer solutions, demonstrating agent-based AI and automation strategies
- Develop and improve features that incorporate LLMs, AI agents, and/or multi-agent orchestration for dynamic data integration, workflow automation, and real-time business value
Requirements:
- Experience as an enterprise Data Engineer from a consulting background
- AWS Certified Data Engineer – Associate or AWS Certified Cloud Practitioner
- 10+ years of experience building, operating, and maintaining fault-tolerant, scalable data processing integrations on AWS
- 10+ years of experience with the Python programming language, preferably using PySpark
- Software development experience working with Apache Airflow, MongoDB, MySQL
- 4+ years working with AWS AI/ML and agentic services such as SageMaker and Bedrock, as well as vector databases (OpenSearch, Pinecone)
- Demonstrated experience (or significant exposure) designing, integrating, and scaling agentic AI systems, such as LLM agents, multi-agent frameworks (LangChain, LangGraph, LangSmith, MLflow), autonomous orchestration, or decision-making pipelines; capable of evolving data engineering solutions into intelligent, agent-based offerings
- Strong capacity to manage numerous projects concurrently is a must
- Ability to identify and resolve problems in production-grade, large-scale data processing workflows
- Excellent communication skills (we're a geographically distributed team)
- Experience creating and maintaining unit tests and continuous integration
- Passion for creating intelligent data pipelines that customers love to use
- Experience using Docker or Kubernetes is a plus
- Experience and knowledge of Web Analytics or Digital Marketing
- Experience and knowledge of Customer Data Platforms (CDP) or Data Management Platforms (DMP)
- Experience and knowledge of Adobe Experience Cloud solutions