Own end-to-end delivery of data platform engagements: managing scope, timeline, budget, team coordination, and client satisfaction from kickoff through handoff
Lead cross-functional delivery teams that may include project managers, business analysts, data engineers, developers, and change management professionals — ensuring the right skills are in place at each stage
Manage engagement health proactively: identify delivery risks early, escalate appropriately, and maintain clear communication with client stakeholders throughout
Serve as the primary client relationship owner during delivery, building trust through reliable execution and transparent progress reporting
Conduct technical requirements gathering and capability assessments to establish a sound foundation for each engagement
Design and implement modern cloud data architectures across AWS, Azure, and Google Cloud — including data lakes, lakehouses, data warehouses, and real-time streaming platforms
Lead migration of legacy on-premises data systems to cloud-native architectures, ensuring performance, scalability, and cost-efficiency
Build and oversee ETL and data pipelines using cloud-native automation and orchestration tools
Design and implement machine learning infrastructure on cloud platforms — including feature stores, model training pipelines, experiment tracking, and model serving and monitoring in production
Architect data foundations that support downstream ML, AI, and agentic workflows, including intelligent document processing, knowledge retrieval, and structured/unstructured data integration
Establish and enforce data quality, governance, and observability standards across client environments, including ML model performance monitoring and drift detection
Evaluate and recommend appropriate tooling across the modern data and ML stack based on client context, capability, and long-term roadmap
Partner with sales teams as a technical authority during discovery calls, client workshops, and solution presentations
Develop proposals including solution architecture, scope of work, resource requirements, and budgetary estimates
Requirements
Bachelor’s degree in a related discipline and 8 years’ experience in a related field OR a Master’s degree and 6 years’ experience OR a Ph.D. and 3 years’ experience OR 12 years’ experience in a related field
1+ years of hands-on experience building and deploying machine learning solutions in a cloud environment — including model training pipelines, feature engineering, and production model serving using platforms such as Google Vertex AI, Amazon SageMaker, or Azure Machine Learning
3+ years building and maintaining ETL and data pipelines using cloud-native automation and orchestration tools
Hands-on experience with cloud data warehousing and lakehouse platforms (e.g., Google BigQuery, Amazon Redshift, Azure Synapse Analytics, Microsoft Fabric, Databricks, Snowflake)
Proficiency in Python for data engineering tasks; working knowledge of at least one additional language relevant to the data stack (e.g., SQL at scale, Scala, Java)
Strong communication skills — able to present technical architectures to executive audiences and document solutions clearly for both technical and non-technical stakeholders
Tech Stack
Amazon Redshift
AWS
Azure
BigQuery
Cloud
ETL
Java
Python
Scala
SQL
Benefits
The Company offers eligible employees flexible paid vacation, allowing them to take as much time off as they deem consistent with their duties
seven paid holidays throughout the calendar year
up to 160 hours of paid wellness time annually, for their own wellness or that of family members
additional paid time off in the form of bereavement leave
time off to vote
jury duty leave
volunteer time off
military leave
parental leave
health care insurance (medical, dental, vision)
retirement planning (401(k))
paid sick leave