ezCater is the #1 food tech platform for workplaces in the US. They are seeking a Senior Data Product Manager to lead and scale their Enterprise Data Platform, owning the long-term product vision, strategy, and roadmap while ensuring it supports analytics and AI experiences across the company.
Responsibilities:
- Lead Enterprise Data Platform product strategy (in partnership with Engineering). Define and continuously refine the Enterprise Data Platform vision and product strategy, grounded in company and Enterprise Data pillar goals and business outcomes, and connect that strategy to the broader Enterprise Data and company roadmaps. Partner closely with Principal/Staff Engineers on the long-term technical direction and trade-offs so product and technical strategy stay tightly aligned
- Identify and quantify platform value. Partner with senior stakeholders (VP/Director level) across Data, Engineering, Analytics, and Product to understand current-state friction and future-state needs. Use data and experimentation to prioritize which platform capabilities (e.g., ingestion patterns, promotion paths, observability, cost controls) to deliver first
- Make our Enterprise Data Platform and Hub vision real. Partner with Platform Capabilities and Engineering leadership to turn our Enterprise Data Platform and Enterprise Data Hub vision into an actionable product roadmap—using the current Platform Capabilities plan as a starting point—covering architecture baseline, high-level target reference architecture, capability roadmap, and platform cost/consumption model
- Own and evolve a multi-quarter Data Platform roadmap. Build and maintain a multi-quarter, multi-team roadmap for Data Platform capabilities that balances foundational work (architecture evolution, trusted and scalable platform services, governance hooks, Nova/AI integration, cost and observability) with high-leverage use cases (ML/DS workloads, AI/NL-enabled analytics, real-time reporting, data product delivery surfaces, BI consumption paths)
- Enable ML and Data Science on the platform. Partner with Data Science and ML Engineering to ensure the platform supports model development and deployment needs (for example, data access patterns, performance, feature computation, and monitoring) without creating “shadow” pipelines
- Turn cross-functional needs into reusable platform capabilities. Look across Enterprise Data's portfolio (platform, governance & quality, finance, growth/CCO, product/tech, etc.) and Activation pods to identify patterns—across ingestion, modeling, promotion, monitoring, access, and NL/AI usage—and convert them into reusable platform capabilities and guardrails, rather than one-off solutions. Work closely with the Senior DPM for Data Product Activation to define clear contracts between platform and activation (e.g., readiness criteria, SLAs/SLOs, semantic and metrics layers, and serving surfaces)
- Translate platform needs into clear technical requirements. Convert ambiguous platform and data needs into clear, actionable requirements and platform “contracts” (SLAs/SLOs, readiness criteria, security and governance expectations, cost and observability requirements) for Data Platform and Data Engineering teams. Make and communicate trade-offs across value, effort, risk, and timing for platform initiatives
- Drive cross-workstream execution for platform initiatives. Orchestrate Data Platform, Data Engineering, Analytics, and partner teams through discovery, planning, delivery, and launch of platform initiatives such as the Enterprise Data Hub, major migration waves, and AI/NL enablement work. Ensure alignment on scope, sequencing, and ownership across producing, shaping, and consuming teams, and make sure platform dependencies and risks are visible in integrated plans and RAID logs
- Act as a peer to Data Engineering and Architecture. Own the product “what and why” for the Enterprise Data Platform, while Engineering and Architecture own the “how” (detailed design and implementation). Co-own a healthy delivery flow with a shared Definition of Ready, clear acceptance criteria, and predictable outcomes
- Deliver, measure, and iterate on platform capabilities. Own the lifecycle of Data Platform capabilities: from value discovery and design, through build and UAT, into launch, iteration, and deprecation. Ensure what ships is not just technically correct, but usable, trusted, observable, cost-effective, and adopted by downstream teams building AI, NL, ML/DS, and traditional analytics experiences
- Increase adoption and track outcomes for platform usage. Treat the Data Platform as a product with real ROI. Define success metrics (e.g., time-to-ship data products, SLIs/SLOs for platform health, platform utilization and cost, adoption of AI/NL-powered analytics paths), lead UAT with key consumers, and own documentation, enablement, and change management. Monitor usage and business impact and adjust the roadmap accordingly
- Be the authoritative expert on our Enterprise Data Platform. Become the go-to expert on our Enterprise Data Platform: architecture, capabilities, constraints, and how it supports analytics, BI, data products, ML/DS, and AI/NL use cases. Resolve questions and inconsistencies by tracing lineage and platform flows, understanding upstream/downstream systems, and working through governance, quality, and ownership issues
Requirements:
- 5+ years of experience as an owner of Data or Analytics Products, with direct Data Product Management experience strongly preferred; experience owning platform- or infrastructure-adjacent data products is a plus
- 7+ years working in or directly with Data Engineering, Data Platform, or Analytics teams, ideally in complex, multi-system environments
- Demonstrated success owning end-to-end data or platform products: from problem discovery and requirements through launch, adoption, and measurable business impact, ideally including reliability, cost, or scalability work on a shared data platform
- Deep familiarity with data warehousing and data platforms (e.g., Snowflake, Redshift, BigQuery), data lakes, and ELT patterns, as well as experience working with modeling frameworks (e.g., dbt) and metrics/semantic layers that can support NL/AI analytics
- Strong SQL proficiency and comfort exploring data and platform metadata (e.g., logs, cost, usage) yourself to validate requirements, debug issues, and size opportunities
- Experience with BI tools (e.g., Sigma, Tableau, Looker) and how they consume data from platforms, including governance, performance, cost, and how they will participate in AI/NL analytics (e.g., NL features, grounding on Enterprise Data Platform data)
- Experience partnering with Data Science and/or ML Engineering teams and supporting ML/DS use cases on a shared data platform (for example, feature computation, training data pipelines, model scoring/serving, and monitoring)
- Proven ability to build and execute multi-quarter product plans that align business and engineering priorities, and to make and communicate trade-offs across competing initiatives—ideally in the context of large, multi-team platform initiatives
- Solid project/program delivery skills, including tracking roadmap progress against estimates and team velocity in a Scrum/Agile environment (e.g., Jira, Confluence), and working inside larger cross-functional programs and planning phases
- Excellent communication and stakeholder management skills: able to explain platform and architectural concepts (including AI/NL implications) to non-technical audiences, influence senior leaders, and work seamlessly with Engineering, Architecture, Analytics, Governance, and business stakeholders
- Hands-on experience with AI-assisted analytics or natural-language query tools (for example, Hex, Snowflake Cortex, or similar) to explore warehouse data, validate requirements, and prototype natural-language analytics or self-serve experiences
- A disposition that is friendly, flexible, pragmatic, and curious, with a desire to learn something new every day and to help raise the bar for the broader data, platform, and product teams
- Ability to travel up to 5 days per quarter for Together Weeks, team gatherings, and other events, when applicable
- Demonstrated ability to design and evaluate natural-language analytics flows—grounding NL answers in governed warehouse data, thinking through guardrails, and partnering with engineering/analytics to measure quality, latency, and trust
- Familiarity with modern AI-powered data platform patterns (vector/search, semantic layers, conversational analytics, or agentic workflows) and how they change expectations for how business users and customers discover and consume data