Crusoe is on a mission to accelerate the abundance of energy and intelligence through sustainable technology. The company is seeking a Staff Product Manager for Managed Inference to own the complete product lifecycle and drive the development of its inference services in collaboration with engineering and go-to-market teams.
Responsibilities:
- Own the end-to-end product lifecycle for Crusoe’s Managed Inference services, including roadmap definition, execution, and iteration
- Translate customer needs, market signals, and technical constraints into clear product requirements and prioritization
- Partner closely with Engineering, Infrastructure, and Platform teams to deliver scalable, reliable inference services
- Drive product decisions across performance, reliability, cost efficiency, and developer experience
- Define and track success metrics for inference services in production environments
- Collaborate with go-to-market teams to support product launches, positioning, and customer adoption
- Communicate product strategy and tradeoffs clearly to cross-functional partners and leadership
Requirements:
- 6+ years of experience in technical product management or engineering roles with product responsibilities
- Experience building and launching cloud infrastructure, platform, or AI/ML services used in production
- Strong understanding of cloud infrastructure (e.g., AWS, GCP, Azure) and modern compute architectures
- Familiarity with the machine learning lifecycle, particularly model deployment, inference, and monitoring
- Strong communication and collaboration skills, with experience working across engineering, product, and business teams
- Demonstrated ability to operate independently with strong product judgment and a bias for action
- Bachelor's degree in Computer Science or a related technical field (or equivalent experience)
Nice to Have:
- Experience building developer-facing platforms or services
- Exposure to inference-as-a-service, model serving frameworks, or ML infrastructure tooling
- Participation in developer communities or open-source projects
- Strong interest in trends across AI infrastructure and inference at scale