Lead technical discovery with customers to understand workflows, constraints, data realities, and success criteria—and translate that into a clear technical plan.
Architect end-to-end AI systems (GenAI + traditional ML when needed), spanning data ingestion, retrieval, orchestration, evaluation, deployment, and UX.
Design and deliver rapid prototypes that make abstract ideas concrete and de-risk key decisions (cost/latency/reliability, integration feasibility, quality thresholds).
Develop reference architectures and technical frameworks that demonstrate what’s possible with current and emerging AI technologies.
Guide clients through the AI landscape—capabilities, limitations, tradeoffs, and what it takes to operate AI systems in production over time.
Partner with sales teams to translate technical concepts into clear value propositions, proposals, business cases, and technical presentations.
Define technical requirements and scalable architecture patterns for production-grade systems (security, governance, observability, evaluation, and reliability).
Collaborate with fulfillment teams to ensure solutions align with customer needs and leverage Tribe AI’s capabilities—and help create smooth handoffs from pre-sales to delivery.
Stay current on frontier developments, bringing in SMEs from the Tribe network and collaborating with partners (including OpenAI/Anthropic) to keep customers ahead of the curve.
Act as a trusted advisor to technical and executive stakeholders navigating AI implementation and adoption challenges.
Requirements
5+ years leading technical workstreams, with experience as a Solutions Architect or in a similar customer-facing technical role in AI, cloud, data platforms, or backend infrastructure.
Strong customer-facing instincts: you can run discovery, ask the right questions, and turn ambiguity into clear architecture and next steps.
Experience scoping projects and estimating effort to build end-to-end applications (including assumptions, risks, milestones, and dependencies).
Strong fluency in Generative AI and the evolving LLM ecosystem; you’ve built or consulted on AI/GenAI systems in production.
Expertise with production cloud infrastructure (AWS, Azure, and/or GCP) and modern deployment patterns.
Systems thinker who can balance speed and rigor: you’re comfortable making tradeoffs and explaining them to both engineers and execs.
Excellent communication skills (written + verbal): you can make complex technical concepts feel simple and actionable.