Thomson Reuters is building the AI platform that will power the next decade of tax and accounting products, including CoCounsel for Tax and Accounting. As a Principal Software Engineer, you will own core backend and AI orchestration systems that turn frontier models into reliable, production‑grade workflows at scale.
Responsibilities:
- Lead multi‑quarter initiatives that cut across AI, product, and infra (e.g., a new orchestration layer, a low-latency retrieval system, or a unified knowledge base)
- Mentor staff and senior engineers, raising the bar on system design, code quality, and AI integration practices across the org
- Stay on the frontier of new model capabilities, collaborating closely with AI/ML engineers, researchers, designers, and PMs to translate model and industry improvements into reliable, user‑facing workflows that accountants trust
- Help shape the team’s roadmap, technical strategy, and engineering culture – from experimentation practices to testing, rollout, and postmortems
- Architect and implement backend services (Python, FastAPI, PostgreSQL, AWS, Vercel) that power generative AI agents, complex workflows (e.g., tax filing, advisory, audit), and document‑centric experiences
- Build and evolve AI orchestration: routing, tool calling, MCP servers, multi‑step workflows, safety and guardrails, and robust error handling around third‑party LLMs (OpenAI, Anthropic, and others)
- Design for high‑throughput, low‑latency AI workloads: caching, queuing, rate‑limiting, model failover, and cost/performance tradeoffs
- Work with large‑scale data: millions of documents, retrieval and search, vector stores, and indexing strategies tailored to tax and accounting use cases
- Establish and refine SLOs, observability, and incident response for AI systems that must be correct, auditable, and trustworthy in professional workflows
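To give a flavor of the orchestration work described above, here is a minimal sketch of provider failover with retries and response caching around third‑party LLM calls. All names (`call_with_failover`, the provider callables) are hypothetical illustrations, not an existing Thomson Reuters API, and a production system would add rate limiting, observability, and guardrails on top.

```python
import time
from typing import Callable


class AllProvidersFailed(Exception):
    """Raised when every configured LLM provider fails for a request."""


def call_with_failover(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
    cache: dict[str, str],
    retries_per_provider: int = 2,
    backoff_seconds: float = 0.0,
) -> str:
    """Try each provider in priority order, with simple retry and caching.

    `providers` is a list of (name, callable) pairs; each callable takes a
    prompt and returns a completion, raising on failure. This is an
    illustrative sketch, not a production client.
    """
    if prompt in cache:  # serve repeated prompts from cache to control cost
        return cache[prompt]
    errors: list[str] = []
    for name, call in providers:
        for attempt in range(retries_per_provider):
            try:
                result = call(prompt)
                cache[prompt] = result
                return result
            except Exception as exc:
                errors.append(f"{name} attempt {attempt + 1}: {exc}")
                time.sleep(backoff_seconds)  # crude backoff between retries
    raise AllProvidersFailed("; ".join(errors))
```

The same shape generalizes to routing by model capability or cost: the provider list becomes a policy decision made per request.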
Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, a related field, or equivalent experience
- 8-10+ years of experience in backend development, building scalable web services and APIs
- Deep Python expertise and experience with production systems using frameworks like FastAPI (or similar), relational databases (PostgreSQL or equivalent), and a major cloud provider (AWS preferred)
- Strong background in distributed systems: data modeling, API contracts, observability, resilience patterns, and performance tuning under load
- Proven track record owning large, complex projects end‑to‑end: architecture, execution, rollout, and long‑term operation
- Excellent communication skills and the ability to partner with product, design, and ML teams in a fast‑moving environment
- Demonstrated interest in AI systems and new engineering paradigms: LLMs, agents, retrieval, or similar – you care about what 'AI‑native' software should look like
- Hands‑on experience integrating LLM APIs (e.g., OpenAI, Anthropic) into production applications, including prompt/response management, cost controls, and safety considerations
- Experience with AI‑adjacent infrastructure: vector databases, embeddings, semantic search, or custom retrieval pipelines
- Informed opinions and hands‑on experience with automated testing, reliability, and release practices for systems with nondeterministic model behavior
- Prior experience in domains where correctness, auditability, and compliance matter (fintech, tax, legal, or similar), or strong interest in applying AI in those contexts
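As a small illustration of the semantic‑search and retrieval experience listed above, the core of a retrieval pipeline is ranking document embeddings by similarity to a query embedding. The sketch below assumes embeddings have already been computed elsewhere; function names are hypothetical, and a real system would use a vector database rather than an in‑memory dict.

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def top_k(
    query_vec: list[float],
    doc_vecs: dict[str, list[float]],
    k: int = 3,
) -> list[str]:
    """Return the ids of the k documents most similar to the query."""
    scored = sorted(
        doc_vecs.items(),
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]
```

At the scale described in the posting (millions of documents), the linear scan above gives way to approximate nearest‑neighbor indexes, but the ranking contract stays the same.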