Python · AI · LLM · Large Language Models · Claude · RAG · FastAPI · Communication · Collaboration
Role Overview
Design, implement, and maintain AI-driven product features end-to-end using LLM APIs.
Develop both user-facing chat-based UX flows and behind-the-scenes AI integrations that power Libra's core functionality.
Build and optimize RAG systems, including document parsing, semantic chunking, and retrieval pipelines.
Build scalable, maintainable backend systems using FastAPI and modern Python tooling.
Design and implement agent orchestration systems for complex, multi-step legal workflows.
Build infrastructure for evaluations and benchmarking, including continuous monitoring and alerting to ensure quality across all AI-powered features (e.g., using Langfuse).
Collaborate closely with Product and Legal Engineering to rapidly prototype, test, and refine new AI use cases.
Stay ahead of the curve by exploring new models, tools, and techniques, including AI coding agents such as Cursor and Claude Code.
Requirements
Deep experience building products that integrate large language models (LLMs) via APIs.
Proficiency with FastAPI, Python, and common data science packages.
Strong understanding of software engineering and security best practices.
Experience with modern AI developer tools (e.g., Cursor, Claude Code, Copilot).
Excellent communication skills in English; German fluency is a plus.
You have an entrepreneurial mindset and a deep sense of urgency in achieving your goals.
You're a self-starter who enjoys autonomy but thrives in a collaborative team.
You take a pragmatic approach to problem-solving, focused on impact rather than hype.
Tech Stack
Python
FastAPI
Benefits
Flexible working policy (2 days per week in the office, 3 days per week from home)
A high-impact engineering role at the forefront of AI and knowledge automation.
The opportunity to build and ship features that directly shape how lawyers work.
Close collaboration with world-class engineers, product thinkers, and legal experts.
A culture that values speed, ownership, and craftsmanship.