How We Built a Privileged Legal Copilot for a Global Bank
Design principles from a Lex LLM deployment serving 2,500 legal and compliance professionals across four continents.
Last year a global bank asked us to build a legal copilot for 2,500 lawyers and compliance officers. The mission: cut research time, maintain legal privilege, and keep regulators comfortable with AI inside the institution.
The Guardrails Brief
- Data security: All ingestion had to run inside the bank's Azure tenancy with customer-managed keys.
- Privilege: Every document classification needed to map to the bank's privilege taxonomy.
- Audit: The bank's audit team required a complete record of prompts, responses, and reviewer feedback.
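To make the audit requirement concrete, here is a minimal sketch of what a complete interaction record might look like. The field names and the append-only log are illustrative assumptions, not the bank's actual schema:

```typescript
// Illustrative audit record shape -- field names are assumptions,
// not the bank's actual taxonomy or schema.
interface AuditRecord {
  queryId: string;
  userId: string;          // pseudonymised reviewer identity
  prompt: string;
  response: string;
  privilegeLabel: string;  // label from the bank's privilege taxonomy
  reviewerFeedback?: "accept" | "reject" | "escalate";
  timestamp: string;       // ISO 8601, for regulator-facing timelines
}

// Append-only log: every interaction is recorded before the
// response is surfaced to the user.
const auditLog: AuditRecord[] = [];

function recordInteraction(rec: AuditRecord): void {
  auditLog.push(Object.freeze(rec));
}
```

Freezing each record before appending is one simple way to signal immutability to downstream code, though a production deployment would persist to write-once storage rather than memory.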
Ingestion & Retrieval
We categorised 11 million documents across policy libraries, advice memos, regulator correspondence, and transaction templates. Lex LLM connectors enforced format constraints and red-flagged documents missing privilege labels.
// Simplified ingestion workflow
const workflow = createPipeline({
  sources: [sharePoint(), documentum(), secureBlob()],
  classifiers: [privilegeClassifier, jurisdictionTagger],
  sanitizer: piiScrubber,
  storage: vectorStore.withEncryption(),
});
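The red-flagging step can be sketched as a simple partition over incoming documents. `partitionByPrivilegeLabel` and the `Doc` shape below are stand-ins for illustration, not the Lex LLM connector API:

```typescript
// Hypothetical document shape; real connectors carry far more metadata.
interface Doc {
  id: string;
  privilegeLabel?: string; // from the bank's privilege taxonomy
}

// Red-flag documents that arrive without a privilege label so they
// are held for human review instead of being indexed.
function partitionByPrivilegeLabel(docs: Doc[]): { indexable: Doc[]; flagged: Doc[] } {
  const indexable: Doc[] = [];
  const flagged: Doc[] = [];
  for (const doc of docs) {
    (doc.privilegeLabel ? indexable : flagged).push(doc);
  }
  return { indexable, flagged };
}
```

The key design choice is fail-closed behaviour: an unlabelled document is never indexed by default.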
Evaluation & Monitoring
We built a bespoke evaluation suite covering factual accuracy, citation coverage, tone, and privilege leakage. Lawyers scored responses directly in the UI; poor answers triggered automatic re-training tasks.
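One way to wire reviewer scores to re-training tasks is a simple gating function over the four evaluation axes. The 0.7 threshold and the score shape are illustrative assumptions, not the deployed suite's actual logic:

```typescript
// Illustrative reviewer score across the four evaluation axes.
interface ReviewScore {
  factualAccuracy: number;   // 0..1
  citationCoverage: number;  // 0..1
  tone: number;              // 0..1
  privilegeLeakage: boolean; // any leakage is an automatic failure
}

// A response is flagged for re-training if it leaks privileged
// content, or if any scored axis falls below the (assumed) threshold.
function needsRetraining(score: ReviewScore, threshold = 0.7): boolean {
  if (score.privilegeLeakage) return true;
  return [score.factualAccuracy, score.citationCoverage, score.tone]
    .some((s) => s < threshold);
}
```

Treating privilege leakage as a boolean hard-fail rather than a weighted score reflects the guardrails brief: no amount of fluent prose offsets a leak.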
Adoption
Within six weeks the copilot had handled 14,000 queries. Average research time dropped from 47 minutes to 8 minutes. More importantly, regulators reviewing the deployment appreciated the auditable evidence of controls.
Legal AI is not about chasing novelty. It’s about giving teams trusted answers faster than the inbox and safer than consumer tools.