
Case Study: Enhancing Legal Contract Review with AI Assistance

  • Writer: Amrit Kumar
  • Jul 27
  • 6 min read

Updated: Aug 12



Introduction — Rina's Story: A Human Lens on a Legal Problem

Rina Thomas, a contract review manager at a multinational telecom company, logs into her CLM dashboard. A compliance deadline looms: her team must review over 15 documents by end of day. Manually, and fast. Deadlines. Discrepancies. Fatigue. Risk.

Just when the pressure peaks, she clicks on a contract card and activates Clause Assistant.

Boom — within seconds:

  • Risky indemnity clauses are flagged

  • Expiry dates are pulled into a checklist

  • Obligations are summarized

  • A smart timeline appears, suggesting what needs attention first

By noon, she’s reviewed 7 files, sent feedback, and even got a compliance nod.

AI didn’t just save time; it gave her confidence.

"I feel like I can finally see the document, not just read it."




The Current Scenario

Consider a scenario where thousands of contracts, each containing hundreds of pages, must be meticulously reviewed to pinpoint risky clauses, positions, numeric values, or dates that could pose legal risk.

This isn’t optional.

For lawyers, overlooking details can lead to devastating consequences. But manually scanning each contract is labor-intensive and error-prone.

That’s where our tool steps in.


Research & Ecosystem Understanding

Who We Talked To
  • Legal Managers: Assign workflows, prioritize tailored reviews

  • Reviewers / Paralegals: Hands-on document markup and feedback

  • Model Trainers: Needed precise feedback on data extraction quality



We conducted workshops and 1:1 interviews with:

  • 15 internal lawyers

  • Legal ops, AI trainers, compliance managers

  • Stakeholders across client & Sirion side

We didn’t just ask what they did. We asked what slowed them down, what gave them doubt, and where they placed their trust.

We wanted to:

  • Understand their real-world review pain points

  • Capture critical workflows (e.g. M&A, Legal Audits)

  • Identify tech gaps in manual contract analysis

To keep the whole process documented, we created a flow diagram to communicate the overall journey.


The journey of the app from inception to execution

Insight Collection & Key Pain Points

We uncovered:
  • Need to add comments on extracted values

  • Lack of status visibility for reviewed vs pending fields

  • Repetitive manual metadata editing

  • No autosave when editing fields

  • Confusion around cloning values across similar docs

Contract Review Workflow

We mapped the current journey from document upload to clause extraction → review → final assignment, identifying frictions like:

  • Lack of visibility into what was already reviewed

  • No standardised deviation detection

  • Ambiguity in AI confidence vs human verification




Understanding the Contract Anatomy

To design a system that thinks like a legal reviewer, we had to first understand the document the way they do.

We broke contracts into core components, each requiring different types of interactions:

  • Clauses (e.g. Termination, Governing Law): Often long-form, context-heavy, needing summarization + tagging.

  • Obligations: Categorised under 8 key types (Data Privacy, Payment Terms, SLAs, etc.), tracked for compliance.

  • Metadata: Structured values like Effective Date or Expiry Date—ideal for AI-driven extraction.

  • Tables, Signatures, Service Levels: These required visual parsing and often manual verification.


This decomposition let us align UI patterns to content type—rather than treating everything as flat text.



Structure of Contract Document Content


Problem Statement.
How might we improve the efficiency of contract review for lawyers?

Our design had to deliver on three fronts:

  • Supporting first-time accuracy meant designing interfaces that could guide reviewers to the right decisions without the need to second-guess themselves. We prioritized clear visual cues, inline guidance, and intelligent defaults to help users make confident choices on their first interaction.

  • Scalability wasn’t just about handling more documents—it meant accounting for different contract types (MSA, NDA, SOW), review styles, and team structures. Our system was built to adapt to varied legal workflows and data complexity without breaking consistency.

  • Familiarity and learnability were critical for adoption. Our users were legal professionals—not tech-savvy product testers—so we rooted the design in patterns that mirrored their mental models. From layout structure to terminology, we ensured the experience felt intuitive from day one, yet powerful enough to grow with them.


UX Flow — Metadata Review.

To ensure a frictionless and focused workflow, I mapped out the core user journey around metadata field review:

  1. Enter the showpage: Document opens with metadata highlights visible.

  2. Jump to extracted values: Highlights draw the reviewer directly to key fields.

  3. Accept, Edit or Retag: Interactions are kept lightweight, allowing immediate action.

  4. Optional summary/commenting: Adds context where automation may fall short.

  5. Mark as reviewed: Status updates visually, feeding back into the team and the AI model.

This flow reflects an intentionally tight feedback loop—where every review action helps future extractions improve. It also minimized user fatigue by emphasizing simplicity, relevance, and speed.
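The five-step flow above behaves like a small state machine, where each action moves a metadata field closer to "reviewed." The sketch below is illustrative only; the state and action names are my own shorthand, not the product's actual implementation or API:

```typescript
// Illustrative state machine for the metadata review flow.
// States and actions are hypothetical names, not the real product's API.
type ReviewState = "Extracted" | "InReview" | "Edited" | "Reviewed";
type ReviewAction = "open" | "accept" | "edit" | "retag" | "markReviewed";

const transitions: Record<ReviewState, Partial<Record<ReviewAction, ReviewState>>> = {
  Extracted: { open: "InReview" },                                     // steps 1–2: open doc, jump to value
  InReview: { accept: "Reviewed", edit: "Edited", retag: "InReview" }, // step 3: lightweight actions
  Edited: { markReviewed: "Reviewed" },                                // step 5: status feeds back to team + model
  Reviewed: {},                                                        // terminal for this sketch
};

function next(state: ReviewState, action: ReviewAction): ReviewState {
  const target = transitions[state][action];
  if (!target) throw new Error(`Invalid action "${action}" in state "${state}"`);
  return target;
}
```

For example, accepting an extracted value as-is is just `next(next("Extracted", "open"), "accept")`, which lands on `"Reviewed"` — the tight loop the flow is built around.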

UX Flow — Metadata Review

The possible features we discussed with stakeholders.

Prioritization of features
Prioritization Framework

To turn insights into action, we needed clarity on what to build—and when.

I introduced an Impact vs Effort prioritization matrix, collaboratively evaluated with PMs and engineers. We classified ideas into:

High Impact, Quick Wins: These were no-brainer improvements—simple to implement but immensely helpful to the end-user.

  • Sequential review mode: Helped reviewers move field-by-field, reducing decision fatigue.

  • Metadata search & filter: Gave users control over large volumes of data.

  • UI for in-line annotation: Allowed fast edits without context-switching.

Harder but Critical Features: These would take more time, but unlock powerful workflows:

  • OCR-backed metadata tagging: Critical for scanned documents.

  • Auto-detection of reviewer counterparts: Helped sync workflows when multiple users were involved.

  • Live review status: Made multi-reviewer workflows visible and collaborative.

This prioritization exercise became our north star—helping us scope sprints realistically while still being user-driven.
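The quadrant logic behind an Impact vs Effort matrix reduces to a few lines of code. This is a rough illustration with invented 1–5 scores, not our actual backlog data or scoring scale:

```typescript
// Hypothetical Impact vs Effort quadrant classifier.
// Feature names and 1–5 scores are illustrative only.
interface Feature { name: string; impact: number; effort: number } // scores 1–5

function quadrant(f: Feature): string {
  const highImpact = f.impact >= 3;
  const lowEffort = f.effort <= 3;
  if (highImpact && lowEffort) return "Quick Win";
  if (highImpact) return "Harder but Critical";
  if (lowEffort) return "Nice to Have";
  return "Deprioritize";
}

// Example: two items from the case study, with made-up scores.
const backlog: Feature[] = [
  { name: "Sequential review mode", impact: 5, effort: 2 },      // → "Quick Win"
  { name: "OCR-backed metadata tagging", impact: 5, effort: 5 }, // → "Harder but Critical"
];
```

Plotting every candidate through a function like this keeps the team honest: the matrix stays a shared artifact rather than a one-off whiteboard exercise.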

Blueprinting the Wireflow

Before final UI design began, I created an interactive wireflow blueprint that served as both a design prototype and a product alignment tool.

  • We explored 3 layout strategies, testing how each would impact readability, decision-making, and task speed.

  • Each flow was annotated with logic for edge cases—like untagged fields, OCR failures, and multiple reviewers.

  • These wireframes were shared with engineering to preempt feasibility concerns early in the cycle.

The wireflow became our shared design contract—ensuring fewer surprises, smoother handoffs, and stronger cross-team clarity.


Screenshot of rough wireframes and flows


Designing a Review Status Language

To bring clarity to the review experience, I introduced a semantic and visual status system—a shared language between AI and human reviewers.

Each clause or data point is tagged with a clearly defined status, enabling lawyers to focus on what needs attention without second-guessing.

Core Status States Defined:

  • Extracted – AI has identified the data, but it’s untouched

  • In Review – Assigned, pending human validation

  • Synced – Duplicate data detected and synced across documents

  • Reviewed – Approved as-is

  • Needs Action – Requires edits or discussion

  • Edited – Modified by a reviewer

  • Rejected – Invalid or irrelevant

  • Approved – Signed off by a human reviewer

  • Uncategorised – Clause status not yet defined

  • Moved – Relocated to another category

Status chips for clearly communicating review status


These statuses were designed with consistent color, iconography, and interaction patterns, and were used across:

  • Clause chips

  • Table views and filters

  • Review progress bars

  • Tooltip metadata for collaboration history

Result: Lawyers could now trust the system to tell them what’s pending, what’s done, and what needed their attention—without losing track.
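One way to make that "shared language" concrete is a single source of truth that clause chips, filters, and progress bars all read from. The sketch below assumes this shape; the colors and the `needsHumanAttention` flag are placeholders I invented for illustration, not the actual design tokens:

```typescript
// Sketch of a single source of truth for review statuses.
// Colors and the needsHumanAttention flag are illustrative, not real design tokens.
type ReviewStatus =
  | "Extracted" | "InReview" | "Synced" | "Reviewed" | "NeedsAction"
  | "Edited" | "Rejected" | "Approved" | "Uncategorised" | "Moved";

interface StatusMeta { label: string; color: string; needsHumanAttention: boolean }

const STATUS: Record<ReviewStatus, StatusMeta> = {
  Extracted:     { label: "Extracted",     color: "gray",   needsHumanAttention: true },
  InReview:      { label: "In Review",     color: "blue",   needsHumanAttention: true },
  Synced:        { label: "Synced",        color: "teal",   needsHumanAttention: false },
  Reviewed:      { label: "Reviewed",      color: "green",  needsHumanAttention: false },
  NeedsAction:   { label: "Needs Action",  color: "orange", needsHumanAttention: true },
  Edited:        { label: "Edited",        color: "purple", needsHumanAttention: false },
  Rejected:      { label: "Rejected",      color: "red",    needsHumanAttention: false },
  Approved:      { label: "Approved",      color: "green",  needsHumanAttention: false },
  Uncategorised: { label: "Uncategorised", color: "gray",   needsHumanAttention: true },
  Moved:         { label: "Moved",         color: "gray",   needsHumanAttention: false },
};

// A progress bar or filter can then derive "what needs my attention" directly:
function pendingCount(statuses: ReviewStatus[]): number {
  return statuses.filter(s => STATUS[s].needsHumanAttention).length;
}
```

Because every surface derives its display from the same map, a status can never render one way in a chip and another way in a filter — which is exactly what makes the system trustworthy.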


Immersive Review with Assist Bars

In traditional contract review, users constantly jump between documents, scrolling and scanning to locate related clauses. This context-switching creates friction and slows decision-making.

To address this, we introduced Assist Bars — horizontal, context-aware markers that surface AI-extracted insights and actions directly alongside the document. Instead of breaking focus to consult side panels or separate views, users can now review clauses, metadata, and AI recommendations in a continuous, in-document flow.

The result? Less mouse travel, faster comprehension, and a more immersive review experience where information feels seamlessly integrated with the reading journey.


Assist bars for immersive review experience


Rethinking Visual Cues for Clarity

During testing, we compared icon-based indicators with geometric shapes to link assist bars to document text. While icons offered a familiar interface pattern, they introduced unexpected friction.

Think-aloud sessions revealed that geometric shapes, like dots, created a stronger visual connection between the assist bar and the relevant clause. Users found them faster to process and less cognitively demanding, leading to smoother navigation.

Outcome: Switching to geometric cues improved intuitiveness for 85% of participants, reduced hesitation, and supported a more fluid review flow.




Test result visualisation
  • “85% of users found dots more intuitive than icons for linking assist bars to text.”
  • “Users struggled to interpret the icons, leading to confusion and slower navigation.”
  • “Think-aloud sessions revealed that geometric shapes provide clearer visual feedback.”





Final design landing page




Key components

Final review card

AI-Powered Clause Context in Focus

This design surfaces clause type, AI-suggested value, and contextual confidence in a single, compact module. The primary classification (e.g., Agreement Type → Master Service Agreement) is paired with:

  • Reference count & navigation – Users can scan multiple source snippets (1/3) without leaving the view.

  • Supporting rationale – “Based on the last 12 documents…” offers instant transparency into AI reasoning, reducing trust gaps.

  • Direct actions – Accept, edit, or discard in one click, avoiding modal interruptions.

By keeping the clause context, decision tools, and navigation in-line with the review flow, the design reduces context switching, speeds up validation, and builds reviewer trust in AI suggestions.




Doc Review Interactions




Reflection

This project wasn’t just about crafting interfaces — it was about earning trust.

  • With legal reviewers grounded in the comfort of static PDFs & words, we had to create an experience that felt familiar yet meaningfully better.

  • With AI models that thrive on precise, consistent feedback, we needed to design interactions that made quality input effortless.

  • With stakeholders chasing speed, we had to prove that velocity and accuracy could co-exist.

Design became the bridge — translating legal expertise into structured AI learning, and turning automation into something users could believe in. In the end, it wasn’t pixels that mattered most, but the confidence we built between human judgment and machine intelligence.




Amrit.

I love sharing thoughts and lessons from my design journey. Simple thoughts, but I believe even the simplest ideas can spark growth.

9560255168

  • LinkedIn