Client Results · Legal Services

Contract review in twelve minutes

Automated clause extraction and risk flagging, with the lawyer retaining full sign-off. Review time dropped from hours to minutes.

12 min

average review time, down from 3+ hours

99%

clause detection accuracy on sampled output


The situation

The firm is a UK commercial practice with somewhere between twenty and forty lawyers, depending on how you count the paralegals. Contracts are the bread and butter. Supplier agreements, NDAs, services contracts, licensing deals, framework agreements, the occasional joint venture. In a normal week the team saw anywhere from forty to sixty new or amended agreements coming in from clients, and each one needed a proper review before anyone signed off on it.

The managing partner described the reality to us plainly. A standard commercial contract of thirty or forty pages was taking one of their lawyers three hours to review, and that was on a good day. A more complex agreement could stretch to a full afternoon. The work itself was not intellectually difficult; most of it was pattern matching. Reading the indemnity clause, checking it against what the firm considered acceptable, noting the deviation, moving on to the limitation of liability, and so on through the document. Careful, necessary work, but the kind of work that three years of legal training does not really prepare you to enjoy at ten o'clock at night.

The second problem was harder to talk about, and they were honest about it. Different lawyers flagged different risks. One senior associate would pick up a notice period she considered aggressive and flag it. Another, reviewing a similar contract the following week, would let it through without comment because in his view it sat within the normal commercial range. Both answers were defensible. Neither was wrong. But the firm's clients were getting slightly different advice depending on whose desk the contract landed on, and the partners were uncomfortable with that. Consistency across a team of twenty-odd reviewers is genuinely hard to enforce when the playbook lives in everyone's heads.

What we did

We built them a contract review tool that reads an incoming agreement, extracts the clauses the firm cares about, and flags anything that deviates from their standard playbook. The playbook itself was the interesting part of the project. Before we wrote any code, we sat down with three of their senior lawyers and documented what actually gets flagged and why. What counts as an acceptable indemnity cap. When a unilateral termination right is a dealbreaker and when it is fine. How the firm thinks about governing law clauses for international counterparties. That document became the thing the tool measures contracts against.

We should address the obvious concern directly because it matters in legal work. Is it safe to put an AI anywhere near a contract review workflow? Our answer, and the firm's answer after some careful testing, is that it depends entirely on what you ask it to do. The tool does not give legal advice. It does not decide whether a clause is acceptable. It reads the contract, pulls out the clauses that match the playbook categories, and highlights where the language diverges from the firm's standard position. The lawyer then reads the flags, makes the judgement, and signs off. Nothing leaves the firm without a qualified human reviewing it. That boundary is not a marketing line; it is how the tool is built and how the workflow is structured.
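The workflow described above, extracting clauses by category and flagging divergence from the playbook for a lawyer to judge, can be sketched roughly as follows. This is a minimal illustration only: the category names, heading patterns, and thresholds here are invented for the example, and the real system uses far more robust extraction than regex heading matches.

```python
# Minimal sketch of playbook-driven clause flagging.
# All categories, patterns, and thresholds are illustrative assumptions.
import re
from dataclasses import dataclass


@dataclass
class Flag:
    category: str
    reason: str


def check_indemnity(text: str):
    # Assumed standard position: indemnity capped at no more than 12 months' fees.
    m = re.search(r"(\d+)\s*months", text)
    if m and int(m.group(1)) > 12:
        return f"indemnity cap of {m.group(1)} months exceeds the 12-month standard"
    return None


def check_termination(text: str):
    # Assumed standard position: no unilateral termination for convenience.
    if "terminate for convenience" in text.lower():
        return "unilateral termination for convenience is non-standard"
    return None


# Hypothetical playbook: each category pairs a heading pattern with a check
# that returns a reason string when the clause deviates from the standard.
PLAYBOOK = {
    "Indemnity": (re.compile(r"^indemnity", re.I), check_indemnity),
    "Termination": (re.compile(r"^termination", re.I), check_termination),
}


def review(contract: str) -> list[Flag]:
    """Split a contract into headed clauses and flag playbook deviations.

    Naive clause split: blank-line-separated blocks, first line is the heading.
    The flags are advisory; a lawyer makes the actual call on each one.
    """
    flags = []
    for block in contract.strip().split("\n\n"):
        heading, _, body = block.partition("\n")
        for category, (pattern, check) in PLAYBOOK.items():
            if pattern.match(heading.strip()):
                reason = check(body)
                if reason:
                    flags.append(Flag(category, reason))
    return flags


if __name__ == "__main__":
    contract = (
        "Indemnity\n"
        "The supplier's liability is capped at 24 months of fees.\n"
        "\n"
        "Termination\n"
        "Either party may terminate for convenience on 30 days' notice.\n"
    )
    for flag in review(contract):
        print(f"[{flag.category}] {flag.reason}")
```

The key design point is that the output is a list of flags with reasons, not a verdict: nothing in the sketch decides whether a clause is acceptable, which mirrors the human-sign-off boundary described above.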

On sampled output across the first two hundred contracts the firm ran through the system, clause detection accuracy came in at around ninety-nine per cent. The misses were almost always clauses with unusual headings or ones buried in schedules, and the lawyers caught them on read-through. We also ran a careful audit on the first month of live use to make sure nothing the tool flagged was materially misleading, and the partners signed off on the results.

The result

Average review time dropped from over three hours to around twelve minutes per contract. That figure is for a standard commercial agreement of the kind the firm sees most often, and it covers the full workflow from upload to final sign-off by the reviewing lawyer. More complex matters still take longer, as they should.

What we found more interesting was how the team chose to spend the recovered hours. We expected them to put more contracts through the pipeline, and some of that did happen. But a lot of the time has gone into work the lawyers said they had been neglecting. More direct client conversations. Proper handover notes on complicated matters. Training sessions for the junior associates who had been thrown onto contract review because someone had to do it. One senior lawyer told us she now spends the first hour of her day reading case updates instead of hunting through PDFs for a termination clause she was sure she had seen somewhere. That struck us as the right kind of outcome. The tool did not replace anyone's judgement; it gave them back the time to use it.

Sound familiar?

If your team is losing hours to work that should take minutes, a 45-minute conversation is all it takes to find out what is possible.

Book a discovery call