December 22, 2022

A Lawyer’s Guide to Contextual AI: No ‘Garbage in, Garbage Out’

Steven P. Keely

Professionals are sometimes reluctant to adopt new technology, lawyers included. New technology often fails, but sometimes it makes a big impact. Software-based artificial intelligence (AI) is one such technology.

One kind of AI, generative AI, takes the first-drafter role off your plate. Done right, it lets you sit back and edit thoughtfully. In dealmaking, for example, it is already being used for contract clauses and provisions. It can also be used for briefs and other documents in litigation contexts.

Most lawyers have heard about other forms of AI-based legal tech, including in the contract context. For many, the main issue has been ‘garbage in, garbage out’ (a classic theme in computing): bad inputs produce bad outputs.

Lawyers point out, rightly, that good transactional and litigation work products are specific to context. Factors include jurisdiction-specific laws as well as social, economic, and other legal considerations. Some AI tools certainly put out garbage, and for some of them it is because garbage was put in.

• What is the real benefit of generative AI for a professional, for a lawyer? 

• How can a lawyer find an AI that is context specific? 

This article answers both questions.

Read on.

Reliance is the right standard for lawyers, but not for legal tech AI

When lawyers talk about generative AI in legal tech, they tend to hold it to the same standard as they would other lawyers: reliance. It is a lawyerly virtue to be complete in one’s work, making sure every key detail is covered. Lawyers put in long hours for a reason.

But this virtue is also a source of confusion when it comes to AI as an augment to legal practice. What confuses lawyers most about AI is the difference between comprehensiveness and improvement at the margin.

The lawyerly emphasis on getting everything right makes sense because clients rely on lawyers. Reliance involves trust, and it involves dependence. In a relationship like that, all a client has to go on is a lawyer’s presentation and reputation when deciding whom to hire. So a lawyer really does have to be as complete as possible, owning the whole situation and getting everything right.

For example, contracts are mostly about managing downside. You don’t want a gaping hole. You have to anticipate the worst cases that could materialize. What if you forgot a limitation of liability clause? This is something that generative AI for contract review might suggest, if the type of deal appears to call for it.

But it makes no sense to hold AI to the same standard. It is an augment, and a cognitive one. This is just as true when dealing with contracts. AI is not a lawyer and it never will be. A lawyer worth the name is a creative problem-solver.

Lawyers don’t need to rely on any AI, and they shouldn’t. Generative AI is a tool, but it is a tool with conditional benefits.

It’s good to remove one risk for a client, even when others remain

No potential deal or dispute ever poses just one risk to a client. In practice, a client and the lawyer working on their behalf always face a set of risks. And that is before counting the risks neither the client nor the lawyer knows about.

But you wouldn’t know it by hearing lawyers object to generative AI. They point out all the errors in AI-generated content. Two points here:

1. An AI, as I’ll show, isn’t by nature susceptible to the criticisms usually levelled at it. An AI speaks the language of deep correlation, not causation.

2. Lawyers make tons of errors all the time, but they can course-correct with improved causal knowledge. We’ll return to this topic.

Circling back to risk management: if you face a set of risks, each one is worth removing. Requiring that every problem be solved before any progress is acknowledged makes no sense. If you didn’t limit liability, that doesn’t mean your choice of law provision wasn’t worth it. You want the best set of laws for your client and you want to limit their liability. Both of these problems are worth solving, independently and together.

It’s your job as a lawyer to address each and every risk. No AI can find them all. Why? Because it’s a machine. A fancy machine, to be sure, but still just a robot. An AI only holds in itself patterns from the past; it cannot completely grasp the uniqueness of the present. Many deals really are snowflakes to some extent.

But how do you know when to rely on an AI (a good one) and when to rely on yourself? More on that later, but the short answer is that you don’t need to worry about it.

Circling back to contract law, only you can make a contract complete and handle all the relevant risks. But you can only do that when you have the cognitive freedom for it. AI tailored to drafting and review helps you find issues in your contracts faster, maybe ones you wouldn’t have seen, and suggests alternatives. In general, I believe, lawyers have misunderstood what generative AI can do for them.

With this context, we can move on to when and why an AI does not suffer from the main objection lawyers have raised against its use to date.

Use contextual AI as a lawyer: no ‘garbage in, garbage out’

Most lawyers who object to using AI to do legal work are essentially talking about ‘garbage in, garbage out.’ This is a classic criticism of any automated operation: if the input is low quality, the output will be low quality.

Fair enough. 

But let’s focus on the latest wave of innovation in generative AI: GPT-3. GPT-3 is a large language model from OpenAI, a research and development firm. A “large” language model essentially blows up the size of the training sample to get better results: GPT-3 was trained on hundreds of billions of words of text gathered from across the internet. Many, many thousands of those documents were contracts.
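To make those inputs and outputs concrete, here is a minimal sketch of what prompting GPT-3 looks like through OpenAI’s Python library. The prompt text and settings are illustrative assumptions only; this is a sketch of prompting GPT-3 in general, not a description of how Spellbook or any other product works under the hood.

# Minimal sketch: asking GPT-3 for a first draft of a contract clause.
# The prompt, model choice, and settings are illustrative only.
import openai

openai.api_key = "YOUR_API_KEY"  # replace with your own key

prompt = (
    "Draft a limitation of liability clause for a software services "
    "agreement governed by Delaware law. Cap liability at fees paid "
    "in the prior 12 months and exclude indirect damages."
)

response = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3 model offered by OpenAI
    prompt=prompt,
    max_tokens=300,            # room for a full clause
    temperature=0.2,           # lower temperature, more conservative drafting
)

print(response.choices[0].text.strip())  # the generated first draft

The output is a first draft and nothing more; it still has to be reviewed against the actual deal.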

With that in mind, let’s talk about what the actual output is. In the case of generative AI, you get ideas written up in front of you, ready for review.

A cup half full is better than an empty one

You might be thinking: so what? If I get a bad draft from a first-year associate that requires a complete rewrite, that’s no better than if I went out, grabbed some previous contracts, and customized them myself.

Makes sense.

The thing is, you aren’t going to get a complete rewrite. You’re going to get a partial one.

Generative AI has plenty of functions, and first drafter is just one of them. Robots will never be amazing first drafters of documents for complex deals or disputes. Why?

Good contract language combines the wisdom of past practice and provisions with the unique needs of the present transaction. Producing it, for any particular deal, is a creative process.

Robots will never replace people, and artificial intelligence will never replace lawyers as a profession. It may replace some lawyers, but it cannot assume the role of strategy, tactics, and empathy that a good lawyer offers clients. Generative AI, though, can give you the wisdom of past practice and provisions.

Partial rewrites, while better than complete rewrites, are not all created equal. The difference isn’t always purely the number of changes, because some rewrite action items just take more time and energy. They carry heavier cognitive loads.

In terms of your time and energy, part of the way done is better than none. Generative AI helps you run the last mile, the hardest one.

You already use automation in your law practice

You might be skeptical still, at least when it comes to contracts. 

How, exactly, can a robot be a decent drafter of a contract provision?

No analogy can prove a point, but it can gesture to an underlying principle. The analogy is that you already use automation in contracts.

• Your law firm or company uses the same contract template for similar customers.

• Your own brain recognizes patterns, applies the existing rules, customs, and techniques of legal analysis, and takes action accordingly. It is more common for you to act automatically than to engage in cognitive labor.

But, you might object, our law firm’s contracts with clients, and our clients’ contracts with their own customers, are assigned to target customer groups. There is a particular context there.

True.

Taking up my second example, you might object that the human brain is not only a pattern-recognition machine but a causal one too. Indeed, circling back to the first example, human brains came up with the target customer groups and decided which contract templates to assign to them.

This is an important distinction. Let’s dig into it.

Staying within the four corners or going outside of them for context

Imagine it’s your first day on the job. You’re a law student at a summer internship. You’re reading a stack of past employment agreements your supervising attorney handed you. (She handled these prior matters.)

You’re reading actively, but not really thinking about why one clause or provision would be better than another. 

Day two: You take some initiative. You read about the context for the same employment agreements. Context includes, among other things, the identities of the parties and jurisdiction-specific laws.

You are again not actively thinking about why one contextual element matters or not. You are absorbed in the present experience of reading. 

This may look like a hypothetical, and while I did imagine it, you can observe yourself and other people doing exactly this. Even educated, hard-working, highly paid professionals sometimes do not actively think while reading (or listening…). Often we also do not follow up with active thought about what we read or heard.

Unconscious creativity is a human process

There is a critical difference between a human having these experiences and an AI executing its computation. 

The difference is that you unconsciously produce causal ideas, so even focused reading will cause you to remember certain things and come up with “Why is this…?” type ideas. You don’t have to try to come up with ‘A causes B’ or ‘if I believe A, then I should also believe B’ type thoughts. (If you do try, of course, you have more causal thoughts, but a baseline production is always at play in the human brain.)

They come naturally to you. (That’s why you don’t need to worry about relying on AI.)

They don’t come at all to an AI. 

But AI does everything you did (in your imagination) on both days one and two. It does pattern recognition on the object of study (e.g., employment agreements) and on the context too (e.g., jurisdiction-specific laws).

AI is just better at pattern recognition

Lawyers do love definitions, so let’s do one now.

• “Pattern recognition” in this article means the storage and computation, conscious and unconscious, of two or more (2+) related things.

That was boring and technical, but I mean exactly that: a correlation, a co-incidence, of 2+ things. The related things can be anything, for example that two pairs of contracting parties were wealthy corporations, or romantic partners, and so on. The toy sketch below makes the idea concrete.
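Here is that toy sketch in Python. The contracts and clause labels are invented for illustration; the point is only that counting which clauses co-occur across a set of agreements is pattern recognition in exactly this minimal sense.

# Toy illustration of "pattern recognition" as the co-incidence of 2+ things.
# The contracts and clause labels below are invented for the example.
from collections import Counter
from itertools import combinations

contracts = [
    {"limitation of liability", "choice of law", "indemnification"},
    {"limitation of liability", "choice of law"},
    {"choice of law", "arbitration"},
    {"limitation of liability", "choice of law", "arbitration"},
]

pair_counts = Counter()
for clauses in contracts:
    for pair in combinations(sorted(clauses), 2):
        pair_counts[pair] += 1

# The most frequently co-occurring pair is the strongest "pattern."
for pair, count in pair_counts.most_common(3):
    print(pair, count)

A generative model does something far richer than this tally, but the underlying currency is the same: correlations among things that have appeared together before.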

The difference between your pattern recognition and an AI’s is that, well, the AI’s is better. AIs have more memory storage than a human brain. It’s a harsh truth, because it is why AI can replace some people and some positions entirely. But it’s a freeing truth, because AI can also free up people to be smarter. 

Now you know how an AI improves on pattern recognition. Why should you care?

When you think about solving a problem, you naturally look for the most common elements. Generative AI finds those common elements and brings them to your attention in an easy-to-process way.

AI can’t create causal knowledge, which lawyers need

The problem with some patterns is that they don’t matter. They may be present, but they have no importance in terms of cause and effect. Let’s circle back to our law student hypothetical.

• Is it relevant if the employment agreements have the same font? 

Not to the law, and not to a well-trained AI for contract drafting and review. But who trained the AI? People did.

• Is it relevant if the employment agreements were signed the same day?

Maybe; it depends on the law. A legal rule may favor a contract signed earlier in time or later in time. A well-trained AI will take notice of jurisdiction-specific rules. But in the end, it is the mind of the lawyer handling the matter that decides, and must decide.

In the end, all contract language choices are causal. ‘If I as a lawyer approve this contract language, what will happen for or to my client?’ 

Artificial intelligence cannot answer these questions. Only people can. 

This doesn’t mean you can’t, won’t, or shouldn’t bring to bear the patterns you’re aware of. AI synergizes with active minds. For example, you may have noticed a pattern among recent appellate decisions that the AI has never had labeled and computed. Every AI is blind to some parts of reality that you can see.

Get a new, interesting option with another click

In short, generative AI makes it convenient to get a decent first draft of provisions in front of you. You click a button. If you don’t like the draft you got, you just click again and get a new version. No emails or calls with another person.
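In the same illustrative terms as the earlier sketch, ‘click again’ amounts to resubmitting the same prompt with a higher temperature, or asking for several completions at once. Again, the prompt and settings here are assumptions for illustration, not a description of any particular product.

# Illustrative only: asking for several alternative drafts of the same clause.
import openai

openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Draft a choice of law clause selecting New York law for a supply agreement.",
    max_tokens=150,
    temperature=0.9,  # higher temperature, more varied alternatives
    n=3,              # request three candidate drafts in one call
)

for i, choice in enumerate(response.choices, start=1):
    print(f"--- Option {i} ---")
    print(choice.text.strip())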

Augment your legal practice with generative AI

You cannot and should not rely on generative AI as a professional, as a lawyer. 

You can and should use AI as an augment to your practice. The power of AI is to free up your mind for better things.

AI is here to stay and grow. Be the first, be in the middle of the pack, or be the last; it’s up to you.

Generative AI can never give you an ironclad contract for the deal at hand, but it can give you the metal to forge one yourself. That is the best generative AI can offer.

Augment contract drafting and review

If you’re a lawyer who regularly handles contracts for clients, consider trying Spellbook. It is the first of its kind: a generative AI built on GPT-3 and tailor-made for contract drafting and review. It also has functions that help with analysis, negotiation, and advising clients.

Consider the competitive advantages that legal tech can offer: time to think strategically and less time bogged down in trivialities. Get your spot on the waitlist for Spellbook.

Ready to learn more? Check out the rest of our Getting Started Guide.