AI: a Deceptive Aid Set to Send Shockwaves Through the Lawyer’s World
Péter Lakatos, managing partner at Lakatos, Köves and Partners, outlines lawyers’ concerns in the Brave New World of artificial intelligence.
BBJ: What effects will the application of AI have on lawyers’ work?
Péter Lakatos: From a helicopter view, the situation is that AI will definitely affect how we practice law. The problem is that, in principle, it sounds very good: there is a machine that improves your efficiency and helps you do things that take a long time, like the due diligence process. It will also help you draft contracts, very big, heavy contracts that you otherwise have to check manually, fixing the grammar and details, so it helps a lot.
Drafting complex documents is a time-consuming, highly intellectual process. So the technology helps with the search, and that’s what it’s mainly used for nowadays: it basically extends search technology.
In the traditional way, if you put a word search into the system, the software searches only for that particular word. But in the AI world, if you start to teach the machine, the technology is able to conclude that what you are looking for could also be somewhere else. An IT guy once tried to explain it to me, and what I understood is that this is a very advanced, sophisticated pattern-recognition tool.
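The difference he describes can be sketched in a few lines. This is purely my illustration, not any specific legal-tech product: a literal keyword search versus a "concept" search that also matches related terms the system has been taught to associate, with a hand-written synonym map standing in for learned pattern recognition.

```python
# Toy document set, e.g. clauses a trainee might review in a data room.
DOCS = [
    "The seller shall indemnify the buyer against third-party claims.",
    "Either party may terminate this agreement on 30 days' notice.",
    "The contract may be cancelled if payments are overdue.",
]

# Hand-written "training": terms the system treats as the same concept.
# (A real AI tool would learn such associations rather than list them.)
CONCEPTS = {"terminate": {"terminate", "cancel", "cancelled", "rescind"}}

def keyword_search(docs, word):
    """Return documents containing the literal word only."""
    return [d for d in docs if word in d.lower()]

def concept_search(docs, word):
    """Return documents containing the word or any taught related term."""
    terms = CONCEPTS.get(word, {word})
    return [d for d in docs if any(t in d.lower() for t in terms)]

print(len(keyword_search(DOCS, "terminate")))  # literal match: 1 document
print(len(concept_search(DOCS, "terminate")))  # concept match: 2 documents
```

The concept search also surfaces the clause about the contract being "cancelled," which a literal search for "terminate" would miss; that is the sense in which the technology "extends" search.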
BBJ: So it’s not just a word search; it’s a more meaningful concept search?
PL: Yes, which, in principle, would very much help our work, and does. The problem is that this technology is not usable off the shelf. It can handle only what it has been taught, and it takes a lot of energy and time to teach it, to make it usable for your own purposes.
It involves a combination of processes and, at the end of the day, a lot of work and a lot of investment.
BBJ: It needs training, in fact, like a new employee?
PL: We have to train the system for it to be useful to us, and this is one of the fundamental, conceptual issues we have to deal with. Second, lawyers are… Law is a social science, if you like, and there is a certain thinking process around how you approach social issues and relationships.
Law is just one set of rules for regulating society, but there are many other rules, and you have to combine them properly. It is something that is always in adjustment, always sensitive to your thinking and to the process. The IT world is totally different: in the IT world, it’s either byte or no byte.
BBJ: You mean it’s a kind of black and white?
PL: IT is a technical, science-based philosophy. It’s kind of black and white. The lawyer’s thinking process is totally different from an IT thinking process. And getting the two to work together is a real, real issue, because to train the system you effectively involve more and more IT and IT people, over whom I have no real control as to how they think.
Having worked with a lot of them, I have plenty of experience. But to make this AI assistance happen, you have to allow them to enter your territory, and you cannot control how they do that. This is also true from the other side: only those guys from the IT world who spend time with you understand you. But they are always looking for byte-or-no-byte solutions, and this is conceptually a conflict between how these two mindsets can work together.
BBJ: What is the upshot of this contradiction?
PL: Because we are a law firm, we are running the legal risk; we have to do the intellectual work to advise the client, to solve problems. So far, we have controlled our thinking process, our business and, at the end of the day, our risk, and I sign it off.
But in this new world, that will no longer be true. I will not completely control what we are working on and how. And this is a fundamental conflict. I can see more and more people realizing this.
BBJ: But AI people would see this all as an investment, wouldn’t they? That over the long term, your costs will go down?
PL: Unfortunately, this is also not true, because AI is not just helping us deliver the same service; it reduces the work we can deliver to clients. What we are doing is always an intellectual service. Whether I sell on an hourly basis or for a fixed fee, I’m selling an intellectual work product. Am I really interested in applying AI here, in my law firm, when it will reduce the potential number of hours I can sell to clients and reduce my profits?
BBJ: This sounds like the intellectual equivalent of the textile workers dealing with machine looms in the 18th century?
PL: Absolutely, it’s like an industrial revolution issue. Yeah, we have to increase the business; that’s the only way. In the long run, definitely, the answer is we have to use AI. The long-term answer is clear, but how we get there is going to be very painful, very disruptive.
Will Less ‘Grunt Work’ Mean Less Training, Poorer Skill Sets?
While applying AI to legal searches saves a great deal of time otherwise spent poring over wordy texts, it also causes problems for the training of young lawyers, argues Péter Lakatos.
BBJ: If trainee lawyers are no longer in the data room checking texts, doing the “grunt work,” doesn’t this mean they are losing training experience?
PL: Absolutely: the more time they spend in the data room and the more documents they have reviewed, the more they learn; that’s a learning path. The trainee is forced to go through several hundred documents and needs to look at them with a purpose: it’s not just reading. The trainee learns a lot; he or she sees a lot of things. But this is just finding the information; the second element is to evaluate it.
BBJ: With a senior lawyer?
PL: Before he or she reports to a senior lawyer, he or she needs to take a professional view, and that’s part of the learning curve. When the trainee finds something in a document in the data room, first they will need to evaluate it and take a position.
But now the machine comes in, and the machine says this or that. I can well imagine asking, “But did you consider X?” And the answer will come back, “No, because the machine said Y.” So they are not learning. Lawyering, to a large extent, is the ability to make a judgment on a situation, but with AI they will not be doing that.
It takes two, three, four years until someone gets the experience to read, process, and make a judgment. But in the AI world, how will they be able to get the learning? Big question. I don’t know.
This article was first published in the Budapest Business Journal print issue of June 4, 2021.