Are we entering a new era of AI lawyers?
Writing, analysing, and fact-checking are all key skills for lawyers, and artificial intelligence’s sudden leap in capability cannot be ignored
As ChatGPT makes waves across the legal industry and beyond, Technology Reporter Jane Wakefield talks to law firms and tech experts about its impact on their sector, and what role artificial intelligence will play in their business.
When ChatGPT was released to the public in November 2022, it quickly went viral, amassing 100 million users in just a few months – faster growth than TikTok managed.
Owned by Microsoft-backed firm OpenAI, ChatGPT and its successor GPT-4 are text-based tools that are able to understand questions, research them and offer summaries of everything from college essays to news articles to poetry, often in seconds. Both have been trained on data from the internet.
The legal profession, like all industries around the world, has been forced to sit up and take notice. Writing, analysing, and fact-checking are all key skills for lawyers, and artificial intelligence’s (AI) sudden leap in capability cannot be ignored, raising the question – are we entering a new era of AI lawyers?
Tech up
Baker McKenzie has been tracking machine learning since 2017, so was less surprised than most by the emergence of ChatGPT. A few years ago, it set up a small team, made up of lawyers, data scientists and data engineers, and, together, they have been playing around with such systems.
Ben Allgrove, the firm’s chief innovation officer, thinks that law firms need to ‘tech up’ – in other words, make sure that they have the capacity for technical expertise within their organisations, because AI and its potential to transform business is not going away.
The recruitment drive appears to have begun. In February, legal giant Mishcon de Reya put a job advert on LinkedIn, looking to hire a software engineer to better understand how generative AI could be used in its day-to-day business.
Allgrove compared ChatGPT to a ‘very smart paralegal at this point’. But he also pointed out that ‘these consumer products are not the ones that will actually be used in the general practice of law – they are not fit for purpose for that.’
The democratisation of AI
This is a viewpoint shared by Dr Thomas Wood, lead data scientist at the Solicitors Regulation Authority (SRA). At the regulator's recent Innovate event in Bristol, he explained that, while ChatGPT is fun to play around with, it is basically the ‘pub quiz’ of AI. For him, the business case for AI lies in OpenAI’s application programming interface (API) – an access point that lets firms plug their own software into its GPT models.
Wood told delegates at the conference: ‘This is really, really powerful, because you can essentially harness the power of large language models and integrate it into your own processes and systems. This is the democratisation of AI.’
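As a rough sketch of the kind of integration Wood describes – with the caveat that the model name, prompt and parameters below are illustrative assumptions, not any firm’s actual system – calling the OpenAI API from an in-house workflow might look something like this:

```python
# Minimal sketch: wrapping a large language model behind a firm's own
# function so it can be called from internal document workflows.
# Model name, prompt and temperature are illustrative placeholders.
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"


def build_summary_request(document_text: str, api_key: str) -> urllib.request.Request:
    """Build an HTTP request asking the model to summarise a legal document."""
    payload = {
        "model": "gpt-4",  # assumed model; swap for whichever the firm licenses
        "messages": [
            {"role": "system",
             "content": "You are a legal assistant. Summarise documents concisely."},
            {"role": "user", "content": document_text},
        ],
        "temperature": 0.2,  # low temperature for more predictable output
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# The request would then be sent with urllib.request.urlopen(...) and the
# model's reply read from the JSON response – output that still needs to be
# checked by a qualified lawyer before it goes anywhere near a client.
```

The point is not the plumbing but the principle: once the model sits behind an internal function like this, it can be woven into a firm’s own systems rather than used ad hoc through a chat window.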
It is early days, though, for law firms to set up their own tailored AI project – and, meanwhile, they need to think about what staff might be doing with existing AI systems. ‘You can’t put the genie back in the bottle. People can access ChatGPT on their personal devices. It’s built into Bing search, it’s going to be built into Google search. So, at the end of the day, people are going to be using it,’ comments Allgrove.
The policy at Baker McKenzie is that anyone in the firm who wants to use any sort of generative AI system needs to ask for permission, although they are encouraged to experiment: ‘At least we have a record of who has asked and what they have said they want to use it for, and I think that is a better place to be from an oversight and governance point of view.’ Allgrove says that ‘multiple hundreds’ have requested to use such tools so far, and the firm is starting to look at which teams are using them most, to see if there are particular departments where the technology could be more useful.

Dr Thomas Wood, Lead Data Scientist, SRA
Supervision of AI
Start-up Harvey – a legal AI tool built on OpenAI’s GPT models – is used by lots of law firms. It can perform legal tasks, such as streamlining document drafting and research. Its algorithms and machine learning models can also analyse vast amounts of legal data, including statutes, case law and legal literature. But Harvey comes with an important disclaimer stating that it should be supervised by legal professionals and that it ‘hallucinates’, which, put in plain English, means it makes things up.
One of the biggest problems with current systems, whether it be ChatGPT, Google’s equivalent Bard or specifically tailored tools such as Harvey, is that, while they are astonishingly good at some tasks, they are also prone to big errors.
A New York lawyer who used ChatGPT for legal research came unstuck when it emerged that the chatbot was citing legal cases that simply did not exist. He now faces legal action himself.
The case has prompted several senior US judges to require lawyers to disclose whether AI was used for court filings – although, as Allgrove points out, it is not really the fault of the technology. ‘If somebody relies on ChatGPT to produce a citation and doesn’t check the actual citation, that’s a bad lawyer,’ he said.
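Allgrove’s point – that the lawyer, not the tool, bears responsibility for verification – can be made concrete with a toy sketch. The case names and function below are invented for illustration; a real workflow would check against an authoritative citation database rather than a hard-coded list:

```python
# Illustrative only: a trivial guard that flags citations produced by a
# model which cannot be matched to a verified source. The case names here
# are invented placeholders, not real authorities.
VERIFIED_CASES = {
    "Smith v Jones [2001] EWCA Civ 123",
    "R v Example [1999] UKHL 45",
}


def flag_unverified(citations):
    """Return the citations that do not appear in the verified set."""
    return [c for c in citations if c not in VERIFIED_CASES]
```

Even a crude check like this would have caught the New York filing: any citation the database has never heard of goes back to a human before it goes anywhere near a court.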
Dr Wood agrees that the current inaccuracies of AI systems mean humans need to be on their guard. He told the audience at Innovate: ‘It is well documented that people often switch off their mental faculties when a computer tells them to do something, even if they realise that the computer is telling them to do something really stupid.
‘And I think, from a regulatory point of view, this is our biggest challenge. As AI comes in, people need to engage their mental faculties.’
Allen & Overy began using Harvey in February, after testing it since November. Despite concerns that the legal profession is one of the most vulnerable when it comes to taking jobs from humans, the firm told the Financial Times (FT) that it did not expect it to replace any of its workforce, saying instead that it was just a ‘smart way of working, saving time on all levels’. According to the FT, out of the 3,500 lawyers at the firm who have access to the tool, around a third are now using it on a daily basis.
The obvious benefit of tools such as Harvey is to save time, which in turn raises questions about billable hours, the traditional way law firms charge clients for work.
Allgrove thinks payments generally may face a shake-up, but using AI might actually make legal services more expensive in future. ‘If it’s better quality, if it’s faster, if it’s accessible 24/7 and never makes a mistake, those things are valuable. Reasonably, there should be a price attached to them.’

Ignoring AI is clearly not an option, but it is worth remembering its limitations
AI legal representation
For Alex Monaco, founder of tech-powered employment law firm Monaco Solicitors, AI could revolutionise legal services, bringing them to those who could not previously afford them. ‘With AI, suddenly people can represent themselves. They don’t need lawyers, they can use ChatGPT and ask it to write them a legal letter. It won’t be as good as a lawyer, but it will be much better than most people could do.’
Monaco Solicitors was helped on its journey to tech know-how partly by a government grant, but also by working with universities – something that panellists at Innovate 2023 agreed was a great way to start getting to grips with new technology.
Sally Holdway, director of Teal Legal, explained how her firm has worked with Keele University on tech programmes, including the gamification of conveyancing – applying game-design techniques to an area of law that can be seen as a bit dry. ‘It meant that, when we developed our app, we had some new features that we wouldn’t have had. It’s about looking at things slightly differently,’ she told the conference.
Of course, there is also much to be cautious about when it comes to AI – and the legal profession is more cautious than most. It resisted using email until information-handling rules were put in place – and client confidentiality remains crucial in the new era of AI.
Data protection is one of the reasons cited by law firm Mishcon de Reya for hitting pause on staff using ChatGPT. Making sure that intellectual property and client data is ring-fenced and protected will be one of the key questions any firm embracing the tech needs to ask before it uses such tools.
Ahead of the Innovate event, the SRA asked lawyers how they would be using AI to deliver services. More than half said it would be a significant factor in how they delivered services in future, suggesting most law firms are not complacent about the role the tech will play.
Most currently seem to be taking a sensible approach to AI, playing around with the tools to work out how they can add value to what they do. Ignoring AI is clearly not an option, but it is worth remembering its limitations: it is far from ready to don legal robes and probably won’t be for many years to come.