Are you using AI yet? How lawyers can use it to their advantage

By Jane Wang and Sharyn Ch’ang

September 2023

All lawyers, from in-house counsel and law firm practitioners to barristers and judges, need to embrace the use of generative artificial intelligence (AI). Why? Because it will become an indispensable productivity tool across the legal profession. Goldman Sachs estimates that 44% of current legal work tasks could be automated by AI, compared with an average of 25% across all industries. This doesn’t mean that generative AI heralds the demise of the profession; rather, those who do not use AI will be displaced by those who do. And those who do use it may find themselves doing legal work in ways not done today, which brings transformative opportunity. So how can generative AI be used today, and what are the risks and opportunities?

Have you heard of Harvey? 

Harvey is a game-changing generative AI platform built on OpenAI’s GPT-4, designed specifically for lawyers. It uses natural language processing, machine learning and data analytics to automate and enhance various aspects of legal work. PwC is the only one of the Big 4 licensed to use Harvey, and more than 15,000 law firms have joined the Harvey waiting list. We have seen its extraordinary power, and you’d be hard-pressed to find a lawyer who has seen it in action and not been impressed.

Like ChatGPT, users simply type instructions about the task they wish to accomplish, and Harvey generates a text-based result. However, unlike ChatGPT, Harvey includes multiple tools specifically for lawyers, where users can:

  • Ask free-form questions that are legal or legal-adjacent, including research, summarisation, clause editing and strategic ideation;

  • Ask for a detailed legal research memorandum on any aspect of law;

  • Request a detailed outline of a document to be drafted, including suggested content for each section; 

  • Ask complex, free-form questions about, or request summaries of, any uploaded document without any pre-training or labelling required.

And yes, Harvey can generate this output in seconds. In multiple languages. 

Generative AI, like Harvey, can turbocharge the drafting of written material from scratch, and make edits and recommendations for replacement text. Its other superpowers enable it to analyse, extract, review and summarise faster and at scale, beyond human capabilities.
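
To make this interaction pattern concrete, below is a minimal sketch of a routine legal task being sent to GPT-4 (the model Harvey is built on) through OpenAI’s general-purpose Python library. It is not Harvey’s own interface, and the prompt, file name and API key are illustrative placeholders only.

```python
# Illustrative only: a generic GPT-4 call for a routine legal task, using the
# OpenAI Python library. Harvey's actual interface is different and not shown
# here; the file name and API key below are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

clause_text = open("termination_clause.txt").read()  # hypothetical input document

response = openai.ChatCompletion.create(
    model="gpt-4",
    temperature=0.2,  # keep the output conservative for legal drafting
    messages=[
        {"role": "system",
         "content": "You are a legal drafting assistant. Answer only from the text provided."},
        {"role": "user",
         "content": "Summarise the termination rights in the clause below, "
                    "citing clause numbers:\n\n" + clause_text},
    ],
)

print(response.choices[0].message.content)
```

The same pattern (an instruction plus source text in, generated text out) underlies the summarisation, clause-editing and research features described above.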

So, will I lose my job?

No. There are many skills and tasks that generative AI cannot perform and that only humans can.

While AI will automate routine tasks and assist with research, analysis, drafting and similar work, nuanced and complex aspects of legal practice require human expertise, empathy and judgement. This includes client interaction, relationship building, negotiation, strategy development and courtroom advocacy.

Lawyer’s skills vs Artificial Intelligence
(X = a skill that generative AI can also perform)

Lawyer’s skills                     | AI
------------------------------------|----
Analysing                           | X
Attention to detail                 |
Commercial acumen                   |
Communication – oral and listening  |
Communication – written             | X
Contextualising information         |
Decision making                     |
Drafting                            | X
Emotional intelligence              |
Experience                          |
Extracting information              | X
Independence                        |
Initiative                          |
Judgement                           |
Negotiating                         |
Problem solving                     |
Reasoning                           |
Relationship building               |
Research                            | X
Resilience                          | X
Reviewing                           | X
Summarising                         | X
Tasks done at scale                 | X
Tasks done concurrently             |
Tasks done to deadlines             | X
Teamwork                            |
Time management                     |
Think abstractly                    |
Think critically                    |
Think logically                     |

Generative AI like Harvey is not intended to replace but to augment the work of lawyers. While it can seem that generative AI models know a lot, given the scale of the data large language models (LLMs) are trained on and the nature of the fine-tuning they receive, the reality is that they are not truly “intelligent”. A model is only processing patterns to produce coherent and contextually relevant text. It is not thinking or reasoning.

While some junior lawyers, paralegals and legal assistants may be more concerned than most, they needn’t be. With generative AI freeing up time in their day, they can focus on higher-value, more engaging work, develop specialised expertise and participate in the strategic aspects of legal practice – all earlier in their career than usual. And isn’t there a compelling argument that if junior lawyers are doing less repetitive, boring work in more rewarding roles, they will be happier and more satisfied? The same is true for more experienced lawyers who are ready to adopt new ways of working.

Google’s CEO Sundar Pichai has an interesting prediction. In addition to AI making the legal profession better, he's "willing to almost bet" there will be more lawyers a decade from now, because “the underlying reasons why law exists and legal systems exist aren't going to go away, because those are humanity's problems."

What are the risks?

Generative AI as a tool for lawyers is still in its infancy. That doesn’t mean it shouldn’t be used, only that it should be used carefully.

Many will be familiar with the recent lesson learnt by US lawyer Steven Schwartz of Levidow, Levidow & Oberman, who naively used ChatGPT to cite six previous judicial decisions in support of his case – all non-existent. But the fault lies with the lawyer, not the tool: he failed to check the content of the motion before filing it. Lawyers must exercise judgement and always validate the content, date and data sources of AI-generated outputs to ensure accuracy, appropriateness and reliability for the legal context.

The unreliability of AI-generated content is just one of the risks. There are more: amplification of bias, if the training data behind the model reflects historical biases; cybersecurity risks, as with any networked computer system; data privacy and confidentiality concerns; and risks over intellectual property ownership of AI-generated content and liability for its outputs. Despite these, the potential benefits of generative AI outweigh the shortcomings, which major AI market makers like OpenAI are consciously working to improve.

What new AI skills and knowledge do I need?

In our recent survey, more than 80% of Chief Legal Officers and General Counsel in Asia Pacific rated their knowledge of legal AI as below average. Thankfully, CEOs in the region have seen the light, with investment in upskilling workers in AI (73%) as their number one priority. Some of the new skills and knowledge required are:

  1. Identification and assessment of what generative AI to use and for what purposes. Determine the specific use cases that will help you, and the process and support needed to select the relevant generative AI tool or platform. 
  2. Familiarity with AI concepts and models. A foundational understanding of AI concepts and models is needed. This includes machine learning, deep learning, neural networks, natural language processing and LLMs, and appreciation of the risks, limitations and benefits of AI. Given the rapid advances in AI innovation, this is needed on an ongoing basis.
  3. Prompt engineering for legal work. Knowledge of how to formulate precise and contextually appropriate legal queries or instructions to AI models, minimising unnecessary jargon and avoiding ambiguity, to obtain accurate and relevant outputs (a simple illustration follows this list). 
  4. Data literacy and ethics. Strong data literacy skills are essential to assess the quality and reliability of training data used by AI models. Lawyers must also navigate the ethical considerations associated with AI adoption, such as data privacy, bias mitigation and transparency, and together with the broader business, implement Responsible AI – a methodology designed to enable AI’s trusted, ethical use.
  5. Awareness of new regulations. Keep up to date with new laws and regulations. China recently became one of the first jurisdictions in the world to regulate the use of generative AI, with measures that took effect on 15 August 2023. The European Union’s draft Artificial Intelligence Act may also have an impact in our region.
  6. Awareness of court changes. Keep abreast of industry news. Some courts are introducing rules about the use of generative AI – to minimise potential breaches of confidential information in the input data and prompts. They are also seeking greater transparency of the content and data sources relied upon in lawyer-submitted court documents and judge-made judicial decisions.
  7. Critical evaluation of AI outputs. There will be no substitute for a lawyer’s final review of AI-generated legal documents upon which any client or third party may rely, and lawyers must possess the ability to critically evaluate and validate AI outputs to ensure their accuracy, relevance and compliance.
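
As a simple illustration of point 3 above, the sketch below shows one way to structure a legal prompt so that the role, jurisdiction, task, source material and output constraints are all made explicit. The template, file name and wording are hypothetical examples rather than a prescribed format.

```python
# Hypothetical prompt template showing the elements of a well-engineered legal
# prompt: role, jurisdiction, task, source material and output constraints.
LEGAL_PROMPT = """You are acting as a commercial contracts lawyer qualified in {jurisdiction}.

Task: {task}

Source material:
{document_text}

Output requirements:
- Answer in plain English and cite the relevant clause numbers.
- If the source material is silent or ambiguous on a point, say so rather than guessing.
- Rely only on the source material provided, not on outside authorities.
"""

prompt = LEGAL_PROMPT.format(
    jurisdiction="Hong Kong SAR",
    task="Identify each party's termination rights and any notice periods.",
    document_text=open("services_agreement.txt").read(),  # hypothetical document
)
```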

Maybe I’ll leave it for now

Familiarisation and adaptation to generative AI will take time, so for a profession notoriously slow to adopt any innovation, there’s sound justification for all lawyers to welcome it now, not later. 

AI tools can augment human capabilities, drive efficiency and productivity, and deliver greater value to clients. This collaboration between lawyers and AI systems is key to providing superior legal services. Once regulation and ethical guidelines are in place, generative AI will revolutionise aspects of legal practice and shape the future of a more modern legal profession. Wouldn’t you rather be on the front foot?

Get in touch

Jane Wang

Partner, NewLaw, PwC China

Sharyn Ch'ang

Director, Asia Pacific NewLaw, PwC Hong Kong
