MANCHESTER, England--(BUSINESS WIRE)--As AI continues to evolve at an alarming rate, threatening to change the world of work as we know it, another development has made headlines. Luminance, a UK company specialising in AI for the legal profession, is set to launch a new tool that fully automates contract negotiation.
It aims to slash the time legal professionals spend poring over tedious, lengthy contracts. In demonstrations and tests, the tool has capably handled contract negotiations in a matter of minutes, even when two AIs were pitted against each other over a contested contract.
This raises the question: can AI replace legal ‘grunt work’?
Many industries are already benefitting from using AI to automate menial tasks. And now it seems the legal sector could be following suit.
But the question remains: can this sort of tech be relied on completely? Alastair Brown, Chief Technology Officer at BrightHR, weighs in on the subject:
“When implemented correctly, AI can help businesses streamline operations, reduce costs, and improve customer experience. So, the decision to embrace AI in the world of work is generally a positive one.
“For instance, our AI-powered question-and-answer tool has given businesses the equivalent of over one million minutes of free advice in the last 12 months alone. That equates to around four million saved in solicitors’ fees. It’s a brilliant tool for securing accurate, qualified answers to the day-to-day questions business owners face, and quickly, which is vital in today’s fast-paced world that shows no signs of slowing.
“Our decision to roll out AI largely comes down to the fact that it frees our advisors to spend more time on the emotive areas of the law whose nuances AI has not yet mastered. High-risk topics and situations fraught with emotion require human experience to explore vulnerabilities, weigh interpretations, and understand circumstances in ways AI cannot yet bring together. Speaking with an experienced advisor helps settle nerves and reassures clients that we’re acting in their best interests and that our actions are empathetic rather than solely data-driven.
“We recognise our clients often want to speak to a real person who understands all of these things. And that’s a notion mirrored by the findings of a recent study by The Conversation, in which respondents showed the most support for a 75% human to 25% AI decision-making collaboration, or a 50-50 split. That suggests there is absolutely a need for AI to play a part in the modern-day world of work, but one that must still be supplemented by human input.
“One recent case that highlights this is the New York lawyer with 30 years of experience who, earlier this year, used AI to perform legal research for a lawsuit against an airline. It was revealed that six of the cases cited to convince the judge to proceed with the case had been completely fabricated by ChatGPT. And just last month, a rapper from the hip hop group Fugees accused his lawyer of tanking his case by using AI, after several references made in the closing statement turned out to be incorrect.
“Though Luminance’s tools have been trained on 150 million legal documents, and our own BrightLightning on over 50 years of employment law experience rather than public internet content like ChatGPT, these cases still serve as a stark reminder of the complications that can arise.
“This brings accountability into consideration. When humans make a mistake, accountability is generally taken by that individual and perhaps their employer. But when we rely on machines for information, the question is: where does that accountability lie? This alone highlights the need for AI and human input to work in tandem, not least to provide a layer of accountability.
“And let’s not forget the issues surrounding bias. All humans carry some form of implicit bias, but we’re generally becoming more aware of it and of the need to challenge it. AI, on the other hand, was found to display stronger racial and gender bias than people in a recent report by Bloomberg. Why? Because AI is trained on data that is often incomplete or biased. Whilst AI can dole out advice and information, it lacks the room for consideration and interpretation that humans bring.
“Using AI-powered tools to obtain advice can be a real lifesaver when it comes to reducing stress and saving time. However, on legal matters it’s always best to consult a qualified professional to ensure accuracy, avoid any potential legal issues, and secure the best outcome for your business.
“So, will AI development mean legal roles are replaced by robots? Not likely. It’s more a case of roles evolving, as they always do with time. Just as machines have changed the face of manufacturing, human roles are now needed to manage them.”