Expert Comment: New Judicial Guidance on the use of AI
Authored jointly by Craig Smith, Lecturer in Law at Salford Business School, and Graham Thomson from Irwin Mitchell, this piece explains the new judicial guidance on the use of artificial intelligence and the caution that must still be exercised.
The UK Judiciary recently published official guidance on Artificial Intelligence (AI), demonstrating the significance of AI not only in society and the legal sector but within the court and tribunal system itself. For many, this guidance reads as an ‘Acceptable Use Policy’ for generative AI (genAI) in court proceedings. It offers a level of clarity in an area that continues to evolve rapidly: while judicial office holders should be aware of the potential risks associated with AI, the document provides a clear position from the judiciary, together with the critical observation that some AI tools are inherently open source and publicly accessible.
The guidance approves the use of genAI tools, both publicly accessible and subscription-based, subject to clearly defined rules around confidentiality, cybersecurity, and usage limitations. These rules are pivotal in ensuring the responsible and ethical deployment of AI in court proceedings. Where employed, genAI tools have the potential to significantly enhance the speed and efficiency of text-based tasks, such as document summarisation and content generation. Their capabilities range from document analysis to language processing, offering valuable support to legal professionals in their workflows. The introduction of guidance is also timely, coming just over a year after the launch of OpenAI’s ChatGPT and shortly after the launch of Google Gemini.
Nevertheless, amidst the potential benefits, it is essential to prioritise cybersecurity, safeguard privacy, and uphold compliance standards. After all, cybersecurity is a critical aspect of responsible business operation and builds trust with stakeholders. The judicial guidance seeks to achieve all of this and specifically emphasises the need for caution in handling sensitive and confidential information, prohibiting such information from being entered into public genAI technologies.
In line with these principles, we strongly advocate the adoption of private and secure genAI tools across all business applications. This not only aligns with best practice in data protection but also reinforces a commitment to maintaining the confidentiality of legal information. Interestingly, the judiciary’s guidance document hints at the prospect of a private genAI tool tailored specifically for court applications, suggesting a potential evolution in the landscape of legal technology. These developments point to intriguing possibilities at the intersection of AI and the legal sector. Indeed, these are interesting times: as the legal community explores innovative ways to leverage AI tools, and with many law firms now investing in genAI, a future is emerging in which customised AI solutions become integral to the intricacies of both legal work and court proceedings.
This poses the question: do you have an Acceptable Use Policy for generative AI in your business? Microsoft offers a starting point, with resources on the responsible use of AI that define six principles for AI development and draw on expert policy work. Microsoft’s Copilot, previously known as Bing Chat Enterprise, is a commendable option for any business: it comes free with a Microsoft 365 E3/E5 subscription, or as a standalone product for a modest monthly fee. Alternatives include OpenAI Enterprise and Google Vertex AI, both renowned for their robust security measures. This new technology is demonstrably useful and is certainly here to stay, yet there is no need to compromise on cybersecurity. Secure and private genAI tools are straightforward to buy and set up, and are worth investigating for any business, and, it seems, even for the judiciary.
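To make the idea of a private deployment concrete, the following is a minimal illustrative sketch, in Python, of how a document summarisation task might be sent to an enterprise genAI endpoint rather than a public chatbot. It assumes Google’s Vertex AI Python SDK purely as an example; the project ID, region and model name are placeholders, and any organisation would need to apply its own Acceptable Use Policy and confidentiality checks before submitting any text.

import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region: prompts are processed within the organisation's
# own cloud project rather than a public, consumer-facing tool.
vertexai.init(project="your-gcp-project", location="europe-west2")

model = GenerativeModel("gemini-1.0-pro")  # example model name

# Only non-confidential material should be submitted; the judicial guidance
# prohibits entering sensitive information into public genAI tools.
document_text = "..."

response = model.generate_content(
    "Summarise the following document in three bullet points:\n\n" + document_text
)
print(response.text)

A set-up of this kind keeps the convenience of genAI summarisation while routing prompts through an enterprise service governed by the organisation’s own security and data-protection controls.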
The full judicial guidance is outlined in the official document.
For all press office enquiries please email communications@salford.ac.uk.