As artificial intelligence continues to have a profound impact on industries and daily life, Australian judiciaries are grappling with the impacts of AI in courtrooms.
Since the recent rise of AI programs such as ChatGPT, Microsoft Copilot and Google Gemini, lawyers from all over the globe have been caught – and often penalised for – misusing generative AI tools in court.
In response, jurisdictions around Australia have released guidelines for lawyers and self-represented litigants regarding the use of AI in litigation, with varying approaches to its regulation.
Chief Justice of the South Australian Supreme Court Chris Kourakis has called for practitioners, professional bodies, employers and any interested individuals to give their input regarding AI in the legal profession through a survey and consultation process.
The consultation marks a step towards the Chief Justice responding to the current landscape, helping South Australian litigators harness AI’s benefits while restricting its use where necessary.
Other jurisdictions have taken varying approaches to regulating the use of AI in the courts.
New South Wales has taken a less accepting approach, prohibiting the use of AI to draft affidavits, witness statements and expert reports. On the other hand, Victoria has taken a more accommodating approach, asking litigants to exercise caution and be mindful of AI’s shortcomings without disallowing its use.
“AI has a fairly wide range of potential uses in the courtroom”, said Dr Mark Giancaspro, Senior Lecturer at the Adelaide Law School and Special Counsel at South Australian commercial law firm DW Fox Tucker.
Dr Giancaspro told InDaily that AI has potential for “improving efficiencies within processes such as Court transcription, witness dialogue and case management”.
Additionally, Dr Giancaspro said there are opportunities for AI as part of case preparation that could save costs for litigants.
“For example, algorithms are far more capable than humans of scanning masses of information more quickly,” Dr Giancaspro said.
“But, like any sort of powerful tool, if you use it irresponsibly or without oversight, it will do more harm than good.”
Common AI-induced issues include unrepresented parties and lawyers alike relying on non-existent or “hallucinated” cases and quotes, delaying court processes as judges attempt to verify sources.
AI has been used increasingly by self-represented litigants in Australia in recent years. While widespread adoption of AI has increased access to justice and the democratisation of legal services, Dr Giancaspro said it “works both ways.”
Dr Giancaspro said that while the uptake in AI programs allows for more people to access forms of legal services, unrepresented parties should be aware of the risks of using AI in a courtroom setting.
In April, a self-represented litigant in an employment dispute before a New York State appeals court attempted to present his arguments through an artificially generated video of a lawyer. The judge shut down the video, not having been informed that the man’s counsel would be artificially generated. Footage of the exchange has since garnered attention online.
Dr Giancaspro said that virtual self-representation is an interesting theory, “but it goes without saying we should not be trusting robots to represent litigants. It could lead to catastrophic errors”.
“For a self-represented litigant who is on trial for murder and faces life in prison, delegating authority to an AI is just insane, because if it gets it wrong, they’ve lost all liberty,” he said.
“If you have a multi-billion-dollar corporation with access to the most advanced legal AIs, versus your average Joe or Jane on the street who’s being sued and who can only access the free version of ChatGPT, the disparity in quality would be obvious in terms of what the AI produces. It’s an incomplete democratisation.
“Yes, it’s useful, it’s exciting, it’s important, but end of the day, it’s not human. It’s designed to help, not to replace. That’s probably the number one rationale for regulating AI.”
The Chief Justice is calling for submissions addressing whether, and if so how, the State Courts might respond to the use of AI in litigation. Any person wishing to make a submission can do so by advising [email protected] by 30 June 2025.