FEATURE ARTICLE -
Advocacy, Issue 102: December 2025
The increasing use of generative artificial intelligence has prompted the Supreme Court of Queensland to issue a Practice Direction requiring practitioners to ensure the accuracy of references to legislation, authorities or other sources in written and oral submissions and to clearly identify the person responsible for them. Non-lawyers must endeavour to ensure the accuracy of references to legislation, authorities or other sources in any document prepared by them and relied upon in Court. The position in New South Wales is different: there, non-lawyers must comply with the same obligations as legal practitioners with respect to their use of generative AI. The recent case of Gribble v Essential Energy [2025] NSWDC 344 (Gribble) illustrates this approach. Gibson DCJ struck out a pleading containing AI-generated material filed by a self-represented litigant and commented on the growing challenge of managing such misuse in litigation.
Practice Direction 5 of 2025 – Accuracy of References in Submissions
Issued by the Chief Justice of Queensland on 24 September 2025, Practice Direction 5 of 2025 requires that the person responsible for submissions to the Court be identified and be personally accountable for their content.
Paragraph 9 requires:
For written submissions, the responsible person must –
(a) verify the accuracy and relevance of any references to legislation, authorities or other sources; and
(b) ensure that the document is expressed in terms which reflect their judgment as to the proper discharge of their professional and ethical obligations.
Paragraph 10 states that by placing their name on a written submission, or by allowing that to occur, the legal practitioner informs the Court that they have performed this obligation.
Paragraph 11 requires:
For oral submissions, the responsible person must –
(a) verify the accuracy and relevance of any references to legislation, authorities or other sources; and
(b) ensure that the oral submissions are expressed in terms which reflect their judgment as to the proper discharge of their professional and ethical obligations.
Paragraph 13 states that the obligations in paragraphs 9 and 11 reflect the professional and ethical obligations imposed under the Barristers’ Conduct Rules (rr 12, 25, 26, 37, 41, 57) or the Solicitors’ Conduct Rules (rr 3, 4, 5, 17, 19, 37).
Paragraph 14 states the consequences of non-compliance:
Legal practitioners who are responsible persons for written or oral submissions which are found to contain reference to non-existent cases, legislation or other material, may be the subject of a referral to the Legal Services Commissioner for investigation and/or be required to show cause why a costs order should not be made against them personally.
Self-represented litigants are addressed separately.
Paragraph 16 requires that:
Self-represented litigants must also endeavour to ensure the accuracy of references to any legislation, authorities or other sources referred to in any document prepared by them and relied upon in Court, and in any oral submissions made by them, for example, by referring to publicly available legal resources such as:
- Australasian Legal Information Institute (austlii.edu.au)
- Queensland Judgments (queenslandjudgments.com.au)
- Queensland Legislation (legislation.qld.gov.au)
- Commonwealth Legislation (legislation.gov.au)
Paragraph 17 states the potential consequence of non-compliance:
Relying on a document which contains reference to non-existent cases, legislation or other material may result in an adjournment of the hearing and potentially an adverse costs order against the party who relied on the document.
Self-represented litigants are referred to The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers (Guidelines) for further assistance.
The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers
The Guidelines are directed to non-lawyers (self-represented litigants, McKenzie friends, lay advocates and employment advocates) and state that users are responsible for ensuring that any material relied upon in Court is accurate. The Guidelines warn that information generated by AI tools may be inaccurate and can include references to fabricated cases, citations, quotations, legislation, articles or legal texts. Users are advised to verify all information with a lawyer (if possible) or by consulting reputable sources such as AustLII, Queensland Judgments and Queensland Legislation.
The Guidelines warn, at the foot of page 3, that:
If you rely on a document which contains non-existent or fake citations, or inaccurate AI-generated information, in court, and this causes delay in hearing the matter, a costs order may be made against you.
Different Standards for Legal Practitioners and Non-Lawyers
Under Practice Direction 5 of 2025, legal practitioners must verify the accuracy of each reference cited in their oral and written submissions. By advancing oral submissions to the Court or by placing their name on a written submission, a legal practitioner personally attests to that verification. Non-lawyers, by contrast, are required only to “endeavour to ensure the accuracy” of references cited in any document prepared by them and relied upon in Court.
For legal practitioners, a failure to verify the accuracy of references under Practice Direction 5 of 2025 amounts to a breach of their obligations under the Barristers’ Conduct Rules or the Solicitors’ Conduct Rules and exposes them to referral to the Legal Services Commissioner and/or personal costs orders. Self-represented litigants and other non-lawyers may instead face an adjournment and potentially an adverse costs order.
The position in New South Wales is different. The Supreme Court Practice Note SC Gen 23 – Use of Generative Artificial Intelligence (Gen AI) applies to both legal practitioners and unrepresented parties.
Paragraph 16 requires that:
Where Gen AI has been used in the preparation of written submissions or summaries or skeletons of argument, the author must verify in the body of the submissions, summaries or skeleton, that all citations, legal and academic authority and case law and legislative references:
(a) exist,
(b) are accurate, and
(c) are relevant to the proceedings,
and make similar verification in relation to references to evidence in written submissions or summaries or skeletons of argument (whether the evidence be contained in affidavits or transcript).
Paragraph 18 states:
Any use of AI to prepare written submissions or summaries or skeletons of argument does not qualify or absolve the author(s) of any professional or ethical obligations to the Court or the administration of justice.
This raises a question for consideration: Should unrepresented parties in Queensland also be required to verify that any AI-generated references in their submissions have been checked for accuracy, mirroring the verification required of legal practitioners under Practice Direction 5 of 2025?
The NSWDC’s Response in Gribble
The consequences of AI-generated material in court documents were recently addressed by Gibson DCJ in Gribble. The case concerned a self-represented plaintiff whose pleading and submissions showed clear evidence of the use of generative AI, in the form of references to imaginary cases.
Gibson DCJ observed that the misuse of generative AI had become “an increasingly serious problem for the courts” and noted that, to prevent this misuse, the Supreme Court Practice Note SC Gen 23 required not only legal practitioners but also unrepresented parties to comply with its contents when presenting affidavits, submissions and expert reports to the Court.
Gibson DCJ observed that where an unrepresented party refers to imaginary cases and misconceived legal principles, there is much to be said for a “robust approach”, which may include declining to consider such arguments, for two reasons: first, as a matter of judicious case management, judges should not be required to find the correct authority or to seek to interpret what the litigant is saying; and second, repeating false authorities risks their further replication in other AI sources.
Gibson DCJ also considered that it may be desirable for unrepresented parties to be required to provide a verification in their pleadings confirming that generative AI has not been used.
Paragraphs [33]–[44] of the judgment set out the Court’s reasoning in full:
[33] Many applications come before me in the Defamation List for rulings, and, as a general rule, although I usually provide reasons, I rarely publish them on Caselaw. I have done so in this case, not because the issues of pleading discussed above relate to any issue of significance, but because the pleading shows clear evidence of use of Generative Artificial Intelligence (“Gen AI”). This misuse of Gen AI is becoming an increasingly serious problem for the courts, to which the proliferation of recent judgments referring to its use attests. It is the second time this year that Gen AI has been cited to me by a litigant in person, and it is a practice that must be stopped.
[34] It was to prevent this misuse of Gen AI that, on 18 December 2024, the District Court adopted the Supreme Court Practice Note SC Gen 23 – Use of Generative Artificial Intelligence. There is thus a consistent requirement in New South Wales (see paragraphs 7–25), not only for legal practitioners but also unrepresented parties to comply with its contents when presenting affidavits, submissions and expert reports to the court. Paragraph 16 sets out restrictions in relation to written submissions and summaries of argument, to the effect that the author must verify, in the body of the submissions, that all citations, legal and academic authority and case law and legislative references not only exist but are accurately summarised.
[35] The District Court General Practice Note 2 – Generative AI Practice Note commenced 3 February 2025 and has been operational at all relevant times for this litigation. The plaintiff has in fact stated in the opening paragraphs of his affidavit that Gen AI was not used to generate his affidavit and any annexure or exhibit to his affidavit.
[36] It is not clear how much of the plaintiff’s pleadings and submissions have been prepared using Gen AI. The most telling example is the reference to a wholly imaginary judgment for which the plaintiff claims that I am the author. Not only does the plaintiff provide a completely false case name, but he claims it reflects a principle of law which is equally imaginary. He cites a similarly false case name for a judgment in another court which he claims reflects the same imaginary principle of law.
[37] Following the practice recommended by the Court in Luck v Secretary, Services Australia [2025] FCAFC 26, I have not repeated the name and reference of the wholly imaginary judgment in question. I will identify the parties as being “Trkulja v Yahoo!7 Pty Ltd” but I have redacted the 2013 Caselaw reference. It is a genuine reference, but for a decision of another judge of this court, on an issue wholly unrelated to defamation law. I have also redacted the name of another judgment referred to by the plaintiff in his pleading, in a different court, as it does not appear to be genuine either.
[38] Although a litigant in person, the plaintiff is well aware of his obligations in relation to the use of Gen AI, as he swore an affidavit in these proceedings in which he stated that he had not used Gen AI. When I asked him where he had found this reference to a judgment purportedly authored by me with this case name, and drew his attention to Mr Senior’s submissions as to the non-existence of this judgment and of the principles for which it was asserted to be the law, he told me he would have to look into the issue and get back to me. That is not an answer to his credit. He did however later concede that he had used Gen AI.
[39] Misuse by legal practitioners of Gen AI has been viewed as a serious matter and may lead to disciplinary action (Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95; 386 FLR 365 at [14]–[38]). Misuse by litigants in person has not been the subject of similar concern. In Nikolic v Nationwide News Pty Ltd [2025] VSCA 79, Beach J ignored the submissions in question. A similar approach was taken in Nash v Director of Public Prosecutions (WA) [2023] WASCA 75 (at [9]) and Luck v Secretary, Services Australia at [14]. More recently, in May v Costaras [2025] NSWCA 178, Bell CJ referred to the use of Gen AI by the respondent in the appeal, a litigant in person, although without referring to other Australian appellate courts and how they have dealt with this increasingly difficult problem.
[40] The censure expressed in Valu v Minister for Immigration and Multicultural Affairs (No 2) has much to commend it. The difficulty is that, if practitioners are expected to keep to proper standards, what should be the standards applied to litigants in person?
[41] This is an issue to which greater consideration should be given. Unfortunately, cases of this kind tend to occur in courts of inferior jurisdiction, and the policy of appellate courts is not to refer to the judgments of inferior courts. It might be said that it would be desirable for appellate courts to refer to similar decisions of appellate courts in other jurisdictions, but this is again not a commonly occurring event, where issues of this non-regional kind arise.
[42] What should be the course to take when a litigant in person refers to imaginary cases and misconceived legal principles asserted to be derived from them? There is much to be said for the approach taken by Beach JA in Nikolic v Nationwide News Pty Ltd, namely to take a robust approach which may include rejection of consideration of the arguments, on the basis that it is not in the interests of judicious case management for judges to be obliged to find the correct authority, or to seek to interpret what the litigant in person was saying.
[43] Even a consideration of the false authority and the asserted principles said to emerge in these cases is dangerous, as they risk repetition in other Gen AI sources. For this reason, when referring to the hallucinatory false judgment, it is wiser not to cite the decision in full. In Luck v Secretary, Services Australia at [14], the Court stated:
“The case referred to in the first paragraph of this extract does not exist. The judgment with the medium neutral citation referred to is a completely different matter which did not involve Rofe J. We apprehend that the reference may be a product of hallucination by a large language model. We have therefore redacted the case name and citation so that the false information is not propagated further by artificial intelligence systems having access to these reasons.”
[44] It may also be desirable for litigants in person to be required to provide verified pleadings which include a paragraph confirming that Gen AI has not been used, although that would require amendments to the UCPR.
Ultimately, the statement of claim was struck out, with leave to re-plead (excluding all AI-generated material), and costs were awarded to the defendant. This decision demonstrates that courts may respond to AI misuse from unrepresented parties by redacting false authorities, striking out affected pleadings, requiring specific verification of authorship and accuracy, and making adverse costs orders.
Conclusion
Queensland’s Practice Direction 5 of 2025 and the Guidelines reflect an effort by the Court to maintain the reliability of submissions in the era of generative AI, with an emphasis on accuracy and accountability. The NSW decision in Gribble highlights a question for consideration: should unrepresented litigants in Queensland, like legal practitioners, be required to verify that any AI-generated references in their submissions have been checked for accuracy? Further, as Gibson DCJ suggested, should unrepresented parties also be required to verify in their pleadings that generative AI has not been used?
The link to Practice Direction 5 of 2025 can be found here.
The link to the Guidelines can be found here.
The link to the NSW Supreme Court Practice Note can be found here.
A link to the decision in Gribble can be found here.