FEATURE ARTICLE
Issue 101: September 2025, Professional Conduct and Practice
Solicitor to Pay Costs for Submissions Affected by the Use of AI
BY
Carolyn Conway - Jeddart Chambers
Wednesday 10th September, 2025
In Murray on behalf of the Wamba Wemba Native Title Claim Group v State of Victoria [2025] FCA 731 (22 April 2025), his Honour Justice Murphy of the Federal Court of Australia ordered the applicants' solicitors to pay indemnity costs after court documents prepared in reliance on artificial intelligence were found to contain fake citations.
The Court referred to the growing problem of false citations in documents prepared using AI, noting the tendency of Generative AI to ‘fabricate’ or ‘hallucinate’ information that appears accurate and reliable but has no basis in fact.
Solicitors for the applicants were required to file affidavit evidence explaining how the defective documents were prepared, including an affidavit from the junior solicitor who prepared them.
The Court recorded:
- In a ‘Notice to the Profession’ titled “Artificial intelligence use in the Federal Court of Australia” on 29 April 2025, the Chief Justice stated that the Court is keen to ensure that any Guideline or Practice Note regarding the use of Generative AI in proceedings before the Court appropriately balances the interests of the administration of justice with the responsible use of emergent technologies in a way that fairly and efficiently contributes to the work of the Court. Her Honour noted that the Court is presently considering the practices of other courts, consulting with litigants conducting their own proceedings and consulting with the legal profession, before it finalises its position on the use of Generative AI. The Court has not, at this stage, sought to impose a total prohibition on the use of Generative AI.
- The Court’s position arises out of a recognition that the use of AI is a rapidly evolving issue in legal practice. It is apparent from the consultation with the profession which has taken place to date that many members of the legal profession use AI in some form, and that they see it as a useful tool in the conduct of litigation. I note that the Law Society of Western Australia recently ran a snapshot survey on how the legal profession in WA is using AI, doing so to assist the Supreme Court of WA with its consultation process. The results demonstrate that the use of AI in that State is increasingly common, with over 50% of survey participants incorporating it into their practice. The results also record that the most used safeguard to ensure the accuracy and ethical use of AI-generated legal content is human verification, by a lawyer: The Law Society of Western Australia, Summary of Results from the Law Society’s Use of Generative AI Survey, 2025.
- Whilst the use of AI in the legal profession is growing, practitioners must be aware of its limitations. It is critical that legal practitioners use proper safeguards to verify the accuracy of the work produced. Any use of AI must be consistent with the overriding duty of legal practitioners as officers of the Court and their fundamental obligation to uphold, promote and facilitate the administration of justice. As stated by the Chief Justice in the Notice to the Profession:
…the Court expects that if legal practitioners and litigants conducting their own proceedings make use of Generative Artificial Intelligence, they do so in a responsible way consistent with their existing obligations to the Court and to other parties. Further, it is also expected that parties and practitioners disclose such use if required to do so by a Judge or Registrar of the Court.
- In the Federal Circuit and Family Court of Australia decision Dayal [2024] FedCFamC2F 1166, a solicitor provided the Court with a list of fictional authorities and case summaries which had been generated using an AI tool. While accepting the solicitor’s apology as genuine, Humphreys J considered it in the public interest to refer the solicitor’s conduct to the Victorian Legal Services Board. Her Honour referred to the US District Court case of Mata v Avianca Inc 678 F Supp 3d 443 (S.D.N.Y. 2023), in which Mr Mata’s lawyers filed submissions containing fake authorities generated by ChatGPT. The practitioners initially stood by the submissions when they were called into question by the Court, and they were found to have acted in bad faith and in violation of various court rules. There, the Court said (at 448-449):
Many harms flow from the submission of fake opinions. The opposing party wastes time and money in exposing the deception. The Court’s time is taken from other important endeavors. The client may be deprived of arguments based on authentic judicial precedents. There is potential harm to the reputation of judges and courts whose names are falsely invoked as authors of the bogus opinions and to the reputation of a party attributed with fictional conduct. It promotes cynicism about the legal profession and the American judicial system. And a future litigant may be tempted to defy a judicial ruling by disingenuously claiming doubt about its authenticity.
- Here, the applicants’ solicitors’ use of AI in the preparation of two court documents has given rise to cost, inconvenience and delay to the parties and has compromised the effectiveness of the administration of justice. But I do not consider the use of AI in this case means that it is appropriate to refer the solicitors’ conduct to the Victorian Legal Services Board. Here, an inexperienced junior solicitor was given the task of preparing document citations for an amended pleading, and did so while working remotely and without access to the documents to be cited. In attempting to cite the relevant documents she used an (apparently AI-assisted) research tool which she considered had produced accurate citations when she previously used it. And as soon as Massar Briggs Law was told of the false citations the problem was addressed. The junior solicitor and the principal solicitor have apologised or expressed their regret to the other parties and the Court, and there was no suggestion that they were not genuine in doing so.
- The junior solicitor took insufficient care in using Google Scholar as the source of document citations in court documents, and in failing to check the citations against the physical and electronic copies of the cited documents that were held at Massar Briggs Law’s office. The error was centrally one of failing to check and verify the output of the search tool, which was contributed to by the inexperience of the junior solicitor and the failure of Mr Briggs to have systems in place to ensure that her work was appropriately supervised and checked. To censure those errors it is sufficient that these reasons be published.
The applicants were given leave to file amended documents correcting footnote errors.