FEATURE ARTICLE
Issue 102: December 2025, Professional Conduct and Practice
Introduction
On 28 March 2025 the Chief Justice of the Federal Court of Australia issued a Media Statement relating to the use of Generative Artificial Intelligence (AI) by Courts and in Court proceedings (the Media Statement). The Federal Court indicated in the Media Statement that it is keen to ensure that any Guideline or Practice Note appropriately balances the interests of the administration of justice with the responsible use of emergent technologies in a way that fairly and efficiently contributes to the work of the Court.
The Chief Justice also gave a preliminary indication that the Court expected parties and practitioners to disclose the use of AI if required to do so by a Judge or Registrar of the Court. However, in relation to anything done for the purposes of a judicial proceeding, including the creation of documents, the copyright legislation provides a defence to infringement.[1]
Some examples of misuse of AI in Court documents
In Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023), the plaintiff had filed a personal injury lawsuit in February 2022[2] in the U.S. District Court for the Southern District of New York against Avianca, alleging that he was injured when a metal serving cart struck his knee during an international flight. The defendant moved to dismiss Mata’s claim as being time-barred.
The plaintiff’s lawyers (Schwartz) used ChatGPT to generate submissions which contained numerous non-existent precedents involving airlines together with erroneous quotations and internal citations. The defendant’s lawyers contacted the chambers of the presiding Judge informing the Judge that they were unable to find the decisions cited by the plaintiff’s lawyers.
One interesting and relevant observation about this case is that numerous online comments about the Mata litigation cite the judgment dated 22 June 2023 as the authority. However, that judgment dealt only with the substantive issue of the motion to dismiss the claim; there is no reference in it to Schwartz’s conduct. It appears from third-party legal reporting that the Judge ordered a separate hearing and required Schwartz to provide an affidavit to show cause why plaintiff’s counsel ought not be sanctioned.
In Handa & Mallick [2024] FedCFamC2F 957, Judge A Humphreys’ reasons for judgment noted at [2] that they related to the conduct of the legal representative (Mr B acting as agent) appearing on behalf of the husband. The substantive proceeding, which was set down to be heard on that date, involved an enforcement application relating to final property orders. Her Honour stood the matter down to allow the parties an opportunity to see if the enforcement application could be resolved. The Judge asked if either of the parties’ lawyers were “in a position to provide [her Honour] with any authorities that they sought to rely upon, for [her Honour] to read while the matter was stood down”: [5].
In response, Mr B tendered a single-page list of authorities. When the matter resumed, her Honour informed Mr B that neither her Honour nor her Honour’s Associate had been able to locate the authorities cited in the list, and noted that although her Honour’s Associate had asked Mr B to provide copies of the authorities referred to in the list, he did not do so.
Judge A Humphreys ordered the solicitor who prepared the list to provide to the court, by way of email to the Judge’s Associate, submissions of not more than five pages identifying any reasons why the solicitor ought not be referred to the Office of the Victorian Legal Services Board and Commissioner in relation to the list of authorities tendered by him at the hearing. Consideration of the issues raised by the solicitor’s conduct in relation to the list was adjourned to another date.
The judgment of Judge A Humphreys in Dayal [2024] FedCFamC2F 1166 comprises the reasons relating to the subsequent hearing as to the solicitor’s conduct, which arose from the orders made in Handa. As stated above in relation to Handa, the solicitor in question tendered to the court a list and summary of legal authorities that did not exist. The solicitor later informed the court that the list and summary were prepared using an artificial intelligence (AI) tool incorporated in the legal practice management software to which the firm subscribed. The solicitor acknowledged that the authorities identified in the list and summary tendered to the court did not exist. He further acknowledged that he did not verify the accuracy of the information generated by the research tool before submitting it to the court.
Her Honour at [15] referred to guidelines issued by the Supreme Court and County Court of Victoria for the responsible use of generative AI by litigants and lawyers. Relevantly her Honour identified the following duties of Victorian solicitors:
- The paramount duty to the court and to the administration of justice, which included a specific duty not to deceive or knowingly or recklessly mislead the court;
- Other fundamental ethical duties, including to deliver legal services competently and diligently; and
- Not to engage in conduct which is likely to diminish public confidence in the administration of justice or bring the legal profession into disrepute.[3]
The solicitor offered an unconditional apology, voluntarily made a payment for costs thrown away by the other side and outlined steps he had taken to avoid the situation arising again.
Judge Humphreys noted these matters. However, her Honour considered that the Court was not the appropriate body to determine whether there should be any further investigation of the solicitor’s conduct. Her Honour therefore ordered that the Principal Registrar of the Court or his delegate refer the matter to the Office of the Victorian Legal Services Board and Commissioner for consideration of the conduct of Mr Dayal, solicitor, providing copies of the following:
- the directions and the reasons for the directions;
- the written submissions of Mr Dayal dated 19 August 2023 (Exhibit B);
- the settled ex tempore reasons delivered on 19 July 2024 in the matter of Handa & Mallick (MLC6910/2023); and
- the list of authorities and case summaries tendered by Mr Dayal at the hearing on 19 July 2024 (Exhibit A).
The case of Valu v Minister for Immigration and Multicultural Affairs (No 2)[4] mainly concerned the conduct of the applicant’s legal representative (the ALR), which involved the filing of submissions with the Court containing citations of authorities, and quotes alleged to be from a decision of the Administrative Appeals Tribunal (the Tribunal) (as it then was), which did not exist. The ALR sent the Court an email, without copying in the other party or seeking its consent to send the correspondence, which attached amended submissions that had removed the citations to the non-existent case law and the purported quotes from the Tribunal’s decision.
The Court made orders which, inter alia, required the ALR to file and serve an affidavit addressing how the submissions filed 25 October 2024 were generated, including a full explanation as to why the submissions contained references to fictional authorities and purported quotations from the Tribunal’s decision. Judge Skaros ordered that the Principal Registrar of the Court or his delegate refer this matter to the Office of the NSW Legal Services Commissioner for consideration of the conduct of the applicant’s legal representative, providing copies of relevant documents.
In Dias v Angle Auto Finance Pty Ltd [2025] FWC 47, the applicant made an application to the Fair Work Commission for a remedy, alleging that she had been unfairly dismissed from her employment with the Respondent. The applicant appeared in person. At [28] of the decision, Commissioner Matheson noted:
“Despite presenting as actual cases from the Commission, Full Bench of the Commission and a Full Court of the Federal Court of Australia it is apparent that these are not cases related to the principles the Applicant advanced in opposition to the granting of legal representation or are not real cases. As will become apparent further in this decision, the Applicant has also advanced other cases that do not appear to be real in support of her substantive application.”
The Commissioner went on to say:
“While we live in an information age and parties may seek to rely on sources of information extracted from an online and/or artificial intelligence enabled environment, use of information that is not from a credible and/or reliable source creates a risk that the information from such sources may be wrong or misleading and parties before the Commission should exercise caution in this regard.”
In Weedbrook v Partlin [2024] QDC 194, Mr Partlin was self-represented in a claim for $350,000 arising from two written loan agreements. Judge Porter noted at [39] and [40] of his Honour’s reasons for judgment:
“When asked at the second hearing whether he had had assistance from a solicitor with his second outline, Mr Partlin only said he had had assistance. When asked later by email to provide copies of the cases which could not be located, he responded that he could not verify the “recommended references using non-official search tools” and took those references at face value because he struggled to find any cases through official sources. Although he did not say so directly, I infer from the form of his second outline and nature of the errors (i.e. non-existent references) that the erroneous references were recommended by some form of aritifical (sic) intelligence (AI) model. At least four authorities cited did not exist. Very few of the citations were fully correct. The erroneous citations appear to be hallucinations derived from a large language AI model.
[40] This incident reflects the damaging effect of using such models to generate legal submissions. The errors in Mr Partlin’s submissions took up the time of counsel for Mr Weedbrook. It also took up considerable Court time in checking the authorities. This kind of error impacts on the fairness and efficiency of legal proceedings. The use of large language models to generate submissions is unwise and should only be relied upon if the writer has personally checked both the citiation (sic) and the passage referred to in the response to the prompt.” (Citations not included)
Some observations from these cases
In the cases where lawyers were responsible for the Court documents which contained non-existent cases or references not on point, the lawyers responsible for their preparation:
- Failed to provide copies of the cases referred to in their submissions when asked to by the Court.
- Had no plausible explanation for the inclusion of non-existent cases and references in submissions to the Court. In one (unnamed) case the solicitor’s explanation was, in effect, that he trusted ChatGPT.
It is implicit in documents prepared by a lawyer for the Court’s consideration in the determination of a contest that the lawyer has applied professional judgment and reasoning to the issue to be determined by the Court. Her Honour in Dayal at [15], referring to the Victorian Supreme Court guidelines, said that the AI-generated submissions were not the product of reasoning, nor was generative AI a legal research tool. Generative AI did not relieve a legal practitioner of the responsibility to exercise judgment and professional skill in reviewing the final product to be provided to the Court.
In Dias and Partlin, the documents in issue, which contained non-existent cases, were prepared by self-represented litigants utilising results generated by AI. In the present state of affairs in relation to the use of AI in documents prepared for use by the Federal Court in the determination of proceedings, the Court must view these documents with a degree of caution. As stated by the Commissioner in Dias at [28], the “use of information that is not from a credible and/or reliable source creates a risk that the information from such sources may be wrong or misleading and parties before the Commission should exercise caution in this regard.”
The Court’s reasonable expectations
The Court is tasked with making judgments in disputes, whether interlocutory or at trial. It does so based on the law and the facts of the individual case as expressed in pleadings, evidence and submissions (the material). The Court must begin its assessment of the material on the premise that what is presented to the Court by an applicant identifies the complaints with sufficient clarity and precision for the respondents to answer the various allegations[5] and for the Court to adjudicate the controversy arising.[6]
To this end, for a pleading prepared by a lawyer, the lawyer must include a statement certifying that the factual and legal material available to the lawyer provides a proper basis for each allegation (or response to an allegation) in the pleading.[7]
Similarly, the Court expects that lawyers who have prepared documents for its consideration have exercised professional skill in framing their client’s case relevant to the issues.
Separate from the general duties of lawyers to the Court as officers of the Court, these responsibilities are underpinned by the positive obligation to act consistently with the “overarching purpose”.[8]
Relevantly, it is an objective of the “overarching purpose” that lawyers take into account the efficient use of the judicial and administrative resources available for the purposes of the Court. In the examples given above, where the authorities and summaries had not been the subject of professional scrutiny by the lawyers, the result was a waste of the Court’s and the opposing lawyers’ time. As stated in Partlin: “[t]he errors in Mr Partlin’s submissions took up the time of counsel for Mr Weedbrook. It also took up considerable Court time in checking the authorities.”
The position of self-represented litigants differs slightly from that of lawyers. It is true that “the parties” to a civil proceeding before the Court must conduct the proceeding (including negotiations for settlement of the dispute to which the proceeding relates) in a way that is consistent with the overarching purpose.[9] Similarly, a specific duty is imposed on lawyers in a civil proceeding.[10] The difference is that, in the case of self-represented litigants, the Court has a duty to assist them so as to ensure a fair trial. In Tulett v Yourtown Pty Ltd [2024] FCA 513, O’Sullivan J at [48]ff referred to the judgment of Burley J in Lamont v Malishus Ltd (No 2) [2022] FCA 237 at [79]ff, where Burley J considered the duties of the Court where self-represented litigants are involved. O’Sullivan J at [49] noted that Burley J had identified the following principles:
- A party does not have a right to representation in court, but where a party is not represented, courts have an overriding duty to ensure that the trial is fair: at [79];
- A judge has a duty to ensure a fair trial by giving self-represented litigants assistance so as to help ensure that the litigant is treated equally before the law and has equal access to justice: at [80];
- This is to be balanced against the fact that the duty does not extend to advising an unrepresented litigant as to how his or her rights should be exercised; and
- A further contextual matter is the apparent capacity of the self-represented party concerned: at [82], which includes an understanding of the circumstances and the characteristics of the litigant, including his or her intelligence and literacy: at [85].
Accordingly, in the case of self-represented litigants, the Court could be expected, subject to the circumstances of the case, to treat the use of AI in court documents with greater tolerance.
Conclusion and Summary of the Author’s Recommendations on Court disclosure of AI
The author considers that it is reasonable to include in documents prepared for the Court’s consideration in a proceeding a statement by the person preparing the document, whether lawyer or self-represented litigant, as to whether the preparation of the document involved the use of AI. This will at the very least alert the Court to the potential risk that the AI tool utilised may be a large language model (LLM), including an LLM trained on data gathered from the Internet at large, as opposed to recognised Australian repositories of Australian case law. At this stage, it appears that what is important is the fact that an AI assistant was used, and that some scrutiny of the AI results was applied before the document was submitted to the Court.
The author also considers that it is reasonable to include in Court documents, for example submissions, a statement that the person preparing the document with AI assistance has, like the certification on pleadings, satisfied themselves as to the relevance to the proceeding of the AI-generated material in the document.
In this regard, there is no reason to distinguish between lawyers and self-represented litigants preparing the documents. Both have a duty to conduct the proceeding in accordance with the overarching purpose, and the primary purpose of the statement is to alert the Court to the use of AI. Self-represented litigants and, in the author’s experience, represented litigants are using AI tools such as ChatGPT in the hope of reducing the fees of their lawyers or, encouraged by what appear to be credible legal arguments, undertaking the litigation as self-represented parties.
However, the author does not consider it reasonable to require the party preparing the document to identify the extent of use of the AI assistant. This is both impractical and burdensome. It is impractical because each author will have a different approach. For example, some lawyers and parties will look at the AI results and go to the case references provided by the AI assistant to verify that the statements in the results support their position. Others, at their peril, may not. Then there will be all the shades in between.
The public’s reasonable expectations
AI can be a time-saving resource in the field of research. Its use can extend to drafting letters, submissions and court documents. The extension of its assistance from research to drafting presents, in the opinion of the author, a very different consideration.
Subject to the terms of the request, the research conducted by the AI tool can produce better quality results in a fraction of the time; however, it is still up to the lawyer to examine the results considered relevant by the AI and craft them into an argument supportive of the case. For want of a better analogy, the research provides the ingredients to make the cake.
If this model is followed, there is clearly a creative contribution by the lawyer which creates a very arguable claim to copyright in the work. Of course, the importance of copyright in the legal environment is nullified by the defence to infringement claims in the Copyright Act 1968 (Cth) in respect of judicial proceedings and the giving of legal advice by legal practitioners.[11]
Drafting, however, diminishes the creative contribution of the individual. Why? In drafting, AI has made the cake mixture and, unless the result is a total failure, the creative content will be limited to adjustments to the flavour or consistency. This involves a far smaller creative contribution by the lawyer and runs the risk that AI is the author, in which case copyright will not subsist in the work.[12]
Is that important? Not really in relation to court documents and documents reproduced or communicated in the course of giving legal advice. However, when these concepts are considered in the context of judges using AI to assist in the production of a judgment, a very different issue arises.
The question is not whether a human author provided the creative input to support ownership and a copyright claim; it is whether the unsuccessful party considers that the judgment is the result of human authorship rather than AI authorship.
The principles behind apprehended bias of a judge in Australia may be helpful here. These principles are well established and repeatedly affirmed by key authorities.[13] The test is whether a fair-minded lay observer might reasonably apprehend that the judge might not bring an impartial and unprejudiced mind to the resolution of the question the judge is required to decide.
Assume, for this exercise only, that a judge were required to disclose to the parties whether AI had been used in the preparation of the judgment. The question might be:
“Would a fair-minded lay observer reasonably apprehend that the judge might not have made a contribution of authorship to the published judgment?”
In order to assist in determining the question, the judge might be required to identify the questions or prompts presented to the AI. Beyond this, inquiring into the extent of a judge’s reliance on AI is not only impractical, as it is in the case of lawyers, but also places unnecessary pressure on the judge – the judge is not on trial!
However, “The role of a judge is a complex one. It can incorporate activism, complex interactions with people, dispute settlement, case management, public and specific education activities, social commentary as well as adjudicatory functions that might be conducted with other judges or less commonly in some jurisdictions with lay people (juries)”.[14]
Justice Needham of the Federal Court of Australia shared her Honour’s thoughts on the topic in a paper on artificial intelligence, including the topic of judges and the use of generative AI.[15]
Amongst other things, her Honour helpfully identifies in an Appendix to the paper, a summary of the procedures (including Practice Notes and Guidelines) currently in force for the Federal Court of Australia, the NSW State Courts (Supreme Court, Land and Environment Court, and District Court), the Victorian Supreme and County Court, and the Queensland Supreme Court, noting that each of them takes a different approach to the uses of AI.
For example, the NSW State Courts (the Supreme Court, Land and Environment Court and District Court) have each adopted in effect the same practice note. The Chief Justice’s Practice Note SC Gen 23, which commenced on 3 February 2025, takes a very strong position against the use of AI. It proscribes the use of generative AI in the preparation of evidence and expert reports: generative AI must not be used to generate the contents of affidavits, witness statements or character references, although preparatory steps are acceptable.
In relation to judges, SC Gen 23 states at [4]:
“Judges in New South Wales should not use Gen AI in the formulation of reasons for judgment or the assessment or analysis of evidence preparatory to the delivery of reasons for judgment.” (emphasis in original)
Further at [5] SC Gen 23 states:
“Gen AI should not be used for editing or proofing draft judgments, and no part of a draft judgment should be submitted to a Gen AI program.” (emphasis in original)
However, the use of generative AI for legal research purposes or “any other purpose” is permitted, but the Practice Note encourages judges to look for “red flags” and to familiarise themselves with some limitations of large language model generative AI, such as “hallucinations”, the possibility of misinformation, the scope for bias in the results, and case law and submissions which diverge from the general understanding of the applicable law in the case.[16]
The position reflected in SC Gen 23 as it relates to judges is in contrast to the well-publicised position expressed in the UK by Lord Justice Birss of the Court of Appeal who, speaking extra-judicially, said he found AI-generated summaries of areas of law with which he was familiar “jolly useful”.[17]
The Rt Hon. Lord Justice Birss, Deputy Head of Civil Justice, delivered a speech at the Life Sciences Patent Network European Conference in London on 3 December 2024.[18] Lord Justice Birss considered, subject to ethical and human rights considerations, that:
“[l]ooking further into the future, one could imagine that AI may very well be able to assimilate much larger quantities of data than a normal human judge. One could then be faced with the situation in which an AI system might be a better decision maker than a human being in those circumstances. I should make clear that I do not believe we’re anywhere near that yet, but from what I read in the literature of the capabilities of AI, I would not like to bet against the idea that this capability will arrive in a not-too-distant future.”[19]
His Lordship identifies that in such a case the question of bias in the training of large language models would be important.[20]
Justice Needham makes a very critical point in her Honour’s paper – the integrity of the resource pool from which the results are drawn. Like her Honour, the author has road-tested Lexis Plus AI and Thomson Reuters’ Westlaw Precision (Co-counsel), both of which draw on Australian legislation and their own case resource pools. The incidents involving case lists provided by solicitors have arisen through the use of systems which draw from the entire web, such as ChatGPT.[21]
The author has never used AI resources to draft court documents or correspondence. There is a danger that the results may be so impressive that one subconsciously relaxes one’s scrutiny in reliance on past accuracy. Of course, using a precedent from a prior case has its own issues. However, lawyers are paid to apply their skill in drafting and settling documents. It is expected that they have applied that skill and care in checking that the AI output truly reflects the best way of presenting their client’s case.
Judges, however, face a very different consideration. Judgments must reflect that the judge has applied his or her mind to the case presented by each party. A judgment which contains sections that have been “cut and pasted” from one party’s submissions, or from another judgment concerning similar (but not identical) circumstances, undermines confidence in the judgment because it casts doubt on whether the judge’s mind was directed to the relevant circumstances or arguments of the parties in the proceeding.[22]
In Maazuddin,[23] Judge Gostencnik of the Federal Circuit and Family Court of Australia was considering an application for an order under s 477(2) of the Migration Act 1958 (Cth) (Act) to extend the 35-day period within which a judicial review application in respect of a migration decision of the (then) Administrative Appeals Tribunal may be made. Although listed as an interlocutory matter, the parties agreed, and the Judge was willing, to deal with the substantive judicial review application if an extension of time were granted.
The applicant was the holder of a student (Temporary) (Class TU) (Subclass 500) visa. The delegate of the (then) Minister for Immigration, Citizenship, Migrant Services and Multicultural Affairs concluded the applicant failed to maintain enrolment in a registered course that, once completed, would provide a qualification from the Australian Qualifications Framework (AQF) at the relevant level.[24]
The delegate’s decision was affirmed by the (then) Administrative Appeals Tribunal. The applicant contended that the Tribunal fell into jurisdictional error in two respects:
- The Tribunal failed to bring its own independent mind to bear on what would be the correct or preferable decision on the review. This is because the Tribunal is said to have copied, without attribution, significant portions of the reasons of the delegate whose decision the Tribunal was tasked to review.[25]
- The Tribunal denied the applicant procedural fairness, because contrary to its obligation under s 359A(1)(a) of the Act to give to the applicant clear particulars of any reason the Tribunal considered to affirm the decision under review, the Tribunal failed to put the substantial parts of the delegate’s decision the Tribunal proposed to adopt as its own to the applicant.[26]
On 7 May 2020, the applicant sought judicial review of the Tribunal’s decision by lodging an application in the (then) Federal Circuit Court of Australia for a remedy in the exercise of the Court’s original jurisdiction under s 476 of the Act.
At [38] of the reasons for judgment, Judge Gostencnik sets out a table comparing the delegate’s reasons with the Tribunal’s reasons, and concludes at [59]:
“The overall impression gained from the analysis above is that the material findings of fact and the reasoning adopted by the Tribunal were that of the delegate, which were reproduced in the Decision without attribution or acknowledgement. The Tribunal failed to bring its own independent mind to the review. Consequently, I consider the Tribunal failed to discharge the statutory task imposed on it to consider the matter on review for itself afresh, so as to make a decision it considered the correct and preferable one.”
The decision of the Tribunal was quashed, and the review of the delegate’s decision was remitted to the Administrative Review Tribunal for determination according to law.
Conclusion
In the end, it is clear that an AI system or product properly resourced in Australian legislation and case law can be a useful tool that reduces research time and can allow a practitioner to produce a result, such as an opinion, in a fraction of the time. It must be understood that the results of AI-generated research are the decisions which the tool considers relevant to the issue identified in the instructions. It remains for the practitioner to customise that information and tailor it to the circumstances of the case in presenting the case to the client, the opponents or the Court. In the case of the judiciary, the human contribution is required for a different reason – to reflect to the parties and the public that the Court or Tribunal has brought an independent mind to the resolution of the dispute.
[1] Copyright Act 1968 (Cth) s 43(1).
[2] Opinion and Order on Avianca’s motion to dismiss pursuant to Fed. R. Civ. P. 12(b)(6) (ECF 16); the motion was granted by Judge P. Kevin Castel on 22 June 2023.
[3] References in the guidelines not included.
[4] Federal Circuit and Family Court of Australia (Division 2): [2025] FedCFamC2G 95; (2025) 386 FLR 365.
[5] Tulett v Yourtown Pty Ltd [2024] FCA 513 at [21] per O’Sullivan J.
[6] Similarly, a cross-claim by the respondent bears the same responsibility.
[7] Federal Court Rules 2011 (Cth).
[8] Federal Court of Australia Act 1976 (Cth) s 37M and s 37N.
[9] Federal Court of Australia Act s 37N(1).
[10] Federal Court of Australia Act s 37N(2).
[11] Copyright Act s 43.
[12] Telstra Corporation Limited v Phone Directories Company Pty Ltd [2010] FCAFC 149 (Keane CJ, Perram and Yates JJ).
[13] Mobil Oil Australia Ltd v Lyndel Nominees Pty Ltd (1998) 81 FCR 475 (Lockhart, Lindgren, Tamberlin JJ).
[14] “Judge v robot? Artificial intelligence and judicial decision-making”, Judicial Commission of New South Wales: <https://jirs.judcom.nsw.gov.au/public/assets/benchbooks/judicial_officers/>.
[15] Justice Jane Needham “AI and the Courts in 2025 – Where are we and how did we get here?”: https://www.fedcourt.gov.au/digital-law-library/judges-speeches/justice-needham/needham-j-20250627.
[16] SC Gen 23 [11].
[17] Referred to in her Honour’s paper at [27].
[18] “The Impact and Value of AI for IP and the Courts” – a speech by Lord Justice Birss, Courts and Tribunals Judiciary, 5 December 2024.
[19] The speech at [13].
[20] The speech at [16].
[21] The Paper at [29].
[22] In Atanaskovic Hartnell Corporate Services Pty Ltd v Kelly [2024] FCAFC 137, the Full Court of the Federal Court of Australia set aside a decision of the Federal Circuit Court of Australia (as it was previously known). The issue on appeal concerned the trial judge having copied portions of the submissions of one of the parties, resulting in what was said to reflect a lack of consideration by the judge of the appellant’s case. See also the circumstances surrounding the resignation for health reasons of Federal Magistrate Jennifer Rimmer in 2006, who it was alleged plagiarised a colleague’s work: <https://www.afr.com/politics/magistrate-judge-in-copycat-furore-20060324-jfobl>.
[23] Maazuddin v Minister for Immigration and Multicultural Affairs [2024] FedCFamC2G 1349 (Judge Gostencnik, 10 December 2024).
[24] Maazuddin [2].
[25] Maazuddin [3].
[26] Maazuddin [3].