Court filings in Australian murder case included fake quotes and nonexistent judgments generated by AI

A senior lawyer in Australia has apologized to a judge for filing submissions in a murder case that included fake quotes and nonexistent case judgments generated by artificial intelligence.

The blunder in the Supreme Court of the state of Victoria is the latest in a litany of mishaps AI has caused in justice systems around the world.

Defense lawyer Rishi Nathwani, who holds the prestigious legal title of King's Counsel, took "full responsibility" for filing incorrect information in submissions in the case of a teenager charged with murder, according to court documents seen by the AP on Friday.

"We are deeply sorry and embarrassed by what occurred," Nathwani told Justice James Elliott on Wednesday, on behalf of the defense team.

The AI-generated errors caused a 24-hour delay in resolving a case that Elliott had hoped to conclude on Wednesday. Elliott ruled on Thursday that Nathwani's client, who cannot be identified because he is a minor, was not guilty of murder because of mental impairment.

"At the risk of understatement, the manner in which these events have unfolded is unsatisfactory," Elliott told the lawyers on Thursday.

"The ability of the court to rely upon the accuracy of submissions made by counsel is fundamental to the due administration of justice," Elliott added.

The fake submissions included fabricated quotes from a speech to the state legislature and citations of nonexistent cases purportedly from the Supreme Court.

The errors were discovered by Elliott's associates, who could not find the cited cases and asked the defense lawyers to provide copies, the Australian Broadcasting Corporation reported.

The lawyers admitted that the citations "do not exist" and that the submission contained "fictitious quotes," the court documents say.

The lawyers explained that they had checked the initial citations for accuracy and wrongly assumed the others would also be correct.

The submissions were also sent to prosecutor Daniel Porceddu, who did not check their accuracy.

The judge noted that the Supreme Court issued guidelines last year on how lawyers may use AI.

"It is not acceptable for artificial intelligence to be used unless the product of that use is independently and thoroughly verified," Elliott said.

People leave the Supreme Court of Victoria in Melbourne on Friday, August 15, 2025. ROD MCGURK / AP

The court documents do not identify the generative artificial intelligence system used by the lawyers.

In a comparable case in the United States in 2023, a federal judge imposed $5,000 fines on two lawyers and a law firm after ChatGPT was blamed for their submission of fictitious legal research in an aviation injury claim.

Judge P. Kevin Castel said they acted in bad faith, but he credited their apologies and the remedial steps they took in explaining why harsher sanctions were not necessary to ensure they or others would not again let artificial intelligence tools prompt them to produce fake legal history in their arguments.

Later that year, more fictitious court rulings invented by AI were cited in legal papers filed by lawyers for Michael Cohen, a former personal lawyer for U.S. President Donald Trump. Cohen took the blame, saying he did not realize that the Google tool he was using for legal research was also capable of so-called AI hallucinations.

British High Court Justice Victoria Sharp warned in June that presenting false material as if it were genuine could be considered contempt of court or, in the "most egregious cases," perverting the course of justice, which carries a maximum sentence of life in prison.

The use of artificial intelligence is reaching U.S. courts in other ways as well. In April, a man named Jerome Dewald appeared before a New York court and presented a video featuring an AI-generated avatar to deliver an argument on his behalf.

In May, a man who was killed in a road rage incident in Arizona "spoke" during his murderer's sentencing hearing after his family used artificial intelligence to create a video of him reading a victim impact statement.

  • Australia
  • Artificial intelligence
  • Murder trial
