🤖 AI Summary
Australia’s High Court Chief Justice Stephen Gageler warned that judges are increasingly being forced to act as “human filters” for machine-generated or machine-enhanced legal material, telling the Australian Legal Convention that AI use in litigation has reached an “unsustainable phase.” He cited widespread reliance on AI to draft submissions, prepare evidence and frame arguments by both self-represented litigants and trained lawyers, and noted real-world harms — including the proliferation of false precedents and a recent case in which a Victorian lawyer was stripped of practising rights for relying on unverified, AI-generated citations. While he acknowledged AI’s potential to make civil justice “just, quick and cheap,” Gageler warned that the pace of development is outstripping the judiciary’s capacity to assess its risks and benefits.
Technically, Gageler’s remarks underline two urgent issues for the AI/ML and legal communities: the need for robust verification, provenance and auditability of machine-produced legal content, and the need for clear regulatory and practice frameworks governing AI’s role in drafting versus decision-making. Most jurisdictions have issued practice guidelines, and Victoria is conducting a specialist review, but Gageler called for deeper engagement on whether and how AI should inform judicial decisions — a question he described as “existential.” He also raised wellbeing concerns for judges handling increased workloads, public scrutiny and traumatic cases, and pointed to broader system failures in responding to family and sexual violence.