Deloitte to partially refund Aus government after using AI in $440K report (www.theguardian.com)

🤖 AI Summary
Deloitte has agreed to partially refund the Australian federal Department of Employment and Workplace Relations after a $440,000 review it produced contained multiple errors — including nonexistent references and a misquoted court case — and Deloitte disclosed it had used generative AI in producing parts of the report. The firm amended the report, correcting footnotes and adding an appendix that says a toolchain based on Azure OpenAI GPT-4o (hosted on DEWR’s Azure tenancy) was used; Deloitte maintains the substantive findings and recommendations are unchanged. Academic reviewers flagged “hallucinations” where the model appears to have invented or misattributed sources, prompting public criticism and a senator’s claim that Deloitte has a “human intelligence problem.”

For the AI/ML community, the episode is a practical reminder of LLM limitations around hallucination and provenance when used in high‑stakes, evidence‑dependent tasks. Even when models are run in controlled enterprise environments, they can fabricate citations or weave unsupported claims into narrative outputs; human verification, retrieval‑augmented grounding, stronger citation protocols, and audit trails remain essential. The case also underscores governance and procurement implications: contracts with consultancies should require transparency about AI use, documented data sources and human oversight, and technical mitigations (e.g., RAG with vetted corpora, citation validation, post‑generation fact checking) to prevent similar failures.
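As a rough illustration of the “citation validation” mitigation mentioned above, a post‑generation check could flag any citation key in a draft that does not appear in a vetted source list, routing it to a human reviewer. The sketch below is a minimal assumption‑laden example, not anything Deloitte or DEWR used: the file names, the `[author2021]`‑style key format, and all function names are invented for illustration.

```python
# Illustrative sketch of post-generation citation validation against a vetted corpus.
# All names here (vetted_sources.json, draft_report.txt, the [author2021] key format)
# are hypothetical assumptions, not taken from the article.
import json
import re


def load_vetted_corpus(path: str) -> dict[str, dict]:
    """Load a pre-approved list of sources keyed by a normalized citation key."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)  # e.g. [{"key": "smith2021", "title": "...", "url": "..."}]
    return {entry["key"].lower(): entry for entry in entries}


def extract_citation_keys(report_text: str) -> set[str]:
    """Pull citation keys of the (assumed) form [smith2021] out of generated text."""
    return {match.lower() for match in re.findall(r"\[([A-Za-z]+\d{4}[a-z]?)\]", report_text)}


def validate_citations(report_text: str, corpus: dict[str, dict]) -> list[str]:
    """Return cited keys absent from the vetted corpus: candidates for hallucinated sources."""
    return sorted(key for key in extract_citation_keys(report_text) if key not in corpus)


if __name__ == "__main__":
    corpus = load_vetted_corpus("vetted_sources.json")          # hypothetical path
    with open("draft_report.txt", encoding="utf-8") as f:       # hypothetical path
        draft = f.read()
    unknown = validate_citations(draft, corpus)
    if unknown:
        print("Citations needing human review:", ", ".join(unknown))
    else:
        print("All citations matched the vetted corpus.")
```

A check like this only catches references that are absent from an approved list; verifying that a real source actually supports the claim attached to it still requires retrieval‑grounded checking or human review.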