🤖 AI Summary
The NSW Reconstruction Authority (RA) has confirmed a data breach after a former contractor uploaded a Microsoft Excel file (10 columns, more than 12,000 rows) containing personal information from the Northern Rivers Resilient Homes Program to the AI platform ChatGPT between 12 and 15 March 2025. Early forensic work indicates up to 3,000 applicants may be affected; disclosed fields include names, addresses, email addresses, phone numbers and some personal/health information. There is no evidence so far that the data has been accessed or published by a third party. The RA has engaged forensic analysts, notified the NSW Privacy Commissioner and Cyber Security NSW, begun dark-web monitoring, implemented immediate technical controls to block unauthorised AI uploads, and initiated an independent review; it will also contact impacted individuals with details and support, including compensation for replacing identity documents.
For the AI/ML community, this is a concrete example of an operational data-governance failure rather than a systems hack: sensitive data can be exposed whenever staff or contractors use unsanctioned AI tools. The technical and policy implications include strict DLP controls, endpoint and application blocking, clearer contractual clauses governing uploads to third-party models, staff training on appropriate model use, and audit trails for data handling. The incident reinforces that responsible AI deployment requires both secure infrastructure and organisational processes to prevent inadvertent sharing of personally identifiable information with external LLM services.
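To make the DLP point concrete, the sketch below shows a minimal pre-upload gate that scans outbound text for PII-like patterns before any call to an external LLM is allowed to proceed. Everything in it is an assumption for illustration: the function names, the regex patterns and the `send_to_llm` wrapper are hypothetical and are not the RA's actual controls; a production deployment would rely on a dedicated DLP or classification service and network-level enforcement rather than hand-rolled regexes.

```python
# Minimal sketch of a pre-upload DLP gate for outbound LLM requests.
# Illustrative only: patterns and names are assumptions, not the RA's
# actual controls; real DLP tooling uses far richer classification.

import re
from dataclasses import dataclass

# Very rough PII detectors (illustrative only).
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "au_phone": re.compile(r"\b(?:\+?61|0)[2-478](?:[ -]?\d){8}\b"),
    "medicare_like": re.compile(r"\b\d{4}[ -]?\d{5}[ -]?\d\b"),
}


@dataclass
class DlpVerdict:
    allowed: bool
    findings: dict  # pattern name -> number of matches


def scan_outbound_text(text: str, max_findings: int = 0) -> DlpVerdict:
    """Block the request if more than `max_findings` PII-like matches are found."""
    findings = {
        name: len(pattern.findall(text))
        for name, pattern in PII_PATTERNS.items()
        if pattern.search(text)
    }
    total = sum(findings.values())
    return DlpVerdict(allowed=total <= max_findings, findings=findings)


def send_to_llm(prompt: str) -> str:
    """Hypothetical wrapper around an external LLM call, gated by the DLP check."""
    verdict = scan_outbound_text(prompt)
    if not verdict.allowed:
        # In practice: write to an audit trail and alert security, not just raise.
        raise PermissionError(
            f"Blocked outbound LLM request; PII findings: {verdict.findings}"
        )
    # ... the actual API call would go here ...
    return "ok"


if __name__ == "__main__":
    sample = "Applicant: Jane Citizen, jane@example.com, 0412 345 678"
    print(scan_outbound_text(sample))  # blocked: email and phone patterns match
```

In practice, a gate like this would sit at the network or endpoint layer (for example a forward proxy or CASB policy covering known AI endpoints) so it also catches browser-based uploads, with application-level checks and audit logging as a second line of defence.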