An Australian attorney is under scrutiny after using the AI chatbot ChatGPT to draft court filings for an immigration case. The filings contained citations to non-existent cases, prompting the Federal Circuit and Family Court to refer him to the Office of the NSW Legal Services Commissioner (OLSC).
The attorney's use of AI was deemed problematic because court staff wasted considerable effort attempting to verify the fabricated citations. The attorney said time constraints and health issues led him to use the tool, but he acknowledged failing to check the accuracy of the citations it generated. The court stressed that the misuse of AI in legal proceedings is a matter of public interest that must be addressed and monitored.
This is the second case in Australia in which a lawyer has been referred to a regulatory body for improper use of AI. It follows a recent directive from the NSW Supreme Court setting guidelines for practitioners' use of AI, including an explicit prohibition on using it to generate affidavits, witness statements, and other evidentiary material. The court's decision reflects growing concern and the need for stringent guidelines on the use of AI in legal proceedings.
- CyberBeat
CyberBeat is a grassroots initiative from a team of producers and subject matter experts. Driven by frustration at the lack of media coverage, it responds to an urgent need for a clear, concise, informative and educational approach to the growing fields of Cybersecurity and Digital Privacy.