eSafety Commission Warns AI Tools Will Automate Child Grooming by Predators

Julie Inman Grant says the possibility of chat bots being created to contact young people sets up ‘sinister new avenues for manipulation’
26 May 2023
Image by CyberBeat

The eSafety Commission is concerned that generative AI programs could be used by predators to groom children, as the federal government moves to regulate the fast-growing technology.

The eSafety Commissioner, Julie Inman Grant, is worried that predators might create chatbots to contact young people, and that generative AI could be used to groom children automatically and at scale. Combined with metaverse applications that predict user behaviour through eye movements, she warns, this opens up sinister new avenues for manipulation.

Julian Hill, a Labor MP, warns that consumer products such as ChatGPT and Bard are "the canary in the coalmine" and suggests establishing a new federal body to monitor this area. Hill believes that Australia's AI capability and governance gap, in both the public and private sectors, is worrying. He emphasises that the decisions shaping the future of society cannot be left solely to the private interests of technologists or multinationals.

The eSafety Commission welcomes AI companies' calls for greater regulation as their products gain popularity, but notes that these companies reportedly released their generative AI tools before any regulatory framework was in place. Australian regulators are now working with international counterparts to develop policy addressing these concerns.

- CyberBeat


About CyberBeat

CyberBeat is a grassroots initiative from a team of producers and subject matter experts, driven out of frustration at the lack of media coverage, responding to an urgent need to provide a clear, concise, informative and educational approach to the growing fields of Cybersecurity and Digital Privacy.

Contact CyberBeat

If you have a story of interest, a comment, a concern, or if you'd just like to say hi, please contact us.

Terms & Policies >>


We couldn't do this without the support of our sponsors and contributors.