The eSafety Commission is concerned that generative AI programs could be used by predators to groom children, as the federal government moves to regulate the fast-growing technology.
The eSafety Commissioner, Julie Inman Grant, is worried that predators might create chatbots to contact young people. She is also concerned that generative AI could be used to automate grooming at scale. Combined with metaverse applications that predict user behaviour through eye movements, this could open a sinister new avenue for manipulation.
Julian Hill, a Labor MP, warns that consumer products such as ChatGPT and Bard are "the canary in the coalmine" and suggests establishing a new federal body to monitor the area. Hill believes Australia's AI capability and governance gaps, in both the public and private sectors, are worrying. He emphasises that decisions shaping the future of society cannot be left solely to the private interests of technologists or multinationals.
The eSafety Commission welcomes AI companies' calls for greater regulation as their products gain popularity. However, those same companies reportedly released generative AI tools before any regulations were in place. Australian regulators are now working with international counterparts to develop policy addressing these concerns.
CyberBeat is a grassroots initiative from a team of producers and subject matter experts. Driven by frustration at the lack of media coverage, it responds to an urgent need for a clear, concise, informative and educational approach to the growing fields of cybersecurity and digital privacy.
If you have a story of interest, a comment, a concern, or if you'd just like to say hi, please contact us.
We couldn't do this without the support of our sponsors and contributors.