AI chatbot warning
Australian teachers and parents are concerned young students are developing unhealthy emotional attachments to artificial intelligence (AI) chatbots.

As AI increasingly blurs the line between the online world and the real one, a quiet but serious concern is emerging in schools across the country: could young Australians be at risk of slipping into an AI-dependent void?

In Queensland, Principal Mike Curtis is trying to get ahead of the problem before it reaches his students. His primary concern isn't AI tools like ChatGPT being used to churn out essays.

AI's uncanny ability to simulate empathy is driving more and more people to develop a relationship with their robot companion. (Graphic: Polly Hanning)

Rather, Curtis has sounded the alarm for parents, warning that their children may be secretly forming unhealthy attachments to AI companions.

Curtis became concerned when he read about a 14-year-old boy in the US who took his own life after developing a romantic relationship with an AI companion.

“It was really disturbing, and I had never really heard of such a thing, so I thought this is something that our parents need to know about,” the Glasshouse Christian College principal told 9news.com.au.

“Particularly the fact it can be so insidious to get inside a kid’s mind and change it in that way.”

Glasshouse Christian College principal Mike Curtis. (Supplied)

In a blog post titled Who Is Your Child's Best Friend?, Curtis told parents to be wary of kids spending too much time with a virtual companion.

“Real friendships are messy, imperfect and challenging; however, that is a healthy reality,” he said.

“Spending hours with a virtual companion that will tell you whatever it thinks you want to hear is a dangerous fantasy.”

Curtis is not so concerned about ChatGPT writing essays or helping kids with homework. (Getty)

Curtis said the best antidote was open communication and spending quality time with loved ones.

“It’s the same as any red flags when you think your child is going down a rabbit hole online, and that is that they withdraw a lot more, they don’t want to talk about things other than their computer,” Curtis added.

“Kids do tend to be very secretive about it, and just the fact that it’s a phone, it’s a very difficult thing to even just cop a glance at the screen.

“It’s about keeping the conversation open and honest and non-judgmental.”

Victoria's Department of Education said it takes children's online safety seriously and provides help to parents concerned about excessive AI use.

“All schools support the development of healthy and positive relationships, and provide mental health and wellbeing supports for students,” a spokesperson told 9news.com.au.

“These supports can assist families to support young people if they are at risk due to engagement with chatbots.”

The state's 2025-26 budget included a $3.5 million investment in digital literacy resources, covering topics such as navigating social media and AI.

Dr Catriona Davis-McCabe, professor of psychology at the Cairnmillar Institute, told 9news.com.au that children may seek “companionship”, romantic or otherwise, from an AI bot as a response to stress or loneliness.

“Some children are using AI as a source of information on mental health, such as seeking advice or to help understand their feelings,” Davis-McCabe said.

“In some situations, this can be appropriate especially where the young person is not yet ready or able to talk to a person about their experiences.

“However, this depends on the AI model providing accurate, evidence-based, age-appropriate information.”


A YouGov survey found one in five Australians have opened up or been “emotionally vulnerable” with an AI chatbot. (Getty)

It comes as a recent YouGov survey found one in seven Australians could imagine themselves “falling in love” with an AI chatbot.

Australians aged between 18 and 24 were the most likely to become romantically attached, according to the data.

A further one in five Aussies admitted to opening up or becoming “emotionally vulnerable” with an AI chatbot.

Davis-McCabe said the phenomenon is nothing to scoff at – and almost anyone could be susceptible.

“Many people – even with good social connections – can be at risk of developing an emotional attachment with an AI chatbot, due to the stimulation of the brain’s reward pathways that encourages reliance similar to other problematic dependencies/addiction,” she said.

“Chatbots are often designed to encourage ongoing interaction, which can feel ‘addictive’ and lead to overuse and even dependency.”

Her advice, like Curtis’, is for parents to gently encourage their children to step away from screens and re-enter the real world.

“Discuss the difference between artificial and genuine relationships,” Davis-McCabe said.

“The best approach is to express curiosity and concern, and to remind children that they are not alone, and that if they have an online experience that concerns them, or if they make a mistake and realise belatedly that they are at risk, that they can talk to you about it and will not get into trouble for mistakes.”

The Australian Psychological Society (APS) is calling for more investment and research to understand the “full psychological impacts of AI use”.

“The use of AI can change fundamental human experiences including relationships, decision-making, autonomy and so on,” Davis-McCabe added.

Readers seeking support can contact Lifeline on 13 11 14 or Beyond Blue on 1300 22 4636.
