But in extreme examples, chatbots have been accused of giving harmful advice.

Character.ai is currently the subject of legal action from a mother whose 14-year-old son took his own life after reportedly becoming obsessed with one of its AI characters. According to transcripts of their chats in court filings, he discussed ending his life with the chatbot. In a final conversation he told the chatbot he was "coming home" - and it allegedly encouraged him to do so "as soon as possible".
Character.ai has denied the suit's allegations.

And in 2023, the National Eating Disorder Association replaced its live helpline with a chatbot, but later had to suspend it over claims the bot was recommending calorie restriction.

In April 2024 alone, nearly 426,000 mental health referrals were made in England - a rise of 40% in five years. An estimated one million people are also waiting to access mental health services, and private therapy can be prohibitively expensive (costs vary greatly, but the British Association for Counselling and Psychotherapy reports that on average people spend £40 to £50 an hour).
At the same time, AI has revolutionised healthcare in many ways, including helping to screen, diagnose and triage patients. There is a huge spectrum of chatbots, and about 30 local NHS services now use one called Wysa.

Experts have raised concerns about chatbots' potential biases and limitations, their lack of safeguarding, and the security of users' information. But some believe that if specialist human help is not easily available, chatbots can help. So with NHS mental health waitlists at record highs, are chatbots a possible solution?
Character.ai and other bots such as ChatGPT are based on "large language models" of artificial intelligence. These are trained on vast amounts of data - whether that's websites, articles, books or blog posts - to predict the next word in a sequence. From here, they generate human-like text and interactions.
The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.
"The guiding light, the bedrock here, needs to be compliance with international law. That's what we keep talking about, is the rules-based order."Marles was also asked about Hegseth's call for Indo-Pacific partners to increase defence spending as a bulwark against the threat of China.
Marles said "we actually are taking steps down this path… we understand it, we're up for it." US President Donald Trump has called on Australia to increase its spending to 3%, but Canberra has yet to publicly commit to that number.Marles added that part of that spending would come under Aukus, a pact among Australia, the UK and the US to build up a fleet of nuclear-powered submarines.