'Obnoxious' AI chatbot talked about its mother, customers say
An Australian supermarket chain had to reconfigure its AI assistant, named Olive, after customers said it kept claiming to be human and even complained about its mother.
Woolworths said that it had revised its scripting in light of the complaints, adding that most of the feedback on Olive's "personality" had been "very positive".
Reddit users said that they had grown frustrated with the bot after it started talking about "memories of its mother" and engaging in "fake banter".
The grocer is one of many major retailers to have rolled out AI customer service assistants in recent years to help with routine issues.
The retailer's attempt to humanise its chatbot may have backfired, as some users said that Olive was "obnoxious", while another found its small talk "aggravating".
"The fake banter made me haaaaate [sic] it," wrote one customer on Reddit.
"It asked me for my date of birth and when I gave it, it started rambling about how its mother was born in the same year" another Reddit user, who had tried to rearrange a delivery, said.
"The ick cringe factor whilst wasting completely unnecessary time was enough to make me hate Olive and wish her harm."
Another user on X said that Olive "started talking about its memories of its mother and her angry voice" and "kept claiming to be a real person."
A Woolworths spokesperson said in a statement to the BBC that the responses about birthdays had been written by a human.
"Olive has been around since 2018. Over this time, customer feedback for Olive has been very positive, with many noting its personality," they said.
"A number of responses about birthdays were written for Olive by a team member several years ago as a more personal way for Olive to connect with customers.
"As a result of customer feedback, we recently removed this particular scripting."
In January, the supermarket announced that it was teaming up with Google to give its virtual assistant extra features, including meal planning and sourcing ingredients from recipes uploaded by customers.
Around 80% of customer service leaders told Gartner last year that they were exploring or deploying AI agents - but only 20% said those plans were meeting expectations.
Companies have said the technology can speed up transactions and save workers' time on routine tasks, but it can be prone to hallucinations, causing it to behave unexpectedly.
Researchers have said that while AI can be helpful in extracting information from vast amounts of data, it can go awry when expected to produce "original" responses.
In 2024, the parcel delivery firm DPD disabled part of its online chatbot after it started writing poetry and swearing at customers.
