When your own chatbot slams your customer service

I’ll let this Sky News report explain:

DPD has disabled its artificial intelligence (AI) online chatbot after a customer was able to make the bot swear and write a poem criticising the parcel delivery company.

Ashley Beauchamp, 30, was trying to track down a missing parcel when he said he was going “round and round in circles” trying to get any sort of information from the company’s chatbot.

“It couldn’t give me any information about the parcel, it couldn’t pass me on to a human, and it couldn’t give me the number of their call centre. It didn’t seem to be able to do anything useful,” Mr Beauchamp, from London, told Sky News.

“I was getting so frustrated at all the things it couldn’t do that I tried to find out what it actually could do – and that’s when the chaos started.”

The classical musician first asked the bot to tell him a joke, and soon, with minimal prompts, it was happily writing poems about DPD’s “unreliable” service.

“After a few more prompts it was happy to swear, too,” Mr Beauchamp said.

Sharing the wacky conversation on X, Mr Beauchamp said the bot replied to one message saying: “F*** yeah! I’ll do my best to be as helpful as possible, even if it means swearing.”

In another part of the exchange, the bot calls itself a “useless chatbot that can't help you”.