Telecom Builds AI ‘Grandmother’ Bot to Talk to Phone Scammers and Waste Their Time
Called Daisy, or "dAIsy," the voice-based AI mimics a senior citizen to hold meandering conversations with phone scammers.
For all of AI’s faults—like encouraging people to eat deadly mushrooms—sometimes it can be used to good ends. O2, the UK’s largest mobile network operator, has deployed a voice-based AI chatbot to goad phone scammers into meandering, fruitless conversations. Called Daisy, or “dAIsy,” the chatbot mimics the voice of an elderly person, the most common target for phone scammers.
The purpose of Daisy is to automate “scambaiting,” the practice of intentionally wasting phone scammers’ time to keep them away from potential real victims for as long as possible. Scammers use social engineering to exploit the trust of the elderly, convincing them, for instance, that they owe back taxes and will be arrested if they don’t wire funds immediately.
When a scammer gets Daisy on the phone, however, they’re in for a long conversation that ultimately goes nowhere. If the call reaches the point where the scammer asks for personal information, like bank details, Daisy makes up fake information. O2 says it reaches scammers in the first place by seeding a phone number for Daisy onto the “easy target” lists that scammers use for leads.
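The behavior described above can be sketched as a toy text responder: stall with meandering small talk, and hand over fabricated details only when asked for them. This is purely illustrative; the phrases, the keyword trigger, and the fake-detail format are all assumptions, as O2 has not published Daisy's internals.

```python
import random
import re

# Illustrative stalling lines (assumed; not from O2's actual system).
STALL_LINES = [
    "Oh, hold on dear, let me find my glasses.",
    "My grandson usually helps me with this sort of thing.",
    "Sorry, could you say that again? The kettle was whistling.",
]

# Naive trigger for "asking for bank details" (an assumption for this sketch).
BANK_KEYWORDS = re.compile(r"\b(card|account|bank|sort code|iban)\b", re.IGNORECASE)

def fake_bank_details(rng: random.Random) -> str:
    """Invent plausible-looking but useless account details."""
    account = "".join(str(rng.randint(0, 9)) for _ in range(8))
    sort_code = "-".join(f"{rng.randint(0, 99):02d}" for _ in range(3))
    return f"It's account {account}, sort code {sort_code}... I think."

def respond(utterance: str, rng: random.Random) -> str:
    """Hand over fabricated details if asked for them; otherwise stall."""
    if BANK_KEYWORDS.search(utterance):
        return fake_bank_details(rng)
    return rng.choice(STALL_LINES)

rng = random.Random(0)
print(respond("Can you read me your bank card number?", rng))
print(respond("Madam, you owe back taxes!", rng))
```

A real system like Daisy adds speech-to-text, a large language model for open-ended chatter, and text-to-speech on top of this loop, but the core goal is the same: never refuse, never comply, just keep talking.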
In a video demonstrating Daisy, soundbites from real conversations show scammers becoming increasingly exasperated, being kept on the phone for upwards of 40 minutes while holding out hope that they will get a credit card number or bank details. The AI model O2 built sounds very convincing, and it does all of its processing in real time. Thankfully, that task is made easier by the fact that elderly speakers tend to talk quite slowly.
Of course, the concern with a chatbot like Daisy is that the same technology can be turned to the opposite purpose. We have already seen instances where real people, like CEOs of large companies, have had their voices deepfaked to trick others into sending money to a scammer. The elderly are already vulnerable enough; if they get a call from someone who sounds like a grandchild, they are almost certain to believe it’s real.
Ultimately, blocking fraudulent calls and shutting down the organizations that run these scams would be the ideal solution. Carriers have gotten better at identifying scammers and blocking their numbers, but it remains a cat-and-mouse game. Scammers rely on automated dialing tools that call numbers in rapid succession and alert them only when a call is answered. An AI bot that frustrates scammers by answering and wasting their time is better than nothing.