- AI chatbots mimic elderly voices to thwart scammers
- Developed by Prof Dali Kaafar at Macquarie University
- Bots engage scammers, gather data, improve over time
A scammer calls and asks for a passcode. Malcolm, an older man with an English accent, appears perplexed.
“What’s this business you’re talking about?” Malcolm asks.
Another day, another fraudulent phone call.
This time, Ibrahim, a helpful and polite man with an Egyptian accent, answers. “Frankly, I am not too sure I can recall buying anything recently,” he informs the aspiring con artist. “Maybe one of the kids did,” Ibrahim continues, “but that’s not your fault, is it?”
The scammers are real, but Malcolm and Ibrahim are not. They are two of the talking artificial intelligence bots produced by Prof Dali Kaafar and his team. Kaafar developed Apate, named after the Greek goddess of deception, based on his research at Macquarie University.
Apate’s goal is to fight worldwide phone fraud using conversational AI, leveraging existing systems that allow telecommunications firms to redirect calls they suspect are from scammers.
Kaafar was motivated to take on telephone fraudsters after he pulled a “dad joke” on a scam caller in front of his two children while they were picnicking in the sun. He kept the scammer on the line with meaningless conversation. “The kids had a good laugh,” he says. “And I thought the real goal was to deceive the scammer and waste their time so they wouldn’t talk to others.
“Scamming the scammers, if you like.”
The next day, he summoned his colleagues from the university’s Cyber Security Hub. He reasoned that there must be a better approach than his “dad joke” method. And there had to be something more intelligent than the popular existing piece of technology, the Lennybot.
Lenny existed before Malcolm and Ibrahim.
Lenny is an elderly Australian man who enjoys a long conversation. He’s a chatbot who was created to prank telemarketers.
Lenny repeats several phrases on a loop, his voice thready and tinged with a slight whistle. Each phrase begins after 1.5 seconds of silence, mimicking the pace of a conversation.
The anonymous designer of Lenny shared on Reddit that they designed the chatbot to be a “telemarketer’s worst nightmare … a lonely old man who is up for a chat, proud of his family, and can’t focus on the telemarketer’s goal”. The act of tying up scammers is known as scam baiting.
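The mechanic is simple enough to sketch. The snippet below is purely illustrative, not Lenny’s actual code: the `call` object and its methods are hypothetical stand-ins for whatever telephony system drives the bot, and the phrase filenames are invented. It shows only the idea described above, prerecorded phrases played in a fixed loop, each cued by roughly 1.5 seconds of caller silence.

```python
import time

# Illustrative sketch of the Lenny mechanic: prerecorded phrases played in a
# fixed loop, each one triggered after ~1.5 seconds of caller silence.
# The `call` object and its methods are hypothetical placeholders.

PHRASES = ["hello_this_is_lenny.wav", "sorry_say_that_again.wav", "my_daughter_handles_that.wav"]
SILENCE_TRIGGER = 1.5  # seconds of quiet before the next phrase starts


def run_lenny(call):
    """Loop the recordings, speaking only after the caller pauses."""
    i = 0
    while call.is_active():
        quiet_since = None
        # Wait until the caller has been silent for SILENCE_TRIGGER seconds.
        while call.is_active():
            if call.caller_is_silent():
                quiet_since = quiet_since or time.monotonic()
                if time.monotonic() - quiet_since >= SILENCE_TRIGGER:
                    break
            else:
                quiet_since = None  # the caller spoke again; reset the timer
            time.sleep(0.1)
        call.play(PHRASES[i % len(PHRASES)])  # wrap around so the loop never ends
        i += 1
```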
Apate bots to the rescue!
Since December 2020, Australia’s telecommunications operators have stopped nearly 2 billion fraudulent phone calls.
There may now be hundreds of thousands of “victim chatbots”, thanks partly to $720,000 in funding from the Office of National Intelligence. Bots of varying “ages” speak English with different accents. They exhibit a variety of emotions, personalities, and behaviours. They can be naive, sceptical, or rude.
If a telecoms firm identifies a suspected scam call and routes it to a system like Apate, the bots keep the scammer occupied. They experiment with various tactics to see what works best to keep scammers on the line longer. The machines improve their patterns through success and failure.
As they do so, they gather intelligence and detect new frauds, collecting data on how long the call lasts, when the scammers are most likely to call, what information they seek, and their strategies.
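Apate’s internals are not public, but the trial-and-error behaviour described here resembles a simple bandit-style loop: try a tactic, measure how long the scammer stays on the line, and favour whatever works best. The sketch below is a toy illustration under that assumption; the tactic names, functions, and data structures are invented for the example and are not the team’s implementation.

```python
import random
from collections import defaultdict

# Toy epsilon-greedy loop illustrating "experiment, measure call length,
# favour what keeps scammers talking". Purely illustrative; not Apate's code.

TACTICS = ["naive", "sceptical", "rude", "hard_of_hearing"]
EPSILON = 0.2  # fraction of calls that try a random tactic instead of the current best

total_seconds = defaultdict(float)  # time scammers stayed on the line, per tactic
call_counts = defaultdict(int)      # number of calls handled with each tactic


def choose_tactic():
    """Pick a persona/tactic for the next diverted call."""
    tried = [t for t in TACTICS if call_counts[t]]
    if not tried or random.random() < EPSILON:
        return random.choice(TACTICS)  # explore something new
    # Exploit the tactic with the best average call duration so far.
    return max(tried, key=lambda t: total_seconds[t] / call_counts[t])


def record_call(tactic, duration_seconds, notes):
    """Log the outcome so future calls lean towards what kept this scammer talking."""
    total_seconds[tactic] += duration_seconds
    call_counts[tactic] += 1
    # In a real system, `notes` (what information was sought, time of day,
    # the script used) would feed the intelligence-gathering side described above.
```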
Kaafar hopes Apate will undermine the scam-calling business model, which is frequently run by massive, multibillion-dollar criminal organizations, and that the gathered intelligence can be used to anticipate and respond to scams in near real time.
“We’re talking about real criminals making our lives miserable,” Kaafar explains. “We’re talking about the risks to real people.”
Real people may lose their life savings, be crippled by debt, or be psychologically harmed by shame.
According to Richard Buckland, a cybercrime lecturer at the University of New South Wales, technology such as Apate differs from other methods of scambaiting, which can be amateur or amount to vigilantism.
“Normally, scambaiting is problematic,” he explains. “But this is clever.”
He warns that mistakes can be made when people take matters into their own hands.
“You can attack the wrong person.” He said that many scams are perpetrated by people held in conditions of servitude, close to slavery, “and they’re not the bad guys”.
“[And] some scambaiters are inclined to go a step further and break the law themselves. To hack back or interfere with them. That’s problematic.”
However, he claims that the Apate model looks to be employing AI for good, serving as a “honeypot” to attract and learn from criminals.
Buckland advises that telecom firms need a high degree of confidence that the calls they divert to AI bots really are from scammers, because misidentification can happen anywhere. He also warns that criminal organizations can use the same AI technology to train their own systems.
He claims that the technology used to trick the trickers may also be used to fool people.
Scamwatch is managed by the National Anti-Scam Centre (NASC) on behalf of the Australian Competition and Consumer Commission. According to an ACCC spokesperson, scammers typically impersonate well-known organizations and spoof legitimate phone numbers.
According to the spokesperson, criminals create a sense of urgency to compel their intended victims to act quickly. They frequently attempt to persuade victims to reveal personal or bank account information or to allow remote access to their computers.
Criminals may already have information about their intended victims, such as their name or address, that they illegally obtained or purchased through a data breach, phishing, or other scam.
This week, Scamwatch issued a warning about a meta-scam.
Scammers purporting to be from the NASC called innocent people and told them they were being investigated for involvement in fraud.
According to the NASC, individuals should hang up on scammers immediately and “not attempt to engage with criminals”. The spokesperson stated that the organization was aware of “technology initiatives to productise scambaiting using AI voice personas” such as Apate and would be interested in examining any evaluation of the platform.
Meanwhile, a vibrant scambaiting community is online, and Lenny is still regarded as a cult hero.
In one notable recording, Lenny asks the caller to wait a minute as ducks begin to quack in the background. “Sorry about that,” Lenny apologizes. “What were you saying again?”
“Are you next to your computer?” the caller inquires, impatient. “Do you own a computer?” “Can you get next to the computer now?”
Lenny carries on until the scammer loses his temper: “You shut up. You need to stop talking.”
“Could you just hang on?” Lenny inquires as the ducks resume their quacking.