Bots of Love

AI experts warn new chatbot could charm - and scam you


by Milica Anic

Artificial intelligence experts say chatbots are becoming more sophisticated, making it easier to scam victims out of money and steal personal information.

ChatGPT, released three months ago, is such a convincing chatbot that it can be mistaken for a real person.

Designed to simulate conversations with humans, chatbots have been used in romance scams to persuade people to send money or to steal their personal information.

Online protection company McAfee released a research report last month showing that two-thirds of the 5,000 people surveyed could not tell whether a love letter was written by ChatGPT.

An informal poll conducted by the Voice showed that 24 of 32 Langara students feared being scammed by chatbots on dating apps.

Experts warn about love scams

UBC computer science professor Jeff Clune previously worked as a research team leader for OpenAI, the company that developed ChatGPT. He said it is becoming increasingly difficult to tell the difference between computer-generated and human-generated text.

Clune said these scams begin with a fake romantic relationship and end with the theft of people’s money and personal information. With AI, they can be run on a much larger scale because the cost is lower, he said.

“If you gain trust, as any scammer knows, eventually, you can take advantage,” Clune said.

Tools to identify chatbot-generated text “will never be good, and almost certainly will never be perfect,” he said.

AI specialist Jesse Hoey, a computer science professor at the University of Waterloo, said today’s advanced chatbots can gather information on dating techniques and behaviours and use that information to scam more victims.

He said some of these techniques can be misogynistic and are designed to manipulate people, especially the most vulnerable.

“People who are looking for a romantic encounter might be willing to just believe what they read,” Hoey said. 

Justin Yao, deputy chief information officer at Langara College, said in an email to the Voice that scammers who set up malicious chatbots have the capacity to install harmful software on a person’s computer.

He warned people not to reveal too much about themselves and to be careful about the information they disclose. He also suggested they try to find out more about the person they are interacting with and do a reverse image search on any photo provided. Yao also advised people to take their time, as scammers usually try to “rush you into things.”

“Be suspicious and curious,” he said. “If something is too good to be true, it usually is.”

Fears of being catfished

First-year Langara accounting student Deborah Nwankwo said she was duped on Tinder and understands the dangers of deception.

She was matched on Tinder with a person she thought was a “tall Italian.” Nwankwo became alarmed when his pictures suddenly changed and realized she was being catfished by someone faking an identity.

Nwankwo said if she could be deceived by a person, she could not rule out being tricked by a chatbot. 

“I don’t think I’ll ever know,” she said.
