Artificial Intelligence Voice Fraud Gains Momentum
A McAfee study has found that artificial intelligence technology is increasingly being used to clone people’s voices and deceive their loved ones.

McAfee researchers surveyed just over 7,000 people from various countries, about 25% of whom said they had already encountered voice fraud. Of those, 77% had lost money to the scam.

In addition, the researchers analyzed the availability and effectiveness of AI voice cloning tools. They found more than a dozen free programs on the Internet that require no experience or specialized knowledge to use. In one of the tools tested, an audio sample of just three seconds was enough to clone a voice with 85% accuracy.

A person’s voice is a unique identifier. On a phone call, trust is established instantly when familiar interlocutors hear and recognize each other’s voices. With the rise of AI, however, this fundamental basis of trust has become much easier to manipulate.

Fraudsters use AI to clone a voice and then send fake voice messages or call the victim’s contacts in real time, pretending to need help. Notably, 70% of respondents to the McAfee study aren’t sure they could tell a cloned voice from a real one, especially if the scammer is impersonating one of the victim’s close acquaintances.

Forty-five percent of those surveyed said they would respond to a message from a friend or relative in need of money, especially if they believed it came from their significant other (40%), a parent (31%), or a child (20%). The pretexts respondents considered most plausible were an accident (48%), a robbery (47%), a lost phone or wallet (43%), and problems while abroad (41%).

Either way, the cost of a mistake can be high: more than a third of voice fraud victims have lost over $1,000, and 7% have lost between $5,000 and $15,000.

The survey also showed that, because of the proliferation of deepfakes and other kinds of misinformation, people have become more cautious about what they see online. Thirty-two percent of those surveyed said they now trust social media much less.

“Artificial intelligence offers incredible opportunities, but any technology has the potential for abuse in the wrong hands. That’s what we’re seeing today: the availability and ease of use of AI is helping cybercriminals scale in increasingly compelling ways,” said Steve Grobman, McAfee CTO.

Grobman also offered some simple tips for protecting against voice fraud that will be useful to everyone, since it is only a matter of time before this type of scam spreads everywhere.

Set up a “codeword.” Agree with your relatives or friends to always ask for a code word, known only to you, whenever a call or message requests help or a sudden money transfer.
Always check the source. Be especially careful if the call or message comes from an unknown contact. But even if the number is familiar, stop for a second and think. Is it really the voice of someone you know? Does it sound plausible, without distortion? Would this person have contacted you for help at all? If in the slightest doubt, hang up and call the person back directly on their own number. If that isn’t possible, then before sending money, ask about something only your real acquaintance would know.
Think twice before sharing information about yourself online. Who makes up your social networking circle? Do you really trust these people? The wider your circle and the more information you share, the more you put yourself and your loved ones at risk. Fraudsters are just as likely to clone your own voice to call your relatives as they are to mine your social networks for details they can turn against you, delivered in a voice you know.

Source: securitylab.ru