Beware! Scammers Forge Relatives' Voices Using AI

Emergencies and Crime
BB.LV
Photo: Freepik

Criminals are increasingly using artificial intelligence to fake the voices of relatives and acquaintances; a short audio clip taken from social media is enough to pull off the deception, experts warn.

Security specialists warn that scammers are actively using artificial-intelligence voice-cloning technology to extort money from residents.

To create a convincing copy of a relative's or friend's voice, criminals need only a few seconds of audio. The clip can be pulled from social media or messaging apps, or simply recorded during an ordinary phone conversation. A neural network then generates speech from that fragment, fully imitating the person's timbre, intonation, and manner of speaking.

"Hearing a familiar voice on the line, people let their guard down and without hesitation transfer money to the specified details, realizing the deception only later," explained the expert.

An even more dangerous evolution of the scheme is the deepfake video call. Scammers now not only imitate the voice but also display a realistic image of the loved one: their facial expressions, gestures, and blinking. This "sense of presence" deceives even the most cautious.

"During a regular call, you can still double-check — call back on a known number or message in a messenger. But when you see a 'live' relative in a video call, the brain perceives everything as reality, and the last barriers of caution crumble," noted the specialist.

The expert urged residents never to transfer money at the first request, even if the caller sounds completely familiar. Always confirm through an alternative channel, such as a personal message in a messenger app or a call to a number you know is genuine. In the age of artificial intelligence, trust verified facts, not your own ears and eyes.

BB.LV Editorial