
Fraudsters can use personal voice greetings on answering machines to create deepfakes, experts warn. These recordings let attackers clone a potential victim's voice with neural networks and then use it in attacks on the victim's friends, family and colleagues. Read the Izvestia article for details on how scammers use answering machine recordings to create deepfakes, how dangerous this is, and how to protect yourself from such threats.

Why are voice greetings on answering machines interesting to scammers

A personal voice greeting on an answering machine is just as much a voice sample as, say, a real phone conversation or a voice message from a hijacked account, Evgeny Egorov, a leading analyst in the Digital Risk Protection department at F6, tells Izvestia. Attackers have been using voice deepfakes created with artificial intelligence (AI) for a couple of years now.

"Today there are many tools for creating fake video and audio messages that criminals can use even without special technical knowledge," the expert says.

Voice message
Photo: Global Look Press/Karl-Josef Hildenbrand

At the same time, much depends on the length of the voice greeting or recording, adds Maxim Buzinov, head of the R&D Laboratory at the Solar Group Cybersecurity Technology Center. A three-second clip, for example, is enough to recreate the timbre of a voice in monotone speech, but copying emotions and characteristic mannerisms takes a longer sample.

"Today, a short recording is enough to repeat the voice, but its quality is likely to be not very convincing," agrees Dmitry Anikin, head of data research at Kaspersky Lab. However, if the owner of the answering machine records a longer voice greeting, it can become a very dangerous tool in the hands of fraudsters.

How scammers use voice greeting data

Direct cases of cybercriminals using answering machine voice greetings are quite difficult to document, since cybersecurity experts most often encounter the finished voice imitations, Nikita Novikov, a cybersecurity expert at Angara Security, tells Izvestia.

"However, it is likely that answering machines are one of the most popular ways in which attackers can get a person's real voice for subsequent forgery," says the expert.

Hacker
Photo: IZVESTIA/Sergey Lantyukhov

An answering machine can reveal a person's phrases, keywords, intonation and manner of speech. The attacker records the audio and feeds it into an AI system, which analyzes the recording and then reproduces the voice speaking whatever text the fraudster needs, Nikita Novikov explains.

"An answering machine is an accessible source of information, because fraudsters need no hacking tools to reach it," says Alexandra Shmigirilova, GR director of the information security company Security Code. "Without any effort they obtain two kinds of data at once: a phone number and a voice sample."

Scammers
Photo: IZVESTIA/Alexander Kazakov

Forged audio messages are used in many common schemes, says Evgeny Egorov. Scammers send such messages, for example, to the victim's relatives and colleagues or to group chats, asking to borrow money and misleading recipients into believing the request comes from someone they know. In schemes involving fake acquaintances (FakeDate) or messages from a fake boss (FakeBoss), attackers gain even more opportunities to deceive victims by creating the illusion of talking to a real person.

What are the dangers of fraud schemes with voice deepfakes?

Voice messages that hackers fake with the help of AI are becoming ever more convincing: the intonation sounds more natural, and the recordings contain fewer and fewer defects that give away the fake, says Evgeny Egorov. Deepfake tools themselves are a product of advances in AI that have greatly simplified attackers' work and lowered the barrier to entry for criminals.

"A high—quality audio fake with a variety of content will require a highly qualified and expensive specialist," says Maxim Buzinov. — Therefore, high-quality complex fakes with a variety of content belong to targeted phishing.

Voice message
Photo: Global Look Press/Karl-Josef Hildenbrand

Izvestia's interlocutor compares AI algorithms for generating audio to the technology for printing high-quality counterfeit bills. But while there are tools to quickly and reliably detect counterfeit money, few such options exist for voice fakes. At the same time, voice falsifications are often audible to a careful listener.

At the moment, though, an unprepared person has neither the time nor the skills to analyze a message from a fake boss or relative, Maxim Buzinov notes. Meanwhile, major software developers are building tools to combat voice forgery, and there are already many algorithms for detecting voice artifacts and identifying artificially generated content.

"In most cases they identify certain types of artifacts, and do so at a good level," says the specialist. "However, there is still no tool that analyzes every kind of forgery both quickly and accurately."
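To give a rough sense of what "detecting artifacts" in audio means, here is a toy sketch, not any real vendor's detector: it scores a clip's spectral flatness (how evenly energy is spread across frequencies), a kind of low-level statistic that detection systems may combine with many other learned features. The function names and the threshold are invented for illustration.

```python
# Toy illustration only: real deepfake detectors use trained models,
# not a single hand-picked threshold. All names/thresholds are made up.
import numpy as np

def spectral_flatness(signal: np.ndarray, eps: float = 1e-12) -> float:
    """Geometric mean / arithmetic mean of the power spectrum (0..1).

    Values near 0 mean energy is concentrated in a few spectral lines
    (tone-like, "too clean"); values near 1 mean noise-like spectra.
    """
    power = np.abs(np.fft.rfft(signal)) ** 2 + eps
    return float(np.exp(np.mean(np.log(power))) / np.mean(power))

def looks_artificial(signal: np.ndarray, threshold: float = 0.01) -> bool:
    # Flag clips whose spectrum is implausibly concentrated; a real
    # system would fuse many such features with a classifier.
    return spectral_flatness(signal) < threshold

# Synthetic demo data: a pure 440 Hz tone (stand-in for "too clean"
# audio) versus the same tone buried in broadband noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16000, endpoint=False)
pure_tone = np.sin(2 * np.pi * 440 * t)
noisy_speech = pure_tone + 0.5 * rng.standard_normal(t.size)

print(looks_artificial(pure_tone))     # True
print(looks_artificial(noisy_speech))  # False
```

The point of the sketch is the limitation the expert describes: a single statistic flags only one artifact type, which is why no tool yet covers every kind of forgery quickly and accurately.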

Phone in hand
Photo: Global Look Press/Karl-Josef Hildenbrand

The situation is complicated by the fact that ordinary people apply various distortions in their communications, for example to enhance their voices. Simply singing along to a backing track in a video message, a format that now floods social networks, can trigger a false positive from anti-fake technology. These factors make reliable deepfake identification harder, which plays into fraudsters' hands, Maxim Buzinov emphasizes.

Methods of protection

To protect yourself from fraud schemes that exploit answering machine voice greetings, it is better not to record one yourself but to use the services of telecom operators, which have their own tools, advises Alexandra Shmigirilova. These tools can leave a message asking the caller to call back, but they do not hand over a voice sample, since a robot answers on the user's behalf.

"To deny an attacker the chance to imitate a voice, it is recommended either to give up the answering machine entirely or to use some kind of robotic substitute," agrees Nikita Novikov.

Phone call
Photo: IZVESTIA/Anna Selina

To avoid falling for scammers who send voice messages from acquaintances' numbers or call from them, the expert advises always verifying the sender by calling back the number saved in your phone book. You should also follow the rules of safe communication in messengers and, in general, double-check all incoming information.

Phone in hand
Photo: IZVESTIA/Andrey Erstrem

It is not yet possible to fully protect yourself from voice cloning with neural networks, and such protection is unlikely to appear soon, says Dmitry Burmashov, senior information security engineer at R-Vision. The ability to recognize fraudulent schemes therefore remains essential. If someone claiming to be an official contacts you and you doubt their authenticity, ask whether the required actions can be performed through the official website or app. If they can supposedly only be done over the phone, that is reason to be wary.

"If friends call asking you to send something or transfer money, call them back yourself and ask two questions: one whose answer only they would know, and a second whose premise is obviously wrong for them but would sound plausible to an outsider," the expert concludes.

Translated by the Yandex Translate service
