
The number of fraud cases involving voice messages will double by the end of the year compared with 2024, according to Izvestia's consensus forecast. Experts already see this trend developing: fraudsters, for example, actively harvest speech recordings of company top managers and use them to try to deceive employees. Analysts stressed that one in ten Russians has already encountered such attacks. Izvestia looks at which measures can help counter the attackers.

Russians are being deceived en masse with fake voice messages.

By the end of 2025, the number of cases in which Russians are deceived with fabricated voice messages will grow 2.2-fold, according to Izvestia's consensus forecast. The rapid development of speech and video synthesis technologies is fueling this: today, the voice of a public figure or a company executive can be imitated in just a few minutes.

Daniil Borislavsky, an information security expert at Kontur.Egida, agrees with this assessment. According to him, cases of such deception are on the rise, especially before the New Year, owing to emotional involvement, reduced attention, and active use of digital services. At the same time, the share of such attacks may double by the end of the year, from 3% of all types of fraud in 2024 to 5-6% by the end of this year, said Sofya Lukinova, head of the legal department at VMT Consult.

Deepfake
Photo: IZVESTIA/Anna Selina

In the first half of 2025, there were significantly more incidents involving deepfakes than in the same period last year; the growth is estimated at roughly 100-150%, said Yaroslav Kabakov, Director of Strategy at Finam IC. There is evidence that more cases were recorded in the first quarter of 2025 than in all of 2024, he added. Given these dynamics, the number of attacks using deepfakes may at least double by the end of the year, the expert concluded.

A bleaker forecast was given by Alexey Kozlov, a leading analyst in the Spikatel information security monitoring department. By the end of 2025, he said, the total number of deepfakes in Russia may quadruple compared with 2024, and by the end of 2026 it will grow at least twofold more. The reasons are a lower barrier to entry for the technology, the growing availability of tools, and the high activity of cybercriminals, he summed up.

How scammers use deepfakes today

There are several times more fraudulent schemes using pre-recorded voice deepfakes than in 2024. This assessment was given by Sergey Golovanov, chief expert of Kaspersky Lab, in an interview with Izvestia. Attackers can use any speech recordings they can find on the Internet for this purpose.

— Scammers imitate the voice messages of public figures, such as company directors, anyone whose recordings have been published on the Internet at least once. Moreover, they are constantly changing their cover stories, said Sergey Golovanov.

Businessman
Photo: Global Look Press/Oskar Eyb

If the victim does not respond to a message from the fake director, the scammers can send a voicemail on behalf of his deputy. In such audio, the "head" warns about a call from a "regulator" or "law enforcement officers", who then urgently demand to transfer money, Kaspersky Lab noted. In this case, the victim is further pressured by the authority of the boss, said Vladimir Ulyanov, head of the Zecurion analytical center.

The "fake boss" scheme was 30% more common in the third quarter than in the second, said Alexey Luzhnov, head of Bi.Zone AntiFraud. He warned that scammers collect enough data in advance to make the content as convincing as possible.

Such attacks often use multiple steps, said Ivan Goryachev, manager of the Servicepipe educational project. For companies, voice deepfakes are becoming the first stage for hacking internal systems. When attacking citizens, attackers try different formats, from audio messages in messengers to imitation of real calls, he added.

Earphone
Photo: IZVESTIA/Anna Selina

According to various estimates, every tenth Russian citizen has already encountered deepfake voice technology, Ivan Goryachev said. Creating such fakes has become cheaper, and the quality of synthesized voices has noticeably improved; sometimes an inexperienced listener cannot recognize the substitution at all. Scammers actively exploit these opportunities.

It is impossible to estimate the exact volume of such attacks, because they are not officially recorded — companies prefer not to make such cases public, said Alexander Bleznekov, head of the information security department of the Telecom Exchange. However, the growth dynamics can be traced by indirect signs: the number of calls to security services, the activity of discussions on professional platforms and available reports — these indicators are growing.

Russian Ministry of Internal Affairs (MVD)
Photo: IZVESTIA/Sergey Lantyukhov

Izvestia asked the Ministry of Internal Affairs for information on how the agency is countering fraud using deepfakes.

How to identify a fake voice message

Many people have already encountered fake voice messages, confirmed Daniil Borislavsky, an information security expert at Kontur.Egida. According to him, a colleague's messenger account was hacked, and fake videos and audio were created from his personal correspondence.

— A video arrived, an ordinary "circle" with footage of the kind he often sent, with a synthesized voice added to it. The text went: "Yes, it's me, really, I need help." But the voice and video were assembled from existing materials, so everything looked as natural as possible, Daniil Borislavsky explained.

Telephone
Photo: IZVESTIA/Andrey Erstrem

Denis Kalemberg, CEO of SafeTech Group, told another story. The potential victim was a 48-year-old woman who received a "circle" video in a messenger from an old acquaintance who works in law enforcement.

— He greeted her in his characteristic manner, "Hello, Nadiukha," asked about her son's health, and then said that she had been flagged for transfers used to finance terrorism and that a colleague of his would call her soon. The woman believed him unconditionally and spent 32 hours, with a break for sleep, on the subsequent calls "from the authorities." During that time she managed to withdraw money from her deposit, take out a loan, and sell her car, handing everything over to them. In the end, however, the scheme fell through, Denis Kalemberg recounted.

Nikita Lyakhovetsky from the press service of the "For the Rights of Borrowers" project told Izvestia that an acquaintance of his was sent fake voice messages on behalf of executives of her former company. Similar incidents have happened to her former colleagues.

Lipetsk Mayor Roman Chentsov reported in March that a fake social media account had been created in his name. He personally warned that a video featuring him was a fake created using deepfake technology.

SIM card
Photo: IZVESTIA/Yulia Mayorova

The best protection against fraudsters using voice fakes and deepfakes is not to rely on messages and calls alone, but always to verify information through a second channel: a corporate chat, an official phone number, or a personal message, said Natalia Arkhipova, a volunteer expert at the Association for the Development of Financial Literacy (ARFG). One should not be afraid of disturbing management; it is better to prevent the deception, added Alla Khrapunova, deputy head of the "For the Rights of Borrowers" project and curator of the Moshelovka platform.

The authorities have already adopted two packages of anti-fraud measures. The second one provides for criminal liability for the use of artificial intelligence (AI) in fraud, extortion, theft and other crimes. In addition, it is planned to introduce a ban on calls from foreign SIM cards. Telecom operators will be required to block numbers from the register of suspicious ones, and in case of violation, compensate the victims.

Taibat Agasieva, Maria Kolobova, Yana Sturma, Valeria Mishina worked on the material.

Translated by the Yandex Translate service.
