
False challenge: every second Russian will face a deepfake attack by the end of the year

Every second Russian will face a deepfake attack by the end of the year, artificial intelligence experts told Izvestia. Today these attacks mostly take the form of voice messages and video clips, but in 2026 experts predict a rise in deepfake calls, in which an attacker speaks to a person in the voice of someone they know. Creating such convincing fakes used to require a lot of time and serious skills, but with the development of artificial intelligence this technology now risks becoming widespread.
How scammers deceive Russians
Deepfake technologies are developing at a rapid pace, and it is becoming increasingly difficult for users to tell the "mask" from reality. Just a couple of years ago only about 10% of internet users encountered such attacks, but by the end of this year every second Russian may become a victim of a deepfake attack, the MTS AI press service told Izvestia.
While fake videos and voice messages in instant messengers became widespread in 2024-2025, within a year fraudsters will be able to simulate a real-time conversation with a "daughter in trouble," a "friend asking for money," or a "colleague urgently demanding a transfer of funds," the company noted. Moreover, unlike early fakes, modern deepfakes are increasingly hard to recognize even for an experienced user.
— Round video messages ("circles") and voice messages are among the most popular ways to communicate in messengers, so after phone calls they pose the greatest potential threat for misuse by fraudsters. Unlike calls, which have caller IDs and other security features, voice messages and circles offer no way to verify their source, the company said.
It is also impossible to determine whether they were recorded inside the messenger or generated elsewhere and uploaded. Because of this, scammers are actively switching to them, the MTS AI press service emphasized. Fake circles and voice messages were already widely used by attackers in 2024, but this year the technology for creating and modifying videos, as well as for cloning voices, has improved significantly.
— This technology has become widely available, and its quality has increased significantly. Previously, creating convincing fakes required a lot of time and serious skills; now they can be produced much faster and more easily, and in effect anyone can learn to do it. It is already difficult to distinguish a fake from a real photo, video or voice, and in the future it will become almost impossible," the company explained.
Earlier, the REN TV channel reported a case in which fraudsters stole 1.5 million rubles from an elderly Muscovite. The man received a call from the "secretary of the mayor of Moscow," who warned the pensioner that Sergei Sobyanin would soon contact him.
The victim later said that the man on the video call really did look like the mayor of the capital. He convinced the pensioner that public funds had been stolen through his accounts and that cooperation with the FSB was needed to resolve the problem. The man then received an e-mail invitation to a video conference, allegedly from the mayor's office. A "general" also contacted the pensioner and confirmed the words of the "mayor," after which the man transferred the funds.
And this is far from an isolated case. In April 2025, the head of Kemerovo, Dmitry Anisimov, warned citizens about the creation of his digital double.
"The attackers create fake videos with my image and make video calls on my behalf via messengers. Under the guise of personal treatment, they try to mislead. I ask you to remain vigilant," Dmitry Anisimov wrote on his Telegram account.
In addition to this scheme, attackers often strike up acquaintances with women on dating apps. In one case, a woman met an attractive man, and over time their communication moved to Telegram, where he sent her voice messages and made video calls, using a face-changing "mask" during the calls.
— The woman, for reasons we will not name out of ethical considerations, transferred several million rubles to him, after which the man disappeared. He was from an African country but presented himself as a white American," MTS AI said, noting that it knows of at least five such cases.
How deepfakes are created
Technologies for creating deepfakes are neutral in themselves; what matters is the purpose people use them for. Access to such tools is gradually becoming easier. However, despite the existence of various paid and free solutions for creating voice or video fakes, it is still difficult for attackers to produce a truly plausible fake, says Dmitry Anikin, head of data research at Kaspersky Lab.
— In the case of video fraud, attackers need to collect a large set of images of the potential victim, for example by splitting a video into frames: the more images, the more realistic the result. This is because the model must "see" the person's face from different angles and under different lighting conditions, the expert said.
According to him, voice forgeries require less data, but the result may be unstable, which means the attacker will have to make more attempts at generation.
Nevertheless, scammers use various tricks to hide the flaws of deepfakes, the expert said. In particular, fake round video messages ("circles") in messengers are one such technique: they are usually small and often blurry, which helps hide artifacts.
— In addition, people are used to this format of communication and consider it safer, so they ask fewer questions. Sometimes such videos do not even have sound; the main task of the attackers in this case is to attract the victim's attention and lower their guard," Dmitry Anikin explained.
Tatiana Deshkina, head of the VisionLabs product portfolio, added that criminals use several types of applications and software. The first type overlays any face onto a pre-recorded video in which the scammer asks for an urgent money transfer; the second type replaces the audio track.
— For example, a relative's video is stolen, their voice is cloned and a new script is overlaid: "Masha, I've been in an accident, I urgently need 30 thousand rubles, otherwise they'll put me in jail," she said.
Organized groups of scammers may have all these tools combined in a single application that they have developed themselves. They also most likely have their own teams for further training the neural networks, the expert concluded.
Today it is impossible to estimate accurately what proportion of voice messages is fraudulent or how many people have already been affected, since users usually report fraud only in the hope of getting their money back, the MTS AI press service concluded.