In 2024, according to VisionLabs (an MTS subsidiary), deepfake attacks numbered in the tens of thousands, and in 2025 that figure may grow tenfold. Moreover, the real scale of deepfake attacks in Russia significantly exceeds the open official data. A victim may receive a call from a "close relative" or from the "head" of a company or department. One of the most unpredictable threats today is the ability to generate a deepfake call in real time. This Izvestia article reviews the current attacks and their features.

Scale of the problem

Information security researchers have recently observed a rise in the use of deepfakes for attacks, including attacks on vulnerable groups such as teenagers, women and pensioners. There are no separate statistics for these categories, only an overall picture of deepfake fraud that covers attacks on bank and telecom operator processes, socially significant fakes, and attacks on specific individuals.

- We could be talking about tens of thousands of cases," says Tatiana Deshkina, a representative of VisionLabs. - At the moment, there is no centralized system for tracking or registering all deepfake attacks in real time.

Creating a falsified image
Photo: Izvestia/Anna Selina

Izvestia's sources in Roskomnadzor confirmed this information.

Deepfakes can appear on a variety of platforms, including social networks, messengers and video hosting services, and often go unnoticed or undetected. Many deepfake incidents occur on private or closed networks, which makes them difficult to detect and record.

Members of vulnerable groups are targeted by such identity attacks more often than others. In one case, an elderly person was deceived with a deepfake of the mayor of Moscow in order to extort money.

VisionLabs estimates that total losses from cyber fraud in 2024 will reach 300 billion rubles. The damage caused directly by deepfakes is large, but isolating its exact share is very difficult.

Money
Photo: IZVESTIA/Sergey Lantyukhov

Meanwhile, according to Ideco, the total damage from fraud attributed to banking transactions made without the client's consent exceeded 15 billion rubles over just nine months of 2024.

- A standard example of a deepfake attack in the business environment is impersonating an executive," says Ideco director Dmitry Khomutov. - In one known case, a faked CEO's voice led to the theft of $243,000 from the accounts of a large British energy company.

With the rapid development of AI in recent months, many programs have appeared that help fraudsters fake a voice or create a counterfeit video portrait, adds Konstantin Ilyinykh, head of the system administration department at the IT company Simpl. In Telegram, for example, a fabricated video message (a "circle") can be sent to deceive users. If a manager or colleague asks you to transfer money, it is better to call the person back or discuss the matter in person.

Telegram
Photo: Global Look Press/IMAGO/Rüdiger Wölk

VisionLabs predicts that the number of attacks in 2025 will increase tenfold.

- The problem is that even sophisticated, high-quality deepfake algorithms can now easily be found in the public domain," complains Tatiana Deshkina. - The barrier to entry has also dropped: there are applications that let people without any programming experience create deepfakes.

VisionLabs warns that video calls are among the most dangerous fraud methods. They inspire trust because they happen here and now, and the trouble is that such a deepfake call can today be generated in real time. Poor image quality, meanwhile, can be blamed on a bad internet connection or a faulty laptop camera. Such video calls can be used both in attacks on banks (say, attackers use a deepfake of a grandmother's face to take out a loan in her name) and to extort money from pensioners.

A phone call
Photo: Izvestia/Mikhail Tereshchenko

A deepfake call generated in real time adapts to the conversation, creating the sense of a genuine dialogue. Swindlers apply various forms of psychological pressure, posing as bailiffs, civil servants or bank employees. Worst of all is when the voice of a relative pushes a person or a company employee into disclosing confidential information. It is precisely such calls that are most dangerous for vulnerable groups.

The elderly, children and employees untrained in the basics of safe online behavior are the most susceptible to deepfake calls. According to Ideco, crime involving information and communication technologies continues to grow overall: over three quarters of 2024 there were 564 thousand such crimes, almost 16% more than in the same period a year earlier.

Main types of deepfake attacks

Several types of deepfake threats can be distinguished; a toy illustration of the first type follows the list.

- Face replacement. One person's face is inserted in place of another's in a video. This type of attack is most often used in video calls or video "circles" in messengers.

- Facial expression transfer. For example, a different voice is dubbed in and the lip movements of the person in the original video are adjusted to match it.

- Creating non-existent faces. Attackers generate images of new, realistic faces that are used for anonymous fraud or misinformation.
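
The geometric core of face replacement can be shown with a deliberately crude sketch. The Python code below is not a deepfake generator: it only detects the largest face in each of two photos with a stock OpenCV Haar cascade and blends one onto the other with Poisson blending. The file names are hypothetical, and real attacks rely on deep generative models that also match pose, lighting and expression, which is what makes them convincing.

```python
# Crude illustration of the "face replacement" idea with classical
# computer vision, NOT a real deepfake pipeline.
import cv2
import numpy as np

# Haar cascades ship with opencv-python; this one finds frontal faces.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def largest_face(img):
    """Return the (x, y, w, h) box of the biggest detected face."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return max(faces, key=lambda f: f[2] * f[3])

src = cv2.imread("source_face.jpg")   # hypothetical: face to insert
dst = cv2.imread("target_photo.jpg")  # hypothetical: photo to alter

sx, sy, sw, sh = largest_face(src)
dx, dy, dw, dh = largest_face(dst)

# Resize the source face to the target box and Poisson-blend it in
# (may fail if the target face sits at the very edge of the frame).
patch = cv2.resize(src[sy:sy + sh, sx:sx + sw], (dw, dh))
mask = 255 * np.ones(patch.shape[:2], dtype=np.uint8)
center = (dx + dw // 2, dy + dh // 2)
result = cv2.seamlessClone(patch, dst, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("result.jpg", result)
```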

Creating a falsified image
Photo: Izvestia/Anna Selina

Why do deepfakes look realistic? In video calls and short messenger videos, scammers replace one person's face with another, creating a believable fake identity. If the attack involves voice, the technology copies the target's manner of speech and vocal timbre so that it sounds like the original. These techniques make deepfakes highly convincing and hard to distinguish from a real person, which raises the risk of deception.
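
As a rough defensive counterpart on the audio side, speaker-embedding models can measure how close two voice recordings are. The sketch below assumes the open-source resemblyzer Python library and hypothetical file names; the threshold is illustrative, and a high-quality voice clone may still pass such a check, so it should never be the only safeguard.

```python
# Defensive voice check: compare a suspicious recording against a known
# genuine sample of the same person using speaker embeddings.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()  # pretrained speaker-embedding model

# Hypothetical files: a known-genuine sample and the call recording.
known = encoder.embed_utterance(preprocess_wav("genuine_sample.wav"))
suspect = encoder.embed_utterance(preprocess_wav("call_recording.wav"))

# Embeddings are L2-normalized, so a dot product is cosine similarity.
similarity = float(np.dot(known, suspect))
print(f"speaker similarity: {similarity:.2f}")

# Illustrative threshold only; tune on your own data before relying on it.
if similarity < 0.75:
    print("voices likely differ - treat the call as suspicious")
```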

- A serious problem, especially in the West (in Russia this practice is still less widespread), is the use of deepfakes with pornographic content or montages," says Tatiana Deshkina of VisionLabs. - Such materials can be used to blackmail and harass women.

Fraudsters
Photo: Izvestia/Alexander Kazakov

Unfortunately, voice-altering services can be downloaded from developers' websites: they are practically freely available and often require no special skills to install.

How to protect yourself from video impersonations

To counter fraudsters, deepfake detectors are used, and they are constantly being improved, which is critical given the steadily rising quality of fakes. Such detectors can catch all the major types of video attack: face replacement, facial expression transfer and the creation of non-existent faces.
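
In outline, many such detectors follow the same pipeline: sample frames from a video, score each frame with a classifier trained to separate real from fake, and aggregate the scores. Below is a minimal Python sketch of that pipeline; the ResNet-18 backbone and the detector.pt checkpoint are assumptions for illustration, not the internals of any particular commercial product.

```python
# Minimal frame-level deepfake detection pipeline: sample frames,
# score each with a binary classifier, average the scores.
import cv2
import torch
import torch.nn.functional as F
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# ResNet-18 with a 2-class head; weights assumed to come from
# fine-tuning on a labeled real/fake dataset (hypothetical file).
model = models.resnet18(num_classes=2)
model.load_state_dict(torch.load("detector.pt", map_location=device))
model.to(device).eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def fake_probability(video_path: str, every_nth: int = 10) -> float:
    """Average the per-frame 'fake' probability over sampled frames."""
    cap = cv2.VideoCapture(video_path)
    scores, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % every_nth == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            x = preprocess(rgb).unsqueeze(0).to(device)
            with torch.no_grad():
                probs = F.softmax(model(x), dim=1)
            scores.append(probs[0, 1].item())  # class 1 = "fake"
        i += 1
    cap.release()
    return sum(scores) / len(scores) if scores else 0.0

print(f"estimated fake probability: {fake_probability('call.mp4'):.2f}")
```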

And while banks and telecom operators, as a rule, have the technical and organizational resources to protect themselves against deepfakes - for example, they can restrict access to critical operations via voice assistants - the situation with socially significant fakes is far more alarming, says Dmitry Sokolov, an expert at the ARPP "Domestic Soft" association and head of the information security service at MoyOffice.

Depression
Photo: Izvestia/Andrei Ershtrem

Of particular concern are deepfake calls, which have become a new and extremely dangerous tool for fraudsters. This threat can befall anyone, but especially those unaware of what the technology can do. For them, a call from, say, a "loved one" supposedly in trouble carries a real risk of losing money, having loans taken out in their name, or even losing their home.

The danger of such calls lies in their plausibility: today's technology can imitate not only the voice but also its intonation, which makes the fraud almost impossible to detect.

- Let's imagine a hypothetical attack in which a fraudster uses a video call with a pre-generated video avatar of a loved one speaking in his voice," Sokolov says. - This is exactly what deepfake technologies are used for: they can animate a person from a single photo and synthesize speech from voice samples.

A phone in the hands of a pensioner
Photo: Global Look Press/Judith Thomandl

One of the simplest yet most common scenarios is sending a generated image of a supposed loved one at a police station or in a hospital. Seeing it, an unprepared person can easily lower their guard and hand confidential information to a fraudster. Experts therefore urge people to be prepared for such scenarios as well.

Specialists point out that, as with any kind of fraud, the first priority is to remain vigilant and carefully check any unusual request. If there is even the slightest doubt, it is always worth asking for information that only the real caller would know.

- It is worth being especially vigilant about deepfakes generated in real time, which are the most dangerous and unpredictable in their consequences," says Olga Popova, a leading lawyer at Staffcop (Atom Security, part of the SKB Kontur group). - Hearing the voice of a loved one supposedly asking for help, almost any of us loses the ability to think critically and may immediately start following the attackers' instructions. Such calls create psychological pressure and nervous tension, which can push people into ill-considered actions and lead to emotional exhaustion. The best quick and simple self-check is to call the person back and verify everything.
