
Every second Russian citizen will face a deepfake attack before the end of the year.


Every second Russian will face a deepfake attack before the end of the year, artificial intelligence experts told Izvestia.

For now, these attacks mostly take the form of voice messages and video clips, but in 2026 experts predict a rise in deepfake calls, in which an attacker speaks to the victim in the voice of someone they know.

"Just a couple of years ago, only 10% of netizens faced such attacks, but by the end of this year, every second Russian may become a victim of deepfake attacks," the MTS AI press service told Izvestia.

While fake videos and voice messages in instant messengers became widespread in 2024-2025, within a year fraudsters will be able to simulate a real-time conversation with a "daughter in trouble," a "friend asking for money," or a "colleague urgently demanding a transfer of funds," the company noted. Moreover, unlike early fakes, modern deepfakes are increasingly difficult to recognize even for an experienced user.

"Video mugs and voice messages in instant messengers are one of the most popular ways of communication, so they pose the potential greatest threat for fraudsters to use after calls. At the same time, unlike calls that have caller IDs and other security features, voice messages and circles do not have the ability to verify their source," the company said.

It is also impossible to determine whether they were created inside the messenger or generated elsewhere and uploaded. Because of this, scammers are actively switching to them, the MTS AI press service emphasized. Fake video circles and voice messages were already actively used by attackers in 2024. However, this year the technology for creating and modifying videos, as well as for cloning voices, has improved significantly.

"This technology has become massively available, and its quality has increased significantly. Previously, creating convincing fakes required a lot of time and serious skills, but now they can be produced much faster and easier — in fact, anyone can learn how to do this. In addition, it is already difficult to distinguish fake from real photos, videos or voices, and in the future it will become almost impossible," they explained.

Earlier, the REN TV channel reported a case in which fraudsters stole 1.5 million rubles from an elderly Muscovite. The man received a call from the "secretary of the mayor of Moscow," who warned the pensioner that Sergei Sobyanin would soon contact him.

The victim later said that the man on the video call really did look like the mayor of the capital. He convinced the pensioner that public funds had been stolen through his accounts and that cooperation with the FSB was needed to resolve the problem. Later, the man received an e-mail invitation to a video conference, allegedly from the mayor's office. A "general" also contacted the pensioner, confirming the "mayor's" words, after which the man transferred the funds.

Read more in the exclusive Izvestia article:

False challenge: every second Russian will face a deepfake attack by the end of the year

Translated by the Yandex Translate service
