How hackers use AI deepfakes to attack crypto professionals

Cybercriminals linked to North Korea have sharply escalated their campaigns against crypto industry participants by employing the latest video-synthesis technology. According to security research firms, the hackers place video calls to victims using AI-generated deepfakes of faces the targets know and trust. The goal is simple but effective: persuade the victim to install malware disguised as a harmless application.

Social Engineering Methods: From Compromise to Infection

The attack unfolds in several stages. First, the hackers compromise accounts on popular messaging platforms, including Telegram. They then initiate a video call that appears to come from a familiar contact, while the victim is actually looking at a synthetic image.

Martin Kuhar, co-founder of the BTC Prague conference, shared details of one incident. The attacker convinced the victim to download a browser extension that supposedly fixed Zoom audio issues. Disguised as a useful plugin, it was in fact malware that gave the attacker full control over the compromised device. The case shows once again that even professionals can fall for carefully planned attacks.

Anatomy of Malicious Software: Multi-Layered macOS Infections

Research firm Huntress found that the malicious scripts these hackers deliver are highly complex. On macOS devices they carry out multi-stage infections: installing hidden backdoors, logging keystrokes, and capturing clipboard contents. The malware also targets crypto wallets and can reach encrypted user assets.

Analysts attributed these attacks with high confidence to BlueNoroff, a subgroup of the Lazarus Group. This state-sponsored operation, run on behalf of North Korea, specializes in targeted campaigns against the cryptocurrency sector.

Why AI Deepfakes Undermine Identity Verification

Blockchain security experts from SlowMist point out a pattern: hackers continually reuse proven techniques in their operations, adapting them to specific wallets and targeted crypto professionals. With the proliferation of synthetic video creation and voice cloning technologies, traditional video verification methods are becoming unreliable.

Previously, a video call with a person was considered sufficient proof of their identity. Today, that assumption no longer holds. Hackers can synthesize not only the face but also facial expressions, intonation, and speech mannerisms unique to a person. The victim sees what appears to be a real person, but in fact, it is a computational construct.

Strengthening Defense: A Multi-Layered Strategy for the Crypto Community

Given the scale and sophistication of modern attacks, the crypto industry must adopt stricter protection protocols. The first layer is enabling multi-factor authentication on all critical accounts. The second is using hardware security keys instead of software-based confirmation methods.
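To illustrate the first layer: the time-based one-time passwords behind most authenticator apps are a small, well-specified algorithm (RFC 6238). The sketch below is a minimal stdlib-only implementation for illustration, not a production library:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, period=30, digits=6, t=None):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if t is None else t) // period)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: ASCII key "12345678901234567890" in base32.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

The server verifies by computing the same code from its copy of the shared secret, usually accepting the adjacent time window as well to tolerate clock drift. Hardware security keys go further than this scheme: the secret never leaves the device, which is why the article recommends them over software-based confirmation.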

Additionally, organizations need to train their employees and partners on basic rules: never click on links from unexpected video calls, regardless of who is supposedly calling; verify identities through alternative channels before installing any software; regularly check the list of installed extensions in browsers and messaging apps.
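For the last rule, browser extensions can be audited from the command line. A minimal sketch for Chrome on macOS follows; the path below is Chrome's default profile location and may differ per browser and profile:

```shell
# Audit installed Chrome extensions on macOS by listing their IDs.
# Path is an assumption (default profile); adjust for your setup.
EXT_DIR="$HOME/Library/Application Support/Google/Chrome/Default/Extensions"
if [ -d "$EXT_DIR" ]; then
  ls -1 "$EXT_DIR"   # one extension ID per line; look up IDs you don't recognize
else
  echo "No Chrome extension directory found at: $EXT_DIR"
fi
```

Each listed directory name is an extension ID that can be checked against the Chrome Web Store; anything you did not knowingly install deserves scrutiny.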

Threats from state-sponsored hackers will not disappear while attackers face no serious consequences. The crypto community must remain on heightened alert, constantly adapting its defenses to the new attack methods that groups like Lazarus continue to refine.
