‘This is fake’ — How North Korea uses AI and deepfakes as a weapon

RFA Perspectives — Deepfake and AI videos can be created with tools anyone can download. North Korean hackers are already using those same tools as a weapon.

Video: ‘This is fake’ How North Korea is weaponizing AI and deepfake technology

Recently, South Korea’s cybersecurity firm Genians revealed that a North Korean hacking group used AI-generated deepfake military IDs to impersonate defense agencies and launch phishing attacks.

Their targets? Officials, journalists, human-rights activists, and researchers.

This isn’t new.

North Korean IT workers have long used AI and deepfakes to build fake identities—sometimes even stealing U.S. identities to apply for jobs.

They appear in video interviews with AI-made faces and voices.

Cybersecurity expert Dawid Moczadło, co-founder of Vidoc, shared a video on LinkedIn that experts believe shows these workers in action.

At first glance it looks real, but if you watch closely—something feels off.

If these workers get hired, they don’t just collect a paycheck.

They can plant malware, steal company data, and funnel money back to North Korea’s weapons programs—helping the regime dodge sanctions.

AI can make life easier for everyone.

But in North Korea’s hands, it becomes a weapon—one that threatens your personal data, private companies, and even national security.

For more on North Korean hackers, watch RFA Korean’s three-part series “Whack a Mole”:

Part One: Kim Jong Un’s secret soldiers — the hackers

Part Two: Cryptocurrency heist

Part Three: Are you a North Korean worker?

Radio Free Asia

