
North Korean hackers using deepfakes to land remote jobs, researchers warn

by Todd Humber

Security researchers demonstrated the creation of a convincing real-time deepfake in just 70 minutes using widely available tools, highlighting an emerging threat as North Korean IT workers reportedly use this technology to infiltrate organizations through remote work positions.

Unit 42, Palo Alto Networks’ threat intelligence team, revealed Monday that individuals linked to North Korea are leveraging synthetic identities during job interviews to evade detection and potentially generate revenue for the sanctioned regime.

“A single researcher with no image manipulation experience, limited deepfake knowledge and a five-year-old computer created a synthetic identity for job interviews in 70 minutes,” researchers said in their report. “The ease of creation demonstrates how dangerously accessible this technology has become to threat actors.”

Evolving tactics

The investigation connects recent reports of deepfake job candidates to known North Korean operations. Researchers found evidence linking these incidents to previous North Korean tactics, including the use of compromised personal information and AI image manipulation services.

“DPRK IT workers incrementally advanced their infiltration methodology by implementing real-time deepfake technology,” according to the report. “This offers two key operational advantages. First, it allows a single operator to interview for the same position multiple times using different synthetic personas. Second, it helps operatives avoid being identified.”

Detection methods

The report identified several technical weaknesses in current deepfake technology that can help interviewers spot synthetic identities:

“Rapid head movements caused noticeable artifacts as the tracking system struggled to maintain accurate landmark positioning,” researchers noted. Other telltale signs include inconsistent handling of partially obscured faces, problems with lighting adaptation and slight delays between lip movements and speech.

Simple interview techniques, such as asking candidates to pass a hand over their face, make rapid head movements or perform specific gestures, can help expose deepfakes by disrupting facial tracking systems.

Recommendations for employers

Unit 42 researchers emphasized the need for collaboration between HR and information security teams to defend against this threat.

For HR teams, recommended strategies include recording video interviews with proper consent, implementing comprehensive identity verification workflows and training recruiters to identify suspicious patterns.

Security teams should secure the hiring pipeline by tracking application IP addresses, enriching provided phone numbers to check for VoIP carriers commonly associated with identity concealment and maintaining information sharing agreements with partner companies.
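One of these checks, flagging application IP addresses, can be automated. The sketch below is a minimal illustration, not Unit 42's tooling: it assumes a list of suspicious CIDR ranges (here populated with RFC 5737 documentation addresses as stand-ins; a real deployment would pull ranges for VPN and datacenter exits from a threat-intelligence feed).

```python
import ipaddress

# Hypothetical example data: CIDR blocks associated with VPN/datacenter
# exit nodes. These are RFC 5737 documentation ranges, used only as
# placeholders for a real threat-intelligence feed.
SUSPICIOUS_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def flag_applicant_ip(ip_str: str) -> bool:
    """Return True if an applicant's IP falls inside a flagged range."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in SUSPICIOUS_RANGES)
```

A hit from a check like this is a signal for closer review, not proof of fraud; as the researchers note, it works only as one layer among several.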

“No single detection method will guarantee protection against synthetic identity threats, but a layered defense strategy significantly improves your organization’s ability to identify and mitigate these risks,” the researchers said.
