More and more people are using deepfakes to apply for remote tech jobs, FBI says

The FBI said on Tuesday that more people were using deepfake videos during job interviews. In this image, a person views a deepfake video on January 25, 2019, manipulated with artificial intelligence to potentially deceive viewers.

  • People are using deepfake technology to pose as someone else in job interviews, the FBI said.
  • They seem to focus on IT roles that would grant them access to sensitive data, the agency said.
  • Anti-deepfake technologies are still not foolproof, but there are simple ways to detect deepfakes.

More and more people are using deepfake technology to pose as someone else in interviews for remote jobs, the FBI said on Tuesday.

In its public announcement, the FBI said it has received an uptick in complaints about people superimposing videos, images, or audio recordings of another person onto themselves during live job interviews. The complaints were tied to remote tech roles that would have granted successful candidates access to sensitive data, including "customer PII (Personally Identifiable Information), financial data, corporate IT databases and/or proprietary information," the agency said.

Deepfake videos can be used for entertainment, but they can also be extremely harmful. In March, Meta said it removed a deepfake video that claimed to show Ukrainian President Volodymyr Zelenskyy telling Ukrainian forces to lay down their arms amid Russia's invasion.

Equally concerning is the harm that private individuals could face from being targeted by deepfakes, as in the cases highlighted by the FBI on Tuesday. "The use of the technology to harass or harm private individuals who do not command public attention and cannot command resources necessary to refute falsehoods should be concerning," the Department of Homeland Security warned in a 2019 report about deepfake technology.

Fraudulent applicants for tech jobs are nothing new. In a November 2020 LinkedIn post, one recruiter wrote that some candidates hire external help to assist them during the interviews in real time, and that the trend seems to have gotten worse during the pandemic. In May, recruiters found that North Korean scammers were posing as American job interviewees for crypto and Web3 startups.

What's new in the FBI's Tuesday announcement is the use of AI-powered deepfake technology to help people get hired. The FBI did not say how many incidents it has recorded.

Anti-deepfake technologies are far from perfect

In 2020, the number of known online deepfake videos reached 145,227, a ninefold increase from a year earlier, according to a report by Sentinel, an Estonian threat-intelligence company.

Technologies and processes that weed out deepfake videos are far from foolproof. A report from Sensity, a threat-intelligence company based in Amsterdam, found that 86% of the time, anti-deepfake technologies accepted deepfake videos as real.

However, there are some telltale signs of deepfakes, including abnormal blinking, an unnaturally soft focus around skin or hair, and unusual lighting.

In its announcement, the FBI also offered a tip for spotting voice deepfake technology. "In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually," the agency wrote.
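The audio-video mismatch the FBI describes can, in principle, be checked numerically. Below is a rough, hypothetical sketch (not a production detector, and not the FBI's method): given a per-frame mouth-openness signal, which a face-landmark tracker would supply, and the audio's energy envelope, a low correlation between the two suggests the lip movement does not match the speech.

```python
import numpy as np

def av_sync_score(mouth_openness, audio_energy):
    """Pearson correlation between per-frame mouth openness and audio energy.

    A score near 1 means lips and audio rise and fall together;
    a low or negative score is one hint of the mismatch the FBI describes.
    Both inputs are assumed to be sampled at the same frame rate.
    """
    m = np.asarray(mouth_openness, dtype=float)
    a = np.asarray(audio_energy, dtype=float)
    # standardize each signal, guarding against zero variance
    m = (m - m.mean()) / (m.std() + 1e-9)
    a = (a - a.mean()) / (a.std() + 1e-9)
    return float(np.mean(m * a))

# Toy demo with synthetic signals: a clipped sine stands in for speech bursts.
t = np.linspace(0, 10, 300)
speech = np.clip(np.sin(2 * np.pi * 1.5 * t), 0, None)
noise = 0.05 * np.random.default_rng(0).normal(size=t.size)
synced = av_sync_score(speech, speech + noise)      # lips match audio
offset = av_sync_score(speech, np.roll(speech, 10))  # audio shifted half a cycle
```

Real detectors are far more sophisticated, but the intuition is the same: when the on-camera mouth and the audio track come from different sources, their timing rarely lines up.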

The FBI said people or companies who have identified deepfake attempts should report the cases to its complaint website.

Read the original article on Business Insider
