Press "Enter" to skip to content

‘Deepfake’ videos online almost double in less than a year

It used to be that seeing was believing. But deepfake technology, artificial intelligence (AI) that can manipulate images, videos and audio to show a person doing or saying something they never did in real life, is challenging that assumption and making deception easier than ever.

Deeptrace, a cybersecurity company that specializes in detecting AI-generated synthetic videos, reported a spike in the number of deepfake videos online from 7,964 in December 2018 to 14,678 in July 2019. That’s an 84 percent jump in less than a year.

Analyzing videos from video hosting sites, community forums and deepfake apps, the company said 96 percent were pornographic in nature, often with the computer-generated face of a celebrity replacing that of a real-life adult performer in a sexually explicit scene. It found that the most frequently targeted individuals were actresses from Western countries, followed by South Korean pop singers in second and third place.

The more elusive fake news

Given the proliferation and accessibility of deepfake technology, it is no surprise that the number of deepfake videos circulating online has spiked. Earlier this year, the Chinese app Zao went viral for its ability to superimpose users' selfies onto celebrities in movie scenes. The app takes a series of selfies in which users make various facial expressions and digitally transfers them onto a movie scene.

“READ MORE…”

Breaking News: