- The software is based on face-swapping algorithms. A deep-learning neural network is trained to identify someone's face in a still video frame – such as an adult actress in a blue movie – and swap it with someone else's face – such as a TV celeb or singer. Repeat this at 30 or 60 frames per second, and you've got an AI-doctored video.
For example, in this safe-for-work video, the app has pasted Star Wars actress Daisy Ridley onto someone else's body.
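The per-frame loop described above can be sketched roughly like this. This is a toy illustration, not the app's actual code: the "detector" returns a fixed region instead of running a trained neural network, and the frames are blank arrays standing in for video.

```python
import numpy as np

def detect_face(frame):
    # Stand-in for a real face detector (the actual tools use a
    # trained deep-learning network). Here the "face" is assumed
    # to sit in a fixed region of every frame.
    return 10, 10, 40, 40  # x, y, width, height

def swap_face(frame, replacement_face):
    # Paste the replacement face over the detected region.
    # Real tools warp, blend, and colour-match; this just copies pixels.
    x, y, w, h = detect_face(frame)
    out = frame.copy()
    out[y:y + h, x:x + w] = replacement_face
    return out

def doctor_video(frames, replacement_face):
    # Repeat the swap for every frame -- at 30 or 60 frames per
    # second this yields the AI-doctored video.
    return [swap_face(f, replacement_face) for f in frames]

# Toy demo: five blank 64x64 greyscale "frames" and a white "face".
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(5)]
face = np.full((40, 40), 255, dtype=np.uint8)
doctored = doctor_video(frames, face)
```

The point is only the structure: detect, paste, repeat per frame; all the hard work in real tools is inside the detector and the blending.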
But it's not super easy (yet):
- The process to create these incredibly damaging faked videos is, thankfully, not trivial, but not impossible. You need the software – a desktop program dubbed FakeApp – plus a big batch of photos of your victim to train the application's deep-learning neural network, the video to paste the face onto, and a little tweaking here and there to render the output believable.
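The recipe in that quote (a big batch of photos, training, rendering onto the video, then tweaking) can be sketched as a toy pipeline. Everything here is illustrative and invented for this sketch: "training" just averages the photos into a template face, whereas the real software trains a deep neural network, and none of these function names are FakeApp's actual API.

```python
import numpy as np

def train_face_model(victim_photos):
    # Toy "training": the pixel-wise mean of the photo batch.
    # Real tools train a deep-learning network on the batch instead.
    return np.mean(np.stack(victim_photos), axis=0).astype(np.uint8)

def paste_face(frame, face, region=(10, 10)):
    # Paste the learned face onto one frame at a (pretend) detected region.
    y, x = region
    h, w = face.shape
    out = frame.copy()
    out[y:y + h, x:x + w] = face
    return out

def tweak(frames):
    # The "little tweaking here and there": blending, colour matching,
    # smoothing between frames -- omitted in this sketch.
    return frames

# A "big batch" of 40x40 victim photos (random noise stands in for images).
rng = np.random.default_rng(0)
photos = [rng.integers(0, 256, (40, 40), dtype=np.uint8) for _ in range(500)]
face = train_face_model(photos)
video = [np.zeros((64, 64), dtype=np.uint8) for _ in range(3)]
output = tweak([paste_face(f, face) for f in video])
```

The 500-photo batch size mirrors the figure mentioned elsewhere in this thread; the actual number of images needed by the real software may differ.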
That's ingenious. Say what you will about it being used for porn, but the technology is fantastic.
Here's a question: would you rather have war or pornography drive technology? Or, I guess, rephrased: would you prefer that the human vice of violence or the human vice of pleasure drive technology?
Yeah, and I wasn't really criticizing the idea of porn driving technology. I do obviously worry about the implications of this tech (even if it was inevitable). It's only a matter of time until people can start doing this with someone they know. I'm not sure it's necessarily better that it's currently limited to someone already in the public eye, but that's at least a more nuanced question.
This is possible now. From what I understand, if you have 500 high-quality images of someone's face, this program can learn enough to make a passable fake. How many people under the age of 25 have 500+ high-resolution images of their face available for public consumption? Tons is the answer.
That's a fair point. I don't know enough about how this works to say whether that's enough, or if it needs certain kinds of images, or whatever.
I'd rather both vices, so I choose... pornography
It's not all prurient, of course. Here is a video where someone replaced Angela Merkel's face with Trump's.