The software is based on face-swapping algorithms. A deep-learning neural network is trained to identify someone's face in a still video frame – such as an adult actress in a blue movie – and swap it with someone else's face – such as a TV celeb or singer. Repeat this at 30 or 60 frames per second, and you've got an AI-doctored video.
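That per-frame loop can be sketched in outline. This is a minimal, hypothetical illustration of the pipeline described above, not FakeApp's actual code: `detect_face` and `swap_face` are stand-ins for the trained network's face-detection and face-reconstruction steps, and frames are represented as plain dictionaries rather than pixel arrays.

```python
# Hypothetical sketch of the per-frame face-swap loop.
# detect_face and swap_face are placeholders for the trained
# network's detection and reconstruction stages; a real tool
# works on pixel data and adds alignment and blending.

def detect_face(frame):
    # Placeholder: return the face's bounding box, or None if no face.
    return frame.get("face_box")

def swap_face(frame, face_box, target_model):
    # Placeholder: overwrite the detected face with the target identity.
    swapped = dict(frame)
    swapped["face"] = target_model["identity"]
    return swapped

def process_video(frames, target_model):
    # The loop the article describes: run detection and swapping
    # on every frame (30 or 60 of them per second of video).
    out = []
    for frame in frames:
        box = detect_face(frame)
        if box is not None:
            frame = swap_face(frame, box, target_model)
        out.append(frame)
    return out
```

Frames with no detected face pass through unchanged, which is why occasional detection misses show up as flicker in low-effort fakes.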

The process of creating these incredibly damaging faked videos is, thankfully, not trivial, though far from impossible. You need the software – a desktop program dubbed FakeApp – plus a large batch of photos of your victim to train the application's deep-learning neural network, the video to paste the face onto, and a little tweaking here and there to render the output believable.
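In outline, that workflow is: collect photos, train, apply frame by frame, then touch up. The sketch below illustrates only that shape; `train_on_photos` and `apply_to_video` are hypothetical placeholders, not FakeApp's real API, and the "model" here is a toy stand-in for a deep autoencoder.

```python
# Hedged sketch of the FakeApp-style workflow: train on a batch
# of the victim's photos, then run the model over every frame.
# All names are hypothetical placeholders for illustration only.

def train_on_photos(photos, passes=3):
    # Toy "training": tally how many examples the model has seen.
    model = {"examples_seen": 0}
    for _ in range(passes):
        model["examples_seen"] += len(photos)
    return model

def apply_to_video(model, frames):
    # Mark each frame as swapped once a trained model exists.
    return [{"frame": f, "swapped": model["examples_seen"] > 0}
            for f in frames]

model = train_on_photos(["photo"] * 500)   # the oft-cited ~500-image batch
output = apply_to_video(model, list(range(4)))
```

The design point is simply that training happens once on the photo batch, while application is a cheap per-frame pass over the whole video.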

Yeah, and I wasn't really criticizing the idea of porn driving technology. I do obviously worry about the implications of this tech (even if it was inevitable). It's only a matter of time until people can start doing this with someone they know. I'm not sure it's necessarily better that it's currently limited to someone already in the public eye, but that's at least a more nuanced question.

It's only a matter of time until people can start doing this with someone they know.

This is possible now. From what I understand, if you have 500 high-quality images of someone's face, this program can learn enough to make a passable fake. How many people under the age of 25 have 500+ high-resolution images of their face available for public consumption?