The video, which Reddit user deepfakes created by training a machine learning algorithm on stock photos and videos of Gadot, features a woman who takes on the rough likeness of the actor, with her face overlaid on another person’s head.

Experts warn the technique is ‘no longer rocket science.’ "It's not going to fool anyone who looks closely. Sometimes the face doesn't track correctly and there's an uncanny valley effect at play, but at a glance it seems believable," said Motherboard, which first spotted the unsettling video.

The algorithm was trained on real porn videos and images of Gal Gadot, allowing it to create an approximation of the actor’s face that can be applied to the moving figure in the video.
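The approach described above is commonly implemented as a shared-encoder, dual-decoder autoencoder: one encoder learns features common to both faces, while a separate decoder per identity reconstructs each face, and swapping decoders at inference maps one person’s face onto the other’s movements. The following is a minimal toy sketch of that idea, not the actual code used here; linear layers and random arrays stand in for convolutional networks and real video frames.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, LATENT = 64, 8          # flattened "image" size, bottleneck size

faces_a = rng.normal(size=(100, DIM))   # stand-in frames of person A
faces_b = rng.normal(size=(100, DIM))   # stand-in frames of person B

enc = rng.normal(scale=0.1, size=(DIM, LATENT))    # shared encoder
dec_a = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for A only
dec_b = rng.normal(scale=0.1, size=(LATENT, DIM))  # decoder for B only

def step(x, dec, lr=1e-3):
    """One gradient step on the reconstruction loss ||x @ enc @ dec - x||^2."""
    global enc
    z = x @ enc                      # encode with the shared encoder
    err = z @ dec - x                # reconstruction error
    g_dec = z.T @ err / len(x)       # gradient w.r.t. this identity's decoder
    g_enc = x.T @ (err @ dec.T) / len(x)  # gradient w.r.t. the shared encoder
    dec -= lr * g_dec                # in-place updates
    enc -= lr * g_enc
    return float((err ** 2).mean())

# Alternate identities so the encoder sees both faces while each
# decoder specialises in reconstructing only its own identity.
for _ in range(500):
    loss_a = step(faces_a, dec_a)
    loss_b = step(faces_b, dec_b)

# The "swap": encode a frame of person B, decode with A's decoder,
# yielding B's pose and expression rendered with A's face.
swapped = faces_b[:1] @ enc @ dec_a
print(swapped.shape)   # (1, 64)
```

In a real system the encoder and decoders are deep convolutional networks trained on thousands of aligned, cropped face images, and the swapped face is blended back into each video frame, which is where the imperfect tracking quoted above comes from.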

The amateur video has worrying implications, showing how freely available resources could be used to create fake films in just a matter of days or even hours.

‘Everyone needs to know just how easy it is to fake images and videos, to the point where we won’t be able to distinguish forgeries in a few months from now,’ AI researcher Alex Champandard told Motherboard.

‘Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off. Now it can be done by a single programmer with recent computer hardware.’