After much hullabaloo earlier this year about "deep fakes," the machine-learning-based fake videos that Senator Marco Rubio called the modern equivalent of nuclear weapons, it turns out that low-tech doctored videos can be just as effective a form of disinformation. A fake video promoted by the White House this week demonstrates as much, and the same kind of attack could just as easily be deployed against you or your enterprise.

Comparing the two videos makes it clear that the doctored version has been edited. First published on Twitter by Infowars editor Paul Joseph Watson, it has been modified to make it look like Acosta "karate-chopped" the intern. The edit also removes clear audio of the reporter saying "Pardon me, ma'am."

It remains unclear whether Watson did more than remove the audio and zoom in on Acosta's arm. BuzzFeed News reports that the video-to-GIF conversion process may be responsible for the jerkiness of the clip: the GIF format uses far fewer frames per second than standard MP4 video. "Digitally it's gonna look a tiny bit different after processing and zooming in, but I did not in any way deliberately 'speed up' or 'distort' the video," Watson told BuzzFeed. "That's just horse shit."
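To see why a frame-rate drop alone can make motion look abrupt, consider a minimal Python sketch of naive fixed-rate downsampling. The function name and logic here are illustrative assumptions for this article, not a reconstruction of whatever tool actually produced the Watson clip:

```python
def select_gif_frames(total_frames: int, src_fps: float, gif_fps: float) -> list[int]:
    """Return the indices of source frames that survive a naive
    fixed-rate downsample from src_fps to gif_fps.

    Illustrative only: real converters may also blend frames,
    re-time them, or quantize colors.
    """
    step = src_fps / gif_fps  # e.g. 30 fps -> 10 fps keeps every 3rd frame
    return [round(i * step) for i in range(int(total_frames / step))]

# A 1-second, 30 fps clip downsampled to a 10 fps GIF keeps 10 of 30 frames.
print(select_gif_frames(30, 30, 10))  # [0, 3, 6, 9, 12, 15, 18, 21, 24, 27]
```

A fast arm movement that spans six frames at 30 fps spans only two frames in the 10 fps GIF, so the motion reads as a sudden "chop" even if no frames were reordered or sped up.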

The incident underscores fears that video can easily be manipulated to discredit a target of the attacker's choice: a reporter, a politician, a business, a brand. Unlike so-called "deep fakes," however, where machine learning puts words in people's mouths, low-tech doctored video hews close enough to reality that it blurs the line between the true and the false.

FUD (fear, uncertainty and doubt) is familiar to folks working in the security trenches, and deploying FUD as a weapon at scale can severely damage a business as well as an individual. Defending against FUD attacks is very difficult. Once the doubt has been sown that Acosta manhandled a female White House intern, a non-trivial portion of viewers will never forget that detail and will always suspect it might be true.

Reputational damage to an enterprise can cause stock prices to plummet, with long-term consequences for customers and shareholders who may no longer trust what they hear about your business, even when it is true.

This is the real danger of FUD, of course: when no one can be sure any longer what is true and what is false, attackers who wish to manipulate citizens, consumers, or shareholders at scale can do so with ease. Defending against such attacks is extremely difficult, because even an individual or enterprise that has done nothing wrong, such as CNN's Acosta, still ends up with a tarnished reputation.