The Future Inappropriate Use of AI Software

Quite a while ago I got all hot and bothered that people were planning to use 'artificial intelligence' to, in essence, create more detailed photos out of thin air…

Upsampling image and video files usually leads to pixelation and soft textures, simply because algorithms are not capable of replacing non-existing image detail. But scientists at the Max Planck Institute for Intelligent Systems in Germany have come up with a clever solution that is capable of producing better results than anything we’ve seen so far.

The team has developed a tool called EnhanceNet-PAT, which is capable of creating high-definition versions of low-resolution images using artificial intelligence. It’s not the first attempt at solving the super-resolution task, but the approach is a new one.
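The excerpt above hinges on why classical upsampling looks soft: every output pixel is just a copy or blend of input pixels, so no new detail can appear. Here is a minimal sketch of that limitation in plain NumPy (illustrative only, not related to EnhanceNet-PAT's actual method):

```python
import numpy as np

def upscale_nearest(img, factor):
    """Classical nearest-neighbor upscaling: each output pixel is a copy of
    an existing input pixel, so the result contains no new information."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A tiny 2x2 "image" with only two pixel values.
low = np.array([[0, 255],
                [255, 0]], dtype=np.uint8)

high = upscale_nearest(low, 2)
# The 4x4 result still contains only the values 0 and 255: the blocks get
# bigger, but no detail is added -- hence the characteristic pixelation.
```

A learned super-resolution model, by contrast, *invents* plausible high-frequency detail from patterns seen in training data, which is exactly what makes the results impressive and, as argued below, potentially misleading.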

The additional information is not representative of the flora or fauna being photographed.

This will all lead to inappropriate assumptions, both by the general public and in specific domains such as science and the justice system.

Just because something is possible does not mean we should do it.
People apparently have learned nothing from Jurassic Park.

But ohhhh was I naive to be worrying about the thing that nobody will see as a problem until it’s too late… There are much bigger and more immediate problems with people’s use of “AI”.

There’s a video of Gal Gadot having sex with her stepbrother on the internet. But it’s not really Gadot’s body, and it’s barely her own face. It’s an approximation, face-swapped to look like she’s performing in an existing incest-themed porn video.

…

It’s not going to fool anyone who looks closely. Sometimes the face doesn’t track correctly and there’s an uncanny valley effect at play, but at a glance it seems believable. It’s especially striking considering that it’s allegedly the work of one person—a Redditor who goes by the name ‘deepfakes’—not a big special effects studio that can digitally recreate a young Princess Leia in Rogue One using CGI. Instead, deepfakes uses open-source machine learning tools like TensorFlow, which Google makes freely available to researchers, graduate students, and anyone with an interest in machine learning.

And that’s pasting people’s features onto other people who just happen to be in pornography.

But hey, why stop at one thing that can hurt society when you can go all the way and fundamentally violate other people’s autonomy? This isn’t going to get better. It’s going to get worse. Our culture simply is not adjusting fast enough. We are not stopping to think about what we’re doing. All that is happening is that people see that they can do it, so they are doing it… other people be damned.