Samim Winiger, a self-described artificial-intelligence experimenter, designed an image-recognizing computer brain trained on the lyrics of dozens of Taylor Swift songs. When the bot looks at photos, it now describes them in a Swiftian way. Winiger then set the impressive, but absurd, results to music:

Winiger's project is based on new code released by a group at the University of Toronto that creates "neural-storytellers." It's a recurrent neural network, a type of artificial-intelligence algorithm that is well suited to learning and generating language. After training on a large body of text, it "generates little stories" about images it's shown.
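For the curious, the core trick behind the neural-storyteller code is "style shifting": an image caption is encoded as a vector, nudged from the average caption toward the average of a styled corpus (romance novels, or in Winiger's case, Swift lyrics), and then decoded back into text. Here's a minimal sketch of that vector arithmetic, using random stand-in vectors rather than real sentence embeddings:

```python
import numpy as np

# Illustrative sketch of the "style shifting" idea. The vectors below are
# random stand-ins; in the real system, an encoder maps sentences to
# high-dimensional vectors and a decoder generates text from them.

rng = np.random.default_rng(0)
dim = 2400  # a typical sentence-embedding size

caption_vectors = rng.normal(size=(1000, dim))  # encoded image captions
lyric_vectors = rng.normal(size=(1000, dim))    # encoded song lyrics (target style)

# A caption vector produced for some image:
image_caption = rng.normal(size=dim)

# Shift the caption away from "caption space" and toward "lyric space":
shifted = image_caption - caption_vectors.mean(axis=0) + lyric_vectors.mean(axis=0)

# A decoder trained on the lyric corpus would then turn `shifted` into text.
print(shifted.shape)
```

The point of the arithmetic is that the difference between the two corpus averages captures the "style direction," so adding it moves an ordinary caption toward something that decodes like a lyric.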

"New machine-learning experiments are enabling us to generate stories based on the content of images," Winiger wrote in a blog post. The stories that machines can generate depend very much on the data that they're shown, so Winiger thought it be fun to repeat the experiment with Taylor Swift lyrics.

As experiments like this become more common, they could raise interesting legal questions. A songwriter has already sued Swift, alleging she was inspired by his lyrics. Could Swift in turn sue an AI like Winiger's for copyright infringement?

In case that wasn't impressive enough (or you're a Swift-hater), Winiger took the same technology and taught a bot to write its own erotica, similar to the erotibot I wrote about earlier this year.

While these are toy applications, Winiger views them as windows into a future where machines will be tools, or perhaps even autonomous agents, in the artistic process. AI, he told me, doesn't have to be utilitarian. It can be creative too. This is already starting to happen.