Try Deep Learning in Python now with a fully pre-configured VM

I love to write about face recognition, image recognition and all the other cool things you can build with machine learning.

If you aren’t a long-time Linux user, it can be really hard to figure out how to get a system fully configured with all the required machine learning libraries and tools like TensorFlow, Theano, Keras, OpenCV, and dlib.

To make it simple for anyone to play around with machine learning, I’ve put together a simple virtual machine image that you can download and run without any complicated installation steps.

The virtual machine image has Ubuntu Linux Desktop 16.04 LTS 64-bit pre-installed with the following machine learning tools:

- Python 3.5
- OpenCV 3.2 with Python 3 bindings
- dlib 19.4 with Python 3 bindings
- TensorFlow 1.0 for Python 3
- Keras 2.0 for Python 3
- Theano
- face_recognition for Python 3 (for playing around with face recognition)
- PyCharm Community Edition already set up and ready to go for all these libraries
- Convenient code examples ready to run, right on the desktop!

Even the webcam is preconfigured to work inside the Linux VM for OpenCV / face_recognition examples (as long as you set up your webcam to be accessible in the VMware settings).
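If you want to confirm everything is wired up inside the VM, a quick sanity check like the following can help. This snippet is my own illustration, not one of the bundled desktop examples; it only checks that each library is importable.

```python
import importlib.util

# Import names for the pre-installed libraries (OpenCV imports as "cv2",
# Theano's import name is lowercase).
LIBRARIES = ["cv2", "dlib", "tensorflow", "keras", "theano", "face_recognition"]

def check_libraries(names):
    """Return a dict mapping each module name to whether it can be imported."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

if __name__ == "__main__":
    for name, available in sorted(check_libraries(LIBRARIES).items()):
        print("{}: {}".format(name, "OK" if available else "missing"))
```

Run it in a terminal or inside PyCharm; any library reported as "missing" means that environment isn't the one the VM pre-configured.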

You need VMware to run this virtual machine image. So don’t use the VirtualBox version unless you have no other choice.

Right-click on the code window and choose “Run” to run the current file in PyCharm.

If you configure your webcam in VMware settings, you can access your webcam from inside the Linux virtual machine!
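As a rough sketch of what a webcam example looks like with the pre-installed libraries (this is my own illustration, not one of the bundled desktop examples, and the `scale_location` helper is a name I made up), a face-detection loop could go like this:

```python
def scale_location(location, factor):
    """Scale a (top, right, bottom, left) face box back up by `factor`
    after detecting faces on a downsized copy of the frame."""
    return tuple(value * factor for value in location)

def main():
    import cv2                # pre-installed OpenCV 3.2 Python 3 bindings
    import face_recognition   # pre-installed face_recognition library

    video = cv2.VideoCapture(0)  # works only if VMware passes the webcam through
    while True:
        ok, frame = video.read()
        if not ok:
            break
        # Detect on a quarter-size RGB copy for speed (OpenCV frames are BGR).
        small = cv2.resize(frame, (0, 0), fx=0.25, fy=0.25)
        rgb_small = small[:, :, ::-1]
        for location in face_recognition.face_locations(rgb_small):
            top, right, bottom, left = scale_location(location, 4)
            cv2.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
        cv2.imshow("Webcam", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    video.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

Detection runs on a quarter-size frame because face detection is the slow step; the boxes are then scaled back up by 4 to draw on the full-size frame. Press “q” in the video window to quit.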

And while many of the tech giants working on AI like Google and Facebook have open sourced some of their algorithms, they hold back most of their data. In contrast, blockchains represent and even incent open data.

For example: creating a decentralized Uber requires a relatively open dataset of riders and drivers available to coordinate the network.

The network effects and economic incentives around these open systems and their data can be more powerful than current centralized companies because they are open standards that anyone can build on, in the same way the protocols of the internet like TCP/IP, HTML, and SMTP have achieved far greater scale than any company that sits atop them.

And oracle systems (a fancy way of saying getting people all over the world to report real-world information to the blockchain in a way we can trust) like Augur will inject more data.

This open data has the potential to commoditize the data silos most tech companies like Google, Facebook, Uber, LinkedIn, and Amazon are built on and extract rent from.

AIs trained on open data are more likely to be neutral and trustworthy instead of biased by the interests of the corporation who created and trained them. Since blockchains allow us to explicitly program incentive structures, they may make the incentives of AI more transparent.

Simplified, AI is driven by three things: tools, compute power, and training data.

My guess is the tech giants shift to 1) creating blockchain protocols and their native tokens and 2) AIs that leverage the open, global data layer of the blockchain.

“Electric utilities are already using AI to better optimize power generation and the grid.

As AI enterprise technology becomes more pervasive and mainstream, businesses also need to ensure they are taking into account the importance of building trust between AI and the people who use the systems.”

“In 2017 we will move from the silent AI that supports search, image, speech and text analytics behind the curtain to the embrace of embedded visible AI.

“In 2017 we’ll see increased acceleration in the democratization of AI for every person and every organization.