The project is a work in progress, and requires the above hardware combined with an Emotiv EPOC, a headset that translates specific thought patterns into actions, allowing users to interact with the prototype using their thoughts.

Zaharia has built basic demonstrations that give some idea of how the device could be used in education, medicine and first-person shooters. In his city construction demo, Zaharia builds a city using his hybrid controller’s motion-tracking functions, then shrinks his virtual self down and navigates the city using his mind.

In a blog post, he explained what led him to create the prototype: “With my experience in the education industry through my start-up Zookal and keen interest in neuroscience, I had a thought around how these technologies could be used together to enhance education and at the same time, see how far can we go with using cognitive control in a virtual simulation.”

Although users can perform basic actions such as grabbing objects or squeezing a trigger with their thoughts, Zaharia admits that his homemade setup has limitations, particularly in user-friendliness, largely because of the mind-reading technology currently available.

“The Emotiv EPOC’s mind reading abilities are still quite primitive, with only being capable of having a limited amount of different actions mapped to thought patterns,” he said.