Dr Chris Brauer is a Senior Lecturer and Director of the MSc Management of Innovation programme and the Centre for Creative and Social Technologies (CAST) in the Institute of Management Studies (IMS) at Goldsmiths, University of London.

Our inner voice, the internal dialogue we have with ourselves, may one day have company. The unspoken thoughts that prompt our everyday decisions may be joined by a new, artificially intelligent voice in our heads – the virtual assistant.

The virtual assistant (VA), a digital servant/master designed to serve/define our every need, is only just beginning its journey from the realms of science fiction (think KITT in Knight Rider or HAL in 2001: A Space Odyssey) to mainstream reality.

IBM is already investing heavily in the area, pouring resources into the development of artificial intelligence and cognitive computing. Microsoft's Cortana has recently launched and will reportedly run on the desktop with Windows 10. Soon the VA will be everywhere – and nowhere – in your life. Moving beyond screens, social and search, it will become the new gateway to the Internet and all the people and things connected to it.

Cortana

Already we can start to see its trajectory from mobile and wearable, to ‘hearable’ and perhaps one day ‘implantable’. Project Virtual Assistant is a research collaboration between the Institute of Management Studies (IMS) at Goldsmiths, University of London and global media agency Mindshare. We set out to explore and understand the opportunities and challenges of bringing the virtual assistant of the future to life for users, regulators and service suppliers.

We worked intensively with 12 consumers in a focus group for six weeks, running workshops and experiments in ethnographic service design with the latest technologies for facial and emotional recognition, beacon technologies like those in the revamped Regent Street, wearables, hearables, trackables, and even scent.

The virtual assistants of the future

Over the course of our research, it became apparent that the VA is likely to start off life on the mobile (where Google Now, Siri and Cortana currently live): our research found that, initially, 40 percent of interactions would take place on the phone.

As smart watches bring access to digital information and connectivity closer to the user, the next step will be to strip back the physical hardware as far as possible. The intelligence of the VA will exist in the cloud, pulled in and pushing its way into our lives across multiple devices on our bodies and in our offices, homes and vehicles. Some 81 percent of those surveyed wanted the VA to be voice controlled, suggesting that the hardware itself will become a secondary concern.

Joaquin Phoenix falls for his VA in the movie Her

Your VA will be continually prompting you with suggestions and taking instructions, and will know more about you than perhaps you do yourself. The inevitable trajectory is towards a kind of ‘post-human’ condition where users and their VA technology are not separate and distinct but are instead embodied and unified as one.

To begin with, this will be a device that can be taken in or out; already, 2 percent of interactions are predicted to take place via earpiece. But if people come to value and increasingly depend on their VA, we can expect the device to become implanted and permanent.

Just as the desire for ease and convenience has moved contraception from the wearing of physical devices to implants, so we will see the same shift with virtual assistants. Initially, we will want to decide when and where we synch our physiology, decision-making and emotions with our VA. In time, real-time synchronisation will simply be easier and more efficient.