A vision of the future, some small notes

Audio through implants and/or hearing aid-like devices, which also pass outside sound through transparently, so they never need to be taken out

Visuals through retina projection or built into a contact lens, or even transmitted directly into the optic nerve through an implant

Sensors that can detect commands from movements of the body

We already have the beginnings of these technologies: Bluetooth hearing aids already exist, and retinal projection and contact lens displays are in early development. The key missing piece is a way to issue commands silently – tracking eye, head, or body movements, or some virtual interface (a keyboard, etc.). Virtual interfaces will be infinitely customizable and will make physical ones unnecessary, though physical controls will persist in many well-defined applications (keypads, control panels, etc.). Silent interaction will need neural sensors and/or subvocalization of some sort, plus some form of AI to interpret the commands.
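At its simplest, the interpretation layer described above is a mapping from detected silent-input events to device commands. Here is a minimal sketch of that idea; every event type, target, and command name below is invented for illustration, not taken from any real device or API:

```python
# Hypothetical sketch: routing silent-input events (gaze, head movement,
# subvocalization) to device commands. All names here are made up.

COMMAND_MAP = {
    ("gaze_dwell", "lamp_icon"): "lights_on",
    ("head_nod", "email_preview"): "open_email",
    ("subvocal", "next_station"): "radio_next_station",
}

def interpret(event_type, target):
    """Return the command for a detected (event, target) pair, or None."""
    return COMMAND_MAP.get((event_type, target))

# A real system would replace the static table with a learned model that
# resolves ambiguous input from context, which is where the AI comes in.
print(interpret("gaze_dwell", "lamp_icon"))
```

The static table stands in for what would realistically be a probabilistic model, since raw gaze or muscle signals are noisy and context-dependent.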

The endgame is a computing ecosystem where we can do practically everything we now do with computers by thought (practically, because a sufficiently advanced indirect interface should feel nearly like thought) – turning on the lights, changing the station on the car radio, or reading an email.