User Experience consultant and digital product designer, New York City.

The Future is Happening Now

There are some remarkable examples of the future being invented right before our eyes today. As designers, this is exciting because it has big implications for the future of our work and for what user experience might look like as technology and different verticals collide. Jobs designing mobile apps and software interfaces might be replaced by companies looking for designers who can design virtual worlds and the minimalistic interfaces needed to navigate them. Or perhaps instead of designing digital experiences, we’ll use our design thinking skills to design real experiences. It’s worth looking at some examples to imagine how our roles as designers might not only apply now, but evolve into the future.

Tesla

In my opinion, to truly understand how big an innovation Tesla is, you must experience it personally. You may know that it’s an electric car, or that it can go from zero to 60 in under four seconds. You’ve probably seen pictures of the striking digital displays that almost make it feel as though the car is on an operating table in a hospital. But you cannot really see how this car is the future until you ride in one. Until you nearly get whiplash from accelerating so fast, and experience the car locate a parking spot and park itself while your attention is completely elsewhere. The excitement, bewilderment, and sheer awe this car can produce is almost matched by a level of uneasiness and discomfort, because the car is so in control. In many cases it’s better at driving than you are, and that’s going to take some getting used to.

Open Agriculture Initiative

Did you know that a strawberry will taste different depending on where it comes from? Climate, and all the variables that impact the growing of food, factor into its quality and taste. OpenAG is working on food computers that monitor and control climate variables such as carbon dioxide, air temperature, humidity, dissolved oxygen, potential hydrogen (pH), electrical conductivity, and root-zone temperature. It won’t solve our food crisis problems right now, but if the solution can be nailed on a small scale to start, the implications for the future are remarkable. The company has developed and open-sourced what it calls “climate recipes,” described in its own words as follows:

With the creation of climate recipes, food computer users can import successful climates that have been created, tested, and perfected by other users. The recipes can be customized and optimized for different taste or yield preferences and for various food production needs. Imagine growing water-loving tropical fruits in the middle of the desert, or sun-loving summer berries in the midst of a snowy Boston winter. With climate recipes, you can grow local from anywhere!
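The climate variables described above map naturally onto a small, shareable data structure: a named set of environmental setpoints that anyone can import, tweak, and re-share. Here is a minimal Python sketch of that idea. The field names, values, and `customize` helper are illustrative assumptions for this article, not OpenAG’s actual recipe schema.

```python
# A hypothetical "climate recipe": a named bundle of environmental
# setpoints a food computer would hold the growing chamber to.
# All names and numbers below are illustrative, not OpenAG's format.

strawberry_recipe = {
    "name": "Sweet Strawberry (illustrative)",
    "setpoints": {
        "air_temperature_c": 22.0,
        "humidity_pct": 65.0,
        "co2_ppm": 800,
        "dissolved_oxygen_mg_l": 8.0,
        "ph": 6.2,
        "electrical_conductivity_ms_cm": 1.8,
        "root_zone_temperature_c": 20.0,
    },
}

def customize(recipe, **overrides):
    """Return a copy of a recipe with some setpoints adjusted,
    e.g. trading yield for a different taste profile."""
    return {
        **recipe,
        "setpoints": {**recipe["setpoints"], **overrides},
    }

# A user imports the shared recipe and tweaks it for sweeter fruit;
# the original recipe is left untouched and can still be re-shared.
sweeter = customize(strawberry_recipe, co2_ppm=900, air_temperature_c=24.0)
```

The point of the sketch is the sharing model: because a “climate” is reduced to data, a recipe perfected in one place can be imported, customized, and grown anywhere.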

Wow! I imagine a future where we can all have a food computer and grow our own food at home, in much the same way we can now make a physical object with a 3D printer. Children could learn about food by growing it themselves at school!

Virtual/Augmented Reality & Holograms

This space is evolving so fast it’s hard to keep track. The best-known device is probably the Oculus, with a focus on gaming, movies, and television. That has obvious mainstream appeal, but what about being able to build a virtual world, or teleport a friend into your apartment via hologram? These are just a couple of things the Microsoft HoloLens aims to do.

You can control and interact with the HoloLens through gaze and gesture controls as well as voice commands. The HoloLens excels at what Microsoft calls Mixed Reality:

Augments the real world with helpful information

Blends holograms with your real world

Can transport you to a virtual world

Interact freely with holograms, people, and objects in your world because the holographic frame positions holograms where your eyes are most sensitive to detail, leaving your peripheral vision unobscured.

Arguably the most exciting innovations in this space are coming from Meta.

Meta boasts the widest field of view and the most intuitive access to digital information. How? Unlike Microsoft’s HoloLens, Meta’s approach to interaction is rooted in neural interface design. This makes the Meta 2 feel more like an extension of yourself than a device you tell what to do. For example, you can interact with a hologram of a car by picking it up, turning it around, and flipping it upside down, as if it were a physical object. The resolution of the imagery is reportedly quite impressive.

In his TED Talk, Meta founder Meron Gribetz painted his vision for The Neural Path of Least Resistance™, a neuroscience-based interface design approach for zero-learning-curve computing, saying, “We are creating an experience that merges the art of user interface design with the science of the brain, creating ‘natural machines’ that feel like extensions of ourselves rather than the other way around. For example, our natural hand motion doesn’t rely on clicks or buttons, thereby maintaining the flow and connecting people to each other and the moment.”

Beyond its technical capabilities, the Meta 2 is built for comfort (it can be worn for “hours”) and is simply well designed, period.