Kia R.E.A.D.s Your Moods & Adapts The Interior

Kia Motors is one of the rare automakers treading the line between conventional cars and “new energy” vehicles. The company has no problem highlighting its electric vehicles (EVs) at trade shows, going as far as displaying them center stage. Now it is looking even further ahead, at the autonomous vehicle (AV) era, with its holistic interactive system.

Watching Your Emotions — Kia Can Soon R.E.A.D. You

Kia showed how it views the interactive life inside AVs at CES 2019. The company has partnered with the Massachusetts Institute of Technology (MIT) Media Lab’s Affective Computing Group for its Real-time Emotion Adaptive Driving (R.E.A.D.) system, which will be used on its “SEED Car” concept with the company’s V-Touch technology. We’ll have more on the SEED Car in a coming article.

Kia’s R.E.A.D. system uses AI to recognize drivers’ and riders’ emotional states from physiological signals (facial expressions, electrodermal activity, and heart rate) and then respond in a hopefully helpful way. CES visitors could see how the sensory controls react in real time to their changing emotional states.

R.E.A.D. uses a music-responsive vibration system in the seats, much like the game simulation chairs that let players feel control feedback.

This sensory-based signal-processing technology adapts seat vibrations to the sound frequencies of the music being played. My favorite part is that the seats can be set to massage mode.
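As a rough illustration of that kind of signal processing (a hypothetical sketch, not Kia's actual implementation, with all function names invented here), a music-responsive seat could map the bass-band energy of each audio frame to a vibration level:

```python
import numpy as np

def vibration_intensity(samples, sample_rate, low_hz=20.0, high_hz=150.0):
    """Map the bass-band share of an audio frame's energy to a 0-1 vibration level."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    band_energy = np.sum(spectrum[band] ** 2)
    total_energy = np.sum(spectrum ** 2) + 1e-12  # guard against silence
    return float(np.clip(band_energy / total_energy, 0.0, 1.0))

# Example: a pure 60 Hz bass tone registers almost entirely in the vibration band.
rate = 8000
t = np.arange(rate) / rate
bass_tone = np.sin(2 * np.pi * 60 * t)
print(round(vibration_intensity(bass_tone, rate), 2))  # close to 1.0
```

A real system would run this per audio frame and smooth the output before driving the seat actuators, but the core idea is just band-limited energy tracking.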

But mostly, Kia says the system is designed to enhance safety by providing haptic warnings from the vehicle’s advanced driver-assist systems.

The SEED Car concept is a pedal-electric hybrid vehicle with a 100 km (62 mile) range. It combines pedal input from the driver with a high degree of electric power assistance to make pedaling effortless. For longer trips, Kia designed the BIRD Car, an AV shuttle with greater range. Both use R.E.A.D.

Lastly, V-Touch is Kia’s virtual touch-type gesture-control technology. It uses a 3D camera to monitor the driver’s and passengers’ eyes and fingertips, letting anyone manage in-car features through a head-up display. Hand and finger gestures are something many mobility makers are working on as an intuitive way to interact with the cabin environment, such as lighting, heating, ventilation and air-conditioning (HVAC), and the entertainment system. Ultimately, this translates to fewer buttons, fewer touch screens, and more open space.

According to our latest talks with James Bell, Director of Corporate Communications at Kia Motors America (KMA), “the R.E.A.D. System analyzes a driver’s emotional state in real-time through bio-signal recognition.” This is done using sensors that read facial expressions, heart rate, and electrodermal activity on the steering wheel. The data gathered lets the AI alter the interior environment, adjusting conditions to appeal to all five senses inside the cabin. Think of it as a way to make the driving experience a positive one by enhancing the cabin environment.

Kia uses its AI technology for deep learning about its drivers and passengers. The system establishes a baseline of user behavior, identifies patterns and trends, and then customizes the cabin into a better-adapted environment.
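To illustrate the baseline-and-deviation idea (purely a toy sketch under my own assumptions, not Kia's algorithm; the class and labels are invented for illustration), here is a minimal Python monitor that tracks a rolling bio-signal baseline and flags readings that stray far enough to warrant a cabin adjustment:

```python
from collections import deque
import statistics

class OccupantBaseline:
    """Toy sketch: keep a rolling baseline of one bio-signal (e.g. heart rate)
    and flag readings that deviate enough to trigger a cabin adjustment."""

    def __init__(self, window=60, threshold=2.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold  # deviation measured in standard deviations

    def update(self, value):
        self.readings.append(value)
        if len(self.readings) < 10:
            return "calibrating"  # not enough history for a baseline yet
        mean = statistics.fmean(self.readings)
        stdev = statistics.pstdev(self.readings) or 1e-9
        z = (value - mean) / stdev
        if z > self.threshold:
            return "soothe"       # e.g. calmer lighting and music
        if z < -self.threshold:
            return "energize"     # e.g. brighter cabin, upbeat audio
        return "steady"

monitor = OccupantBaseline()
for hr in [70, 71, 69, 70, 72, 71, 70, 69, 71, 70]:
    monitor.update(hr)
print(monitor.update(95))  # a spike well above baseline -> "soothe"
```

A production system would fuse several signals (heart rate, electrodermal activity, facial expression scores) and learn per-occupant baselines, but the skeleton is the same: calibrate, compare, adapt.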

How AI & AV Are Linked To Our Future Mobility Needs

It seems there isn’t a single auto press release these days that doesn’t use the words AI and AV. Yet both of these loosely defined terms are building blocks of our future mobility needs. And motorcyclists, don’t feel too smug: BMW is working on AI and AV motorcycles as well. The deeper question is this: the introduction of personal computers in the late 1980s was supposed to free up our time, so what will happen in AVs? How will this new, enhanced digital life make our experience better?

Albert Biermann, President and Head of the Research & Development Division at Hyundai Motor Group, answered those questions at CES like this:

“Kia considers the interactive cabin a focal point for future mobility, and the R.E.A.D. System represents a convergence of cutting-edge vehicle control technology and AI-based emotional intelligence. The system enables continuous communication between driver and vehicle through the unspoken language of ‘feeling’, thereby providing an optimal, human-sense oriented space for the driver in real-time.”

About the Author

Nicolas Zart was born and raised around classic cars of the 1920s, but it wasn't until he drove an AC Propulsion eBox and a Tesla Roadster that the light went on. He has produced green mobility content for various CleanTech outlets since 2007 and found his home on CleanTechnica.
His passion for communication has led him to cover electric vehicles, autonomous vehicles, renewable energy, and test drives, and to produce podcasts, photography, and film for various international outlets in print and online. Nicolas offers an in-depth look at the e-mobility world through interviews and the many contacts he has forged in those industries.
His favorite taglines are: "There are more solutions than obstacles." and "Yesterday's Future Now"
