So with these two monikers under my belt, I know anything relating to Xbox Kinect would be natural for me to love. For anyone who hasn’t heard of it, the new Intel RealSense technology is similar to Kinect, in that it combines an RGB camera with 3D-mapping technology so that users can interact with their computers through gesture recognition.

If this sounds familiar, you may have heard of something called Perceptual Computing. While some truly amazing things came out of that effort last year, it was honestly a trial balloon of sorts, to see if the concept had legs. But when we took a hard look at what we did and what our developer friends were able to come up with, we realized we were really on to something!

In all honesty, most of the first iterations of software created using Perceptual Computing (which has since been rebranded as Intel® RealSense™ technology) were games. While a guy like me, a gamer and a geek, loves this stuff, as I walked around Augmented World Expo it quickly became apparent that RealSense would be used for more than just “fun & games.” When you combine it with other technologies such as augmented reality, virtual reality, and 3D printing, things really start to get interesting!

Take augmented reality: RealSense helps make it much more useful. For example, imagine a “Magic Mirror” in a ladies’ dressing room. A woman could wear a bikini or undergarments, and the “mirror” (actually a large display connected to a RealSense camera) could overlay outfits on her based on her selections, not just the outfits available in her size, in that store, at that time. This would be a whole new paradigm for clothes shopping: women could pick the styles they want and then get perfectly fitting clothes. Because RealSense would know every measurement down to the millimeter, bespoke tailoring could become the norm, with clothing shipped within a few days. In fact, once she gets her hands on her own RealSense-enabled All-in-One or 2 in 1, she could even go shopping in the comfort of her own bedroom, potentially right through a web browser. How cool would that be?

Virtual reality, when connected with RealSense systems, could also make our experience of the world much richer. Most virtual reality systems today are based on worlds that developers build by hand. Imagine how much easier it would be if those worlds could simply be based on scans made by a RealSense-enabled robot rolling through, say, the Louvre. It would be like the difference between walking through the museum with a video camera and trying to capture the experience by painting a picture, taking two steps, then painting a new picture! Not only could thousands (if not millions) more people “experience” the Louvre, but it could also be enjoyed by bed-ridden senior citizens who might never have been able to travel there.

Then there is the world of 3D printing. With RealSense, this technology not only brings the concept of “copy & paste” to the real world, it also enables a kind of “physical Photoshopping*” (Photoshop is a registered trademark of Adobe, but the verb form is commonly used to describe improving a picture with photo-editing software). For example, imagine a plastic handle that is the wrong size for a person’s hand. You could unscrew the handle, scan it in detail with a RealSense-enabled system, then scale it larger or smaller for a perfect fit. Or you could instantly scan a beloved family pet so that a college-bound freshman could bring a reminder of home to a sterile and unfamiliar dorm room.
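For the curious, the resize step described above boils down to a uniform scale of the scanned geometry before sending it to the printer. Here is a minimal sketch in Python; the vertex data and function name are hypothetical for illustration, not part of any real RealSense SDK:

```python
# Hypothetical sketch: uniformly scaling a scanned mesh before 3D printing.
# A scanned object is just a cloud of (x, y, z) vertices; resizing it for a
# better grip means multiplying every coordinate by the same factor.

def scale_mesh(vertices, factor):
    """Scale each (x, y, z) vertex about the origin by `factor`."""
    return [(x * factor, y * factor, z * factor) for (x, y, z) in vertices]

# A toy "handle" of three vertices, enlarged by 10% for a larger hand.
handle = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 2.0, 0.5)]
bigger = scale_mesh(handle, 1.10)
```

A real pipeline would scale a full triangle mesh (thousands of vertices plus face data) and re-export it to a printable format such as STL, but the core operation is exactly this multiply.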

Don’t get me wrong, there will be plenty that Intel® RealSense™ technology will enable all on its own. Not only could people enjoy more intuitive gaming, like we’ve already seen, but facial expressions could be loosely interpreted, so that when a user looks puzzled, the software could automatically open a help window. Then there are situations where, say, a surgeon, a baker, or a sculptor wants to interact with software without TOUCHING anything: whether a surgeon wants to adjust a medical scan or a baker wants to turn a page of a virtual cookbook, both would be possible. In the video below, I show how children can bring their real-life world into a computer game:

While the above example is very simple, you can imagine a more complex scenario where a toy company focused on creativity could let kids build a space ship with their product, then fly that space ship they created through an entire, imaginary world. The possibilities are nearly endless, but blog posts never should be, so I’ll end this post with a simple question:

What Do YOU Think Would Be the Biggest Opportunity for RealSense?

(post an Answer in the comments below or let me know on Twitter at: @CaptGeek)


About Eric Mantion

Eric is the Nerd Herder for Intel's Software & Services Group, responsible for the Intel RealSense Community.
You can find him online at:
- www.Twitter.com/CaptGeek
- www.Linkedin.com/in/ericmantion
Eric graduated from the US Naval Academy with a Bachelor of Science in Physics. He has served on nuclear-powered submarines and was attached to SEAL Delivery Vehicle Team ONE. He has worked in the semiconductor industry for over 10 years and has held positions as a Product Marketing Engineer, a Senior Industry Analyst, a Competitive Intelligence Analyst, and a Technology Evangelist. In whatever spare time he has, Eric loves working out (especially beach volleyball), tinkering with computers, dabbling with Linux, exploring Android, and playing video games (especially StarCraft II & DOTA 2).