
How the World Is Finally Ready For Virtual and Augmented Reality

By Mike Nichols, VP, Content and Applications at SoftKinetic

The year is 1979, and Richard Bolt, a researcher at MIT, demonstrates a program that enables control of a graphical interface by combining speech and gesture recognition. As the video below demonstrates, Richard points at a projected screen image and issues verbal commands like “put that there” to control the placement of images within a graphical interface, in what he calls a “natural user modality”.

“Put-That-There”: Voice and Gesture at the Graphics Interface

Richard A. Bolt, Architecture Machine Group

Massachusetts Institute of Technology – under contract with the Cybernetics Technology Division of the Defense Advanced Research Projects Agency, 1979.

What Bolt demonstrated in 1979 was the first natural user interface. A simple pointing gesture combined with a verbal command, innate though it is to human communication, was and still is difficult for machines to interpret correctly. It would take another 30 years for a consumer product to appear that might just fulfill that vision.

A new direction

In the years following Richard’s research, technology would advance to offer another way to improve the Human Machine Interface (HMI). By the mid-1980s the mouse, a pointing device for 2D screen navigation, had evolved to provide an accurate, cost-effective, and convenient method for navigating a graphical interface. Popularized by Apple’s Lisa and Macintosh computers, and supported by Microsoft, the largest software developer, the mouse would become the primary input for computer navigation over the next 20 years.

“The Macintosh uses an experimental pointing device called a ‘mouse’. There is no evidence that people want to use these things.”

John C. Dvorak, San Francisco Examiner

In 2007, technology advancements helped Apple once again popularize an equally controversial device, the iPhone. With its touch sensitive screen and gesture recognition, the touch interface in all its forms has now become the dominant form of HMI.

The rebirth of natural gesture

Although seemingly dormant throughout the ’80s and ’90s, research continued to refine a variety of methods for depth and gesture recognition. In 2003 Sony released the EyeToy for the PlayStation 2. The EyeToy enabled Augmented Reality (AR) experiences and could track simple body motions. Then in 2006 Nintendo launched a new console, the Wii, which used infrared in combination with handheld controllers to detect hand motions for video games. The Wii controllers, with their improved precision over Sony’s EyeToy, proved wildly successful and set the stage for the next evolution in natural gesture.

In 2009 Microsoft announced Kinect for the Xbox 360, with its ability to read human motion to control games and the media user interface (UI) without the aid of physical controllers.

What Richard Bolt had demonstrated some 30+ years prior was finally within grasp. Since the premiere of Kinect we’ve seen more progress in the development of computer vision and recognition technologies than in the previous 35 years combined. Products like the Asus Xtion, Creative Senz3D, and Leap Motion have inspired an energetic global community of developers to create countless experiences across a broad spectrum of use cases.

The future’s so bright

To this day, Richard’s research speaks to the core of what natural gesture technology aims to achieve: that “natural user modality”. While advances in HMI have continued to iterate and improve over time, the medium for our visual interaction has remained largely unchanged: the screen. Navigation of our modern UI has been forced to work within the limits of the 2D screen. With the emergence of AR and VR, our traditional forms of HMI do not provide the same accessible input as the mouse and touch interfaces of the past. Our HMI must evolve to let users interact with the scene, not the screen.

Next, we’ll explore how sensors, not controllers, will provide the “natural user modality” that will propel AR and VR to become more pervasive than mobile is today. The answer, it seems, may be right in front of us…we just need to reach out and grab it.

I founded the Augmented Reality New York meetup (ARNY) exactly 4 years ago as a labor of love, and it developed a life of its own: attracting nearly 1500 members, introducing 200 Augmented Reality demos, helping advance AR in NYC, creating partnerships, helping AR enthusiasts find jobs, and spurring some fantastic AR startups.

In addition to our projects at the Bowery Wall (NYC) and Wynwood Walls (MIA), the launch of our free app coincides with the video release of our most recent project, which is a collaboration with MOMO to create an interactive digital mural in St. Louis:

Note: if you are unable to visit these cities, you can still trigger the augmented reality experience from the mural images on the Re+Public website.

While currently only available on mobile devices, the Re+Public app is a visionary initial step in the coming future of digitally augmented urban spaces that individuals will view and interact with through wearables. The Re+Public app will soon expand to include projects in more US locations and cities abroad.

A sincere thanks for your support and continuing to follow Re+Public. Please feel free to forward this email and help get the word out ;)

Introducing GVX, an Xtreme Reality head-mounted gaming console that adapts the game to your surroundings, allowing you to Live The Game.

Richmond Hill-based game technology company Sulon Technologies, Inc. (Sulon) announces their revolutionary new product, GVX, a head-mounted gaming console that offers avid gamers the freedom to play anywhere, whether indoors or outdoors. Sulon’s proprietary technology applies advanced Augmented Reality (AR) and Virtual Reality (VR) technology to create the most realistic and immersive gaming experience available, enabling players to have a Star Trek ‘holodeck’ experience in their own living space and bringing traditional tabletop games to life in 3D on any flat surface.

Sulon’s solution to a tired console market is a brand new gaming device that introduces innovative gameplay by delivering console-quality gaming on a mobile platform. Sulon is a team of experienced engineers, product development specialists, researchers and science fiction and gaming enthusiasts who have discovered and proven how to turn any physical environment into a “holodeck” zone. “I’ve always been interested in new technologies and how society is quick to absorb them into their everyday lives,” said Dhanushan Balachandreswaran, Founder and CEO of Sulon Technologies. “We are excited about creating technology that we originally thought to be fiction and turning it into a reality.”

The GVX system introduces the concept of Xtreme Reality (XR), defined as the one-to-one integration of the real world and the virtual world with the ability to scale across the whole spectrum from AR to full VR. XR blurs the lines between the real and virtual worlds to actually place the player into their game by adapting their entire physical environment into the game world. GVX uses complex and adaptive algorithms, high-end graphics processing, motion tracking and position tracking to achieve the XR experience.

The system has the unique ability to map the actual environment as the physical characteristics and boundaries of the game environment. Unlike any other device available, it applies AR and VR to transform any space (even outdoors!) into a completely new game environment in real time. It is a “wear and play” experience that expands a player’s gaming space from the area in front of their TVs and PCs to their entire home. GVX runs on the Android Operating System, where applications can be developed quickly and games can span the entire spectrum from casual to hardcore genres. XR games on GVX are classified as Active or Surface gaming.

Active games are adrenaline-inducing and interactive: the system conducts an accurate and rapid scan of the player’s entire environment to adapt it into the game world using sophisticated SLAM algorithms. GVX is also completely wireless, with all devices communicating via Bluetooth or Wi-Fi, giving users complete freedom of movement. Players can now live their video game by physically exploring and interacting with the virtual environment. With Active gaming, space limitations are not an issue, as the system is able to generate new graphics and scenarios for the same space multiple times during one gaming session, allowing for an endless number of new gaming experiences.

Surface games, on the other hand, are a creative throwback to traditional gaming, where games are augmented in 3D onto any flat surface. Players can watch their cities actively grow, wage virtual wars against other players around the world or enjoy a tabletop game with friends and family. Surface gaming provides a new reach for social gameplay and a new avenue and meaning for social interaction. Family game nights can be made possible even when family members are not physically present. “GVX really gives game developers creative freedom to take advantage of all the functions and capabilities of the system when designing games,” said Dhanushan Balachandreswaran.

What also makes GVX a game-changer is that Active and Surface gaming are not mutually exclusive and can be combined to create innovative gaming experiences. Jumanji is a great example of how Surface and Active gaming can be combined: the game board is augmented in 3D on a flat surface (the Surface gaming aspect), and game events that occur would require players to interact with the virtual environment (the Active gaming aspect). In addition to the XR experience of Active and Surface gaming, GVX is also highly flexible and capable of playing existing games such as PC games, games specialized for stereo VR, or mobile games like Angry Birds.

“The concept of mixed reality has been around for years but nobody has succeeded in making it real” said Jackie Zhang, Vice President of Research and Development at Sulon Technologies. “With the latest technologies and innovation, we have made this concept possible. It’s a disruptive product whose limitation is the bounds of your imagination.”

GVX also features a removable component (the GVX Player) that offers players a variety of gaming options. The GVX Player can be used on its own to play existing mobile games from the Google Play store, or connected to an HDTV and a Bluetooth controller for those who enjoy traditional gaming on their television. These gaming options on the GVX Player can be combined with the XR experience on GVX (i.e. Active and Surface gaming) to create a multitude of unique gaming experiences. A game can feature as many or as few of these gaming combinations as its design calls for. For example, a game could begin as a simple mobile game, but an in-game event may prompt the player to switch to Active gaming in order to fully experience the event. These flexible gaming options allow players to easily pick and choose their preferred gaming experiences.

GVX is also a consumer-friendly gaming device that eliminates many of the problems associated with adopting AR and VR technology. Surface games (the AR application) on GVX are hassle-free and do not require physical markers. VR gaming on GVX is safe and does not induce motion sickness. GVX lets players adjust the opacity of the rendered graphics: lowering the graphics opacity allows more of the player’s physical surroundings to appear in the game environment, ensuring safe gameplay and providing an alternative for players who are not ready for a full virtual environment. The opacity option, together with the wand controller’s one-to-one true motion tracking, prevents motion sickness because the player’s movement in the physical world matches their movement in the game world.
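Conceptually, the opacity control described above is standard alpha compositing: each displayed pixel is a weighted mix of the rendered game frame and the camera pass-through of the room. A minimal sketch in Python (the `blend` function and the pixel values are illustrative assumptions, not Sulon’s actual implementation):

```python
def blend(camera_px, virtual_px, opacity):
    """Alpha-blend a rendered (virtual) pixel over a camera (real-world) pixel.

    opacity=1.0 shows only the game graphics (full VR);
    opacity=0.0 shows only the physical surroundings (full pass-through).
    Pixels are (R, G, B) tuples of 0-255 ints.
    """
    return tuple(
        round(opacity * v + (1.0 - opacity) * c)
        for c, v in zip(camera_px, virtual_px)
    )

# A mid-grey wall pixel from the camera and a bright red game pixel:
wall = (128, 128, 128)
game = (255, 0, 0)

print(blend(wall, game, 1.0))  # full VR: pure game graphics
print(blend(wall, game, 0.0))  # full pass-through: the real wall
print(blend(wall, game, 0.5))  # mixed reality: half room, half game
```

Sliding `opacity` down from 1.0 gradually reveals the physical room inside the game world, which is the mechanism the paragraph above credits with keeping gameplay safe.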

“GVX is truly a game-changer in the gaming and mobile space. Never before has a gaming platform been able to successfully bring to life that Star Trek ‘holodeck’ experience in a simple and straightforward product, while freeing the user from the constraints of a wired environment,” comments Ken Wawrew, Chief Operating Officer at Sulon Technologies.

GVX developer kits are available for pre-order on the Sulon Technologies website.

Folks, here is my talk from ISTAS 2013 in Toronto on 29 July 2013. Following an intro to augmented reality, I review a collection of AR experiences and test them against Lex Ardez’s “3 laws of augmented reality design”.

1. Augmentation must emerge from the real world and/or relate to it

2. Augmentation must not distract from reality, but make you more aware of it

3. Do not use AR just for the cool factor; use it only for experiences that AR alone can deliver

These laws are simple and logical, yet most AR implementations fail to meet them. When it comes to defining an AR experience, law #3 is the most important: do not implement AR only for a cool factor. If a traditional interaction technique (on computers, mobile devices, etc.) does a good job, do not try to recreate it with AR. Look for the specific experiences that can only be achieved with AR, even if they are very niche.

The talk is based on my experience over the last 6 years building AR applications and reviewing practically every AR app published in that time frame. I have seen many applications with a wow factor that lasts for 2 minutes, but most applications are not used more than once. Designers and producers need to approach AR very differently from traditional user experiences. It’s important to understand that AR is about digitizing our interaction with the physical world. It should not be viewed as a traditional form of Human Machine Interaction (HMI), but rather as Human-World Interaction, which requires new thinking, new rules, and new experiences.

I believe that in the next few years we’ll see AR become an integral part of every aspect of our work and life, and it will completely change the way we interact with people, places and things. Of course traditional approaches (PCs, mobile touch) will still be best for certain things, and AR shouldn’t be forced onto things it’s not intended for, but it will create new categories of things we can’t even imagine. AR has the power to enable us to do things and feel things we couldn’t otherwise. It can help us learn and master skills instantly. AR technology has reached a “good enough” level; it is up to designers to bring it to the masses in a meaningful way.

The Auggie Awards™ have been promoting excellence in Augmented Reality since 2010. The 4th Auggie Award™ Competition is now expanding with exciting new categories. Here’s the full download on Why, What, Which, When, Where, Who and How…

Looking forward to your AWEsome submissions by the deadline on May 15th!

Why Auggie Awards?

The Auggie Award™ Competition is an initiative by AugmentedReality.org, the producer of Augmented World Expo – the world’s largest gathering of Augmented Reality professionals. The goal of the competition is to promote excellence in Augmented Reality across various platforms and categories. The competition is intended to bring the best in Augmented Reality from around the world to the public at large, and to recognize the best in AR: teams who develop technology and solutions that demonstrate how AR can empower humans to interact with the world in a more advanced, engaging, and productive way.

Consistent with AugmentedReality.org’s mission to educate, connect, and hatch new initiatives in the augmented world, and with its global goal of inspiring AR adoption toward 1 billion active AR users by 2020, AR.ORG is co-sponsoring this competition, offering award trophies and cash prizes.

What Are the Prizes?

AWE and its award sponsors are proud to offer combined cash prizes of over $20,000, and 6 coveted Auggie Trophies – each a one-of-a-kind handmade sculpture by a renowned artist.

Looking forward to your AWEsome submissions!

Momentum is rapidly building for Augmented World Expo™ (AWE), the world’s largest expo dedicated to the augmented world. Registration is trending toward 1000+ AR professionals and over 100 leading augmented reality innovators have confirmed participation in AWE 2013. The agenda for our conference is shaping up to make this edgy conference the most important event of 2013.

Register now with our Early Bird Rate, which expires in just one week (April 9th), and save $200!
After April 9th the full conference pass price will increase to $595.

If you’re interested in seeing what Augmented Reality is all about, you won’t want to miss this high-power meeting of the minds featuring 5 AWE-inspiring headliners, 35 hours of AWEsome sessions, shock-and-AWE activities, and an Expo that will leave you AWEstruck.

Keynote speakers confirmed: Bruce Sterling (“The prophet of AR”) and Will Wright (legendary creator of The Sims and reality gaming) will return with mind-blowing keynotes. We’ve also added Philip Rosedale (creator of Second Life), Tomi Ahonen (former Nokia executive and author of 12 best-selling books on mobile tech), and Steve Mann (an inventor who has been designing and using AR eyewear for over 30 years).

Start building your own customized schedule today by saving your preferred sessions from this amazing 110-speaker lineup to your calendar.

Top 3 Reasons to Attend AWE 2013

Network

1000+ AR professionals — find AWEsome products and partners

Learn

35 hours of AWE-Inspiring sessions by top industry innovators

Experience

100+ demos of “Augmented Humans in an Augmented World”

Augmented World Expo will showcase the best in augmented experiences covering all aspects of life: health, education, emergency response, art, media and entertainment, retail, manufacturing, brand engagement, travel, automotive, urban design, and more. It will be the largest exposition to bring together technologies for Augmented Humans in an Augmented World, including emerging interrelated technologies such as augmented reality, gesture interaction, wearable computing, smart things & robotics, cloud & big data, and 3D printing.

The AWE 2013 program includes:

Pre-Event Tutorial (Mon. pm) – review and comparison of the best AR SDKs

The ARt Gala – AR art displays by top artists and a food and drinks reception

The Startup Launch Pad – Showcase for innovative startups and a competition

The Conference Expo – Leading AR companies display their products in the expo

AWE 2013 will be held at the Santa Clara Convention Center, on June 4-5, 2013.
Don’t miss out on our special Early Bird Rate. Register now

About AugmentedReality.org

AugmentedReality.org is a global not-for-profit organization dedicated to advancing and promoting the true potential of augmented reality. As a trusted partner of its members and supporters, AR.org facilitates and catalyzes the global and regional transformation of the AR industry. AR.org also owns and produces the largest international event for augmented reality and the global forum for AR innovation: Augmented World Expo (formerly Augmented Reality Event). All profits from the event are reinvested into AR.org’s industry services.