The technology’s ability to combine real-world images with the vast amounts of online data in real time promises to change the way work is done across many industries. But short-term implementation challenges remain.

Looks like there may be some consolidation in the augmented reality scene: TechCrunch reports that the U.K.’s Blippar has bought Dutch rival Layar. Both companies focus heavily on bringing print ads to virtual life. Layar is a real AR veteran; five years ago its original, non-marketing-centric app did a lot to popularize the concept (at least among geeks). Now both Blippar and Layar are trying to make AR finally take off through the use of Google Glass. If the deal’s real — I’ve been unable to get confirmation — I wonder what will happen to Layar’s interoperability pact with Metaio and Wikitude.

Former Autonomy boss Mike Lynch has invested in a company called Neurence through his Invoke Capital fund. Based in Cambridge, the firm was itself formed by former employees of Autonomy’s Aurasma augmented-reality (AR) division after HP bought Autonomy. Neurence has a new iPhone AR app called Taggar, which lets users overlay their own photos, videos and stickers on top of real-world objects, for viewing by other users when they hold their phone up in front of the object. Invoke’s previous investment, in September, was a Cambridge security firm called Darktrace.

For teachers, it can be tough to tell when students are actually absorbing new information. They’re often so focused on the materials, and on trying to keep the class alert, that they can easily miss the body language that suggests a student is completely lost.

Researchers at Universidad Carlos III de Madrid have developed the Augmented Lecture Feedback System (ALFs), a head-up-display glasses interface that teachers can wear while giving a lecture. On the display, the teacher sees small icons above each student’s head indicating that student’s comprehension of the lesson, along with an overall chart showing how many students are “getting it” relative to the rest of the class. The supported symbols also include a request for the lecturer to slow down and a notification that a student knows the answer to a question posed in class.

Students set their status by connecting their smartphones to the server where the system is installed and selecting the appropriate symbols.
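The aggregation behind that overall chart is simple to picture. Here is a rough sketch (the symbol names and data shapes are my own assumptions for illustration, not ALFs’ actual protocol): the server tallies each student’s selected symbol and computes the share of the class that is following.

```python
from collections import Counter

# Hypothetical status symbols a student might select; the article mentions
# comprehension, "slow down", and "I know the answer" as supported signals.
STATUSES = ["understood", "lost", "slow_down", "knows_answer"]

def class_overview(student_statuses):
    """Aggregate per-student statuses into the lecturer's overall chart:
    a count per symbol plus the fraction of the class 'getting it'."""
    counts = Counter(student_statuses)
    total = len(student_statuses)
    getting_it = counts["understood"] / total if total else 0.0
    return counts, getting_it

# Example: four connected students, two of whom report they understand.
counts, ratio = class_overview(["understood", "lost", "understood", "slow_down"])
```

The per-student icons would then be rendered above each head from `counts`’ underlying per-student data, while `ratio` drives the class-wide chart.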

The prototype was developed with a hacked Kinect, which uses recognition to place the markers above each student’s head, and a heavier AR display connected directly to the computer that runs the software.

“It is hoped that in the next few years new models will come onto the market and these will be suitable for use in class, as might be the case with the new Google glasses, which could be adapted to this system,” UC3M researcher Ignacio Aedo said in a university article on the project.

For now, this is just a lab project — a proof-of-concept device, though the researchers have tried it in some university classes. But the project could point to an interesting use case for Google Glass — one that provides in-class help to students without disrupting the traditional lecture system or punishing those without a display.

On Thursday, Metaio, the Germany-based augmented reality (AR) company, announced a deal with ST-Ericsson under which the latter will integrate a specialized AR processor into the next generation of its mobile chipsets.

This will be the first dedicated chip of its kind to see commercial deployment, and it should have a big impact on the power consumption of AR applications, which are today generally a big battery-suck due to their intensive use of graphics and, increasingly, 3D rendering. As Metaio CEO Peter Meier put it in the statement:

“The AREngine will do for augmented reality what the GPU did years ago for the gaming industry. This is a great leap in the AR space, and we strongly believe that the AR Engine working with ST-Ericsson platforms will help realize the augmented city — the idea of a completely connected environment powered by augmented reality and made possible with next-gen, optimized mobile platforms.”

Here’s the video the companies put out. Notice the emphasis on the use of the AREngine chip in smartphones:

[youtube http://www.youtube.com/watch?v=6br7NreTwD4&w=560&h=315]

That emphasis on handsets is understandable because ST-Ericsson’s business today is largely in smartphone chipsets – it is surely no coincidence that ST-Ericsson is supposedly going to be supplying its NovaThor chipsets to Nokia, which takes great pride in the CityLens AR app that runs on its Lumia handsets.

However, while AREngine may make use of such apps slightly more attractive on smartphones, I don’t think power consumption is the main reason why people don’t walk around constantly holding their phone at arm’s length in front of them. Here are three far more likely reasons: it looks absurd, it’s dangerous, and it represents poor ergonomics.

That’s not to say AR is useless – far from it; it’s occasionally handy today and I believe there are many cool applications lying on the other side of a tipping point we’ve not yet reached. It’s just that, with smartphones, AR makes the most sense in short bursts, like when you actively need to establish the direction in which you should next walk. And power’s less of an issue there.

Where the AREngine processor would be superbly useful is in smart glasses, of the Google Glass ilk. These devices will be the real tipping point for AR – they remove the absurdity, danger and poor ergonomics of physically and consciously holding something out in front of you as you walk.

And as such wearables get redesigned to make their users look less like tools, their sleeker, skinnier new look will mean less battery space. Combine that with the fact that such devices will need to constantly display AR data, and Metaio and ST-Ericsson’s technology becomes a no-brainer.

Astute mobile application vendors are bringing to market applications that help mobile users connect and interact with people in close proximity. We expect this emerging market — what we call proximity-based mobile social networking — to grow to $1.9 billion in revenues by 2016.

It may be Thanksgiving weekend over in the U.S., but at Disneyland Paris it’s Minecon, a conference devoted to all things Minecraft. And, in a glorious collision of gaming and next-level augmented reality, 13th Lab is using the opportunity to show off its latest capabilities in an iOS app, developed alongside Mojang, called Minecraft Reality.

For those who need reminding, Minecraft is to digital gaming as Lego is to physical gaming – a sandbox effort that lends itself as much to the demonstration of engineering prowess as it does to standard gameplay. 13th Lab, which also hails from Sweden, is an augmented reality outfit that uses a technique called simultaneous localization and mapping (SLAM), also employed in autonomous vehicles such as the Mars Rover.

Minecraft Reality isn’t the first attempt to bring Minecraft worlds into the real world through augmented reality, but it’s certainly the most advanced. As this video shows, Minecraft addicts can insert their creations into their environment in a pretty fixed, non-floaty way – all the way from small objects up to entire buildings:

[youtube http://www.youtube.com/watch?v=2pOpcR7uf5U&w=560&h=315]

The $1.99 app comes with some Minecraft creations preloaded, but people can upload their own for other users to see when they pass the chosen location.

Now, this isn’t just a one-off for 13th Lab. Indeed, in many ways it’s a demonstration of the company’s new Pointcloud SDK, also launched on Saturday. The SDK is free to download for any developer who wants to build this kind of AR functionality into their iOS app (Android’s on the horizon too) and, according to 13th Lab CEO Petter Ivmark, it’s a vast improvement on the previous iteration.

“Before we had a version which is pretty limited in scale and performance compared to this new one,” Ivmark told me. “We’ve taken everything up a notch, in terms of how big an area you can map. Now we can do full rooms, big areas like that. We’re working to scale this indefinitely, basically. There are a lot of valuable things you can think about if you’ve got sub-centimeter accuracy in the real world.”

The company, which also produces a Pointcloud browser for iOS, has pretty big ambitions. Ivmark talks of making a mobile device’s camera even more important than the GPS chip when it comes to navigating the world. The technology certainly must have impressed Mojang – there have been very few partnerships around Minecraft, despite the game’s extreme popularity. (One, amazingly, is with the United Nations.)

Two recent AR implementations show promise, however, by adding value and actually demonstrating how useful the technology can be. One is Converse’s Sampler application for Apple iOS devices. From practically any location, you can shop for Converse sneakers using this free software. “OK,” you say, “I’ve been able to do that for years, and not just for sneakers.” You’d be correct, but this app leverages AR beyond simple browsing for shoes: you can actually see how a virtual sneaker looks on your foot. How? The software superimposes Converse sneakers over the live image of your feet, captured from an iPhone’s camera sensor.
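The core trick in a try-on app like that is ordinary alpha compositing: each camera frame is blended with a rendered sneaker image using the standard “over” operator. A minimal pure-Python sketch of that blend (the function names and the list-of-RGB-tuples frame format are illustrative assumptions, not Converse’s actual code):

```python
def over(fg, bg, alpha):
    """Standard 'over' alpha compositing for one RGB pixel:
    result = alpha * foreground + (1 - alpha) * background."""
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

def composite(frame, sprite, mask, x0, y0):
    """Paint `sprite` (with per-pixel alpha `mask`) onto `frame` in place
    at offset (x0, y0). Both are 2D lists of (R, G, B) tuples."""
    for dy, row in enumerate(sprite):
        for dx, pixel in enumerate(row):
            a = mask[dy][dx]
            frame[y0 + dy][x0 + dx] = over(pixel, frame[y0 + dy][x0 + dx], a)
    return frame

# Example: blend a single half-transparent red "sneaker pixel" onto a
# two-pixel black frame; the second pixel is left untouched.
frame = [[(0, 0, 0), (0, 0, 0)]]
composite(frame, [[(255, 0, 0)]], [[0.5]], 0, 0)
```

A real app does this on the GPU per frame, anchored to a tracked foot position, but the per-pixel math is the same.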

For now, it’s not technology holding back adoption, because smartphones and other mobile devices are more than capable of enhancing the real world with virtual add-ons. The problem appears to be one of finding appealing uses to spur adoption of augmented reality solutions. Previewing virtual clothes and sneakers on our real bodies, however, just might be the application to help AR hit its stride.

Augmented reality (AR) apps (like WorkSnug, a neat app I wrote about last year that helps you find good places to work nearby) are becoming more commonplace. GigaOM has published a great set of infographics summarizing the main AR apps, how they work, and the benefits they provide.

Augmented Reality (AR) is one of those cool technologies (subscription required) that fascinates me. The ability to merge virtual content with what is seen through a phone camera, and to leverage it in useful ways, is awesome. When the cool factor settles down, I’m left with the reality of AR: finding real-world uses for it that add benefit. Car Finder by Intridea is a good example of a real benefit that AR can bring the user in everyday life.

How many of us have parked our car in a huge parking lot, and then searched for it when we return? It can be especially challenging at sporting events, when all you can see is what seems like miles of cars between you and your car. Car Finder eliminates that frustration by using the iPhone 3GS camera to show you where you left your car. You tell the program where you are when you park the car, and when you return it shows you where the car is, superimposed over those miles of other cars. It’s a useful app with a real-world benefit, and it only costs you 99 cents.