Day 2 of Google I/O: 'Lookout' for advanced AR and VR

After some awe-inspiring announcements on the use of Artificial Intelligence (AI) on the first day of the 11th edition of Google I/O, the internet giant's flagship conference, we now move to the surprises of the second day.

Google CEO Sundar Pichai (Image: Flickr)

On Day 2, Google announced that it has entered into an agreement to acquire Israel-based Velostrata, a leader in enterprise cloud migration technology. With Velostrata, Google Cloud customers gain two important benefits: they will be able to adapt their workloads on-the-fly for cloud execution, and they can decouple compute from storage without performance degradation. More details on the acquisition are yet to be shared.

It also reiterated its focus on machine learning (ML), which is now at the core of many of Google’s products. In this context, Google talked about TensorFlow, which the company claims has been an essential component in the work of scientists, researchers, and even high school students around the world. Google says,

"At the I/O, we are hearing from some of these people, who are solving big (we mean, big) problems like the origin of the universe, that sort of stuff."

Moving further, here are the key announcements and innovations from Day 2:

Open-source and smart Samsung Chromebooks

Last year at Google I/O, the company announced its plan to bring the Google Play store to Chromebooks. Chromebooks come with several smart features, such as simple automatic updates, built-in virus protection, and the ability to run multiple online and offline apps simultaneously.

Keeping the promise, Google officially announced on Wednesday that it is working with Samsung to bring the Play store to the Chromebook Pro, which will be available in the market in April 2019.

The official statement said,

"This year we’re making it possible for you to code on Chromebooks. Whether it’s building an app or writing a quick script, Chromebooks will be ready for your next coding project."

Augmented Reality (AR) all around

Three months ago, Google launched ARCore, its homegrown Augmented Reality technology for building different AR experiences. At I/O, it rolled out a major update to ARCore to help developers build more collaborative and immersive augmented reality apps. As part of this, with an application like Human Anatomy Atlas, users can visualise and learn about the intricacies of the human nervous system in 3D.

With the free-to-download Magic Plan app, users can create a floor plan for their next remodel just by walking around the house. The technology is based on multiple patented techniques leveraging the gyroscope.

The company said,

"ARCore now features Vertical Plane Detection, which means you can place AR objects on more surfaces, like textured walls. This opens up new experiences like viewing artwork above your mantelpiece before buying it. And thanks to a capability called Augmented Images, you’ll be able to bring images to life just by pointing your phone at them - like seeing what’s inside a box without opening it."

Developers can now start building with these new capabilities and try them out in ARCore-enabled apps on the Play Store.

Lookout: App for the visually impaired

Google said that accessibility is an ongoing priority for the company, and Lookout is one step towards helping visually impaired people gain more independence by understanding their physical surroundings.

The core experience of the application is processed on the device, which means the app can be used without an internet connection. Lookout will soon be available on Google Pixel devices; Google recommends wearing the device on a lanyard around the neck, or in a shirt pocket, with the camera pointing away from the body. Lookout delivers spoken notifications and is designed to be used with minimal interaction, allowing people to stay engaged with their activity.

There are four modes to choose from within the app: Home, Work & Play, Scan, and Experimental.

The statement read:

"If you are getting ready to do your daily chores, you’d select 'Home' and you’ll hear notifications that tell you where the couch, table or dishwasher is. It gives you an idea of where those objects are in relation to you, for example 'couch 3 o’clock' means the couch is on your right. If you select 'Work & Play' while heading to the office, it may tell you when you’re next to an elevator, or stairwell. As more people use the app, Lookout will use machine learning to learn what people are interested in hearing about, and will deliver these results more often."

Virtual Reality (VR) tours now in the hands of students

The Google Expeditions programme lets students take virtual reality trips to over 200 places, including Buckingham Palace, the underwater world of the Great Barrier Reef, and Dubai’s Burj Khalifa, the tallest building in the world. Since 2015, Google Expeditions has brought more than three million students to these places with virtual reality (VR) and augmented reality (AR), and both teachers and students have told Google that they would love a way to share their own experiences in VR as well.

As a result, Google has introduced Tour Creator, which will enable students, teachers, and anyone with a story to tell to make a VR tour using imagery from Google Street View or their own 360-degree photos. Once a student or a teacher has created a tour, they can publish it on Poly, Google’s library of 3D content.

The company added,

"From Poly, it’s easy to view. All you need to do is open the link in your browser or view in Google Cardboard. You can also embed it on your school's website for more people to enjoy. Plus, later this year, we’ll add the ability to import these tours into the Expeditions application."