Virtual bears, futuristic pods and spaghetti rain in Tokyo

It might earn them university credits, but a robot developed by students in Japan won't help dispel the image of the lazy student. The robot has been designed to fold clothes.

The bot, called "Foldy," works with a camera mounted above a flat surface. An item of clothing is laid out on the surface and an image of it is captured on a PC. The robot's operator then draws fold lines on the image. Once that's done, it's just a case of clicking a button and Foldy goes to work.
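Geometrically, executing one of those drawn fold lines amounts to reflecting the part of the garment outline on one side of the line onto the other side. The sketch below illustrates that idea; it is a hypothetical helper (the actual Foldy software is not public), assuming the garment outline is a list of 2D points and the fold line is given by two points.

```python
def fold_across(points, a, b):
    """Simulate one fold: reflect every outline point lying to the left
    of the directed fold line a -> b onto the other side; points on the
    right stay where they are. Hypothetical helper, not Foldy's code."""
    ax, ay = a
    dx, dy = b[0] - ax, b[1] - ay      # direction of the fold line
    d2 = dx * dx + dy * dy
    folded = []
    for px, py in points:
        vx, vy = px - ax, py - ay
        if dx * vy - dy * vx > 0:      # cross product: point is left of the line
            t = (vx * dx + vy * dy) / d2   # project the point onto the line
            fx, fy = ax + t * dx, ay + t * dy
            folded.append((2 * fx - px, 2 * fy - py))  # mirror across it
        else:
            folded.append((px, py))
    return folded
```

Chaining several such reflections, one per drawn line, would turn the flat outline into the folded result the operator sketched.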

The wheeled robot is about the size of a shoe box and has a pair of long grippers that it uses to grab the clothing and fold it. A 2D bar code sits prominently on top of the robot and allows the PC to control it by tracking its position and movements through the camera.
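Tracking a robot through an overhead camera this way typically means locating the corners of its marker in each frame and deriving a position and heading from them. A minimal sketch of that step, assuming the four corner pixels have already been detected and are listed front-left, front-right, rear-right, rear-left:

```python
import math

def marker_pose(corners):
    """Estimate position and heading from the four detected corner
    pixels of a top-mounted 2D bar code. Assumes corners are ordered
    front-left, front-right, rear-right, rear-left in the image."""
    cx = sum(x for x, _ in corners) / 4.0   # marker centre = robot position
    cy = sum(y for _, y in corners) / 4.0
    # Heading: direction from the centre to the midpoint of the front edge.
    front = ((corners[0][0] + corners[1][0]) / 2.0,
             (corners[0][1] + corners[1][1]) / 2.0)
    heading = math.atan2(front[1] - cy, front[0] - cx)
    return (cx, cy), heading
```

With the pose known in image coordinates, the PC can compare it against the drawn fold lines and steer the grippers accordingly.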

The robot was developed by students at Keio University as part of the Japan Science and Technology Agency's Erato project for advanced research. It was one of several futuristic prototypes on show over the weekend at the Digital Contents Expo in Tokyo.

Providing a bridge between the real and virtual worlds at the show was a small ring with haptic feedback. With the ring on a finger it's possible to touch a virtual creature.

The user wears the ring and sits in front of a PC screen showing a live image of the desk area immediately in front of them. A virtual creature -- in the demonstration, a small bear -- is added to the image. Also hooked up to the PC and mounted above the desk are two Wii remote controls that act as sensors, monitoring infrared lamps on the ring to determine its position.
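With two sensors at known positions, each reporting the direction toward the ring's IR lamp, the ring's position follows from intersecting the two bearing rays. This is a generic triangulation sketch in 2D, not the lab's actual code:

```python
import math

def triangulate(p1, a1, p2, a2):
    """Locate an IR lamp by intersecting two bearing rays: p1 and p2
    are the sensor positions, a1 and a2 the bearing angles (radians)
    each sensor reports toward the lamp."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    # Solve p1 + t * d1 = p2 + s * d2 for t, then walk t along ray 1.
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

The real setup works in three dimensions, but the principle is the same: two known viewpoints and two measured directions pin down one point in space.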

As the user brings his or her finger to touch the bear a small motor on the ring causes vibration to simulate the sensation of touch, said Shoichi Hasegawa, a professor who leads the Tokyo Institute of Technology laboratory where the project was developed.

The project went beyond the haptic feedback ring and also tried to simulate cute responses from the virtual bear. Its movements are computed in real time depending on the interaction with the user, and it will do things like touch the user's finger with its paw or follow the finger with its eyes.

A simulation of a different kind has been developed by Osaka University students. The "Funbrella" takes a regular umbrella and mounts a speaker inside, at the point where the handle joins the umbrella's canopy. Because the speaker vibrates when it produces sound, it can be used to simulate both the sound and the feel of holding an umbrella during a rainstorm.

The development team recorded the sound and vibrations of a real rainstorm and used that data to build their simulation. They also demonstrated more extreme and unusual types of weather, such as spaghetti or rubber snakes falling from the sky. With each hit of a snake the umbrella shuddered and a thud was heard.
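Driving a speaker from recorded impact data essentially means summing a short, decaying "thud" waveform into an audio buffer at each impact time. A minimal sketch of that synthesis, with made-up parameter values rather than the team's measured ones:

```python
import math

def rain_signal(drop_times, sample_rate=8000, duration=1.0,
                decay=40.0, freq=120.0):
    """Build one audio buffer that sums a short, exponentially decaying
    sine 'thud' for each impact time in drop_times (seconds). The same
    buffer drives both the audible sound and the speaker's vibration.
    Parameter values are illustrative, not the team's measurements."""
    n = int(sample_rate * duration)
    signal = [0.0] * n
    for t0 in drop_times:
        start = int(t0 * sample_rate)
        for i in range(max(start, 0), n):
            t = (i - start) / sample_rate          # time since this impact
            signal[i] += math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
    return signal
```

Swapping in heavier, slower pulses would give the shudder-and-thud of the rubber snakes rather than the patter of rain.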

Perhaps the most unusual looking project on show was the "Media Vehicle," a virtual reality pod developed by Tsukuba University. When closed, the device appears to be two large, overlapping white spheres on legs. Inside are a chair and a projector.

When the pod is closed, the inside of the top sphere sits right in front of the occupant. It acts as a screen for the projector, which is fed video from a camera with a fisheye lens. On a regular monitor the image appears rounded and distorted because of the lens, but when projected onto the curved surface inside the pod it appears more natural and immersive.
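The reason the distortion cancels out is that a fisheye lens maps viewing angle to image radius, and a dome screen maps it back. A sketch of the underlying mapping, assuming an idealized equidistant ("f-theta") fisheye, which is a common simplification rather than the pod's documented optics:

```python
import math

def fisheye_to_direction(x, y, cx, cy, r_max, fov=math.pi):
    """Map a pixel in an equidistant fisheye image to the 3D viewing
    direction it represents: distance from the image centre is
    proportional to the angle off the optical axis. (cx, cy) is the
    image centre, r_max the image-circle radius, fov the lens's full
    field of view in radians."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    theta = (r / r_max) * (fov / 2)    # angle from the optical axis
    phi = math.atan2(dy, dx)           # angle around the axis
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

Projecting each pixel back out along its direction onto a spherical screen restores the scene's true angles, which is why the image looks natural inside the pod but warped on a flat monitor.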

Sensors are attached to the camera so that motors on the pod's legs can imitate the camera's tilts and dips, adding to the virtual reality sensation.

The pod, like the other exhibits, is among the latest prototypes from Japanese university research.

Copyright 2017 IDG Communications. ABN 14 001 592 650. All rights reserved. Reproduction in whole or in part in any form or medium without express written permission of IDG Communications is prohibited.