People who are blind tend to adopt sequential, route-based strategies for moving around the world. Common strategies take the self as the main frame of reference, but those who perform better in navigational tasks use more spatial, map-based strategies, and training in such strategies can improve performance. Virtual environments have great potential for allowing people who are blind to explore new spaces, reducing their reliance on guides, and aiding the development of more efficient spatial maps and strategies. Importantly, Lahav and Mioduser have demonstrated that, when exploring virtual spaces, people who are blind use more (and different) strategies than when exploring real physical spaces, and develop relatively accurate spatial representations of them. The present paper describes the design, development and evaluation of a system in which a virtual environment may be explored by people who are blind using Nintendo Wii devices, with auditory and haptic feedback. This technology has many advantages, not least that it is mainstream, readily available and cheap. The utility of the system for exploration and navigation is demonstrated, and results strongly suggest that it supports the development of spatial maps and strategies. Intelligent support is also discussed.

This paper discusses preliminary findings on user preferences regarding video game and VR game-based motor rehabilitation systems within a physical therapy clinic for patients with SCI, TBI and amputation. The video game and VR systems chosen for this research were the Sony PlayStation® 2 EyeToy™, the Nintendo® Wii™, the Novint® Falcon™, and an optical tracking system developed at the Institute for Creative Technologies at the University of Southern California. The overall goals of the current project were to 1) identify and define user preferences regarding the VR games and interactive systems; 2) develop new games, or adapt the current USC-ICT games, to address the user-defined characteristics that were most enjoyable and motivating to use; and 3) develop and pilot test a training protocol aimed at improving function in each of the three groups (TBI, SCI and amputation). This paper discusses the first of these goals.

Virtual Cane / Guide Dog
- WiiMote can be used as a pointing device, and can give auditory, visual and haptic (it rumbles) feedback
- Virtual cane: uses auditory and haptic feedback
- Support with an intelligent agent which gives spoken warnings and advice
- Combine to create a virtual guide dog
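To make the virtual cane idea concrete, here is a minimal Python sketch of one way the feedback could be scaled: obstacles nearer the cane tip give stronger rumble and a higher audio pitch. All names, the 5 m range and the pitch band are my own illustrative assumptions, not anything from the actual system.

```python
# Hypothetical sketch: map the distance from a virtual cane tip to the
# nearest obstacle onto rumble strength and audio pitch. Closer obstacles
# give stronger rumble and higher pitch.

MAX_RANGE = 5.0  # metres the virtual cane can "reach" (assumed)


def cane_feedback(distance):
    """Return (rumble_strength, audio_pitch_hz) for an obstacle at
    `distance` metres; no feedback beyond MAX_RANGE."""
    if distance >= MAX_RANGE:
        return 0.0, 0.0
    proximity = 1.0 - distance / MAX_RANGE  # 0.0 (far) .. 1.0 (touching)
    # The WiiMote's rumble is really on/off, so in practice this value
    # would drive a pulse duty cycle rather than an analogue strength.
    rumble = proximity
    pitch = 220.0 + proximity * 660.0       # 220 Hz (far) .. 880 Hz (near)
    return rumble, pitch
```

An intelligent agent layer (the "guide dog") could then sit on top of this, speaking warnings only when the raw feedback crosses some threshold.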

Tuesday, 21 October 2008

Our paper is now available in the ACM Electronic Library. If you're unable to access it there, you can send me an email or a message here and I'll get a copy to you. Please send an email to G.White at my university, sussex.ac.uk

@inproceedings{1413663,
  author    = {Gareth R. White and Geraldine Fitzpatrick and Graham McAllister},
  title     = {Toward accessible 3D virtual environments for the blind and visually impaired},
  booktitle = {DIMEA '08: Proceedings of the 3rd international conference on Digital Interactive Media in Entertainment and Arts},
  year      = {2008},
  isbn      = {978-1-60558-248-1},
  pages     = {134--141},
  location  = {Athens, Greece},
  doi       = {http://doi.acm.org/10.1145/1413634.1413663},
  publisher = {ACM},
  address   = {New York, NY, USA},
  abstract  = {3D virtual environments are increasingly used for education, business and recreation but are often inaccessible to users who are visually impaired, effectively creating a digital divide. Interviews with 8 visually impaired expert users were conducted to guide design proposals, and a review of current research into haptics and 3D sound for auditory displays is presented with suggestions for navigation and feedback techniques to address these accessibility issues. The diversity and volatility of the environment makes Second Life an unusually complex research object, suggesting the applicability of our work for the field of HCI and accessibility in 3D virtual environments.}
}

Tuesday, 23 September 2008

Last week I returned from Athens where I presented our paper at DIMEA 2008. I thought the presentation went well, with about 15-20 people in the audience. There were a few good questions at the end, where we discussed, amongst other things, the use of synthetic versus naturalistic sounds for 3D spatialisation from a semiotic point of view, and the potential of formal, structured data like VRML for environments like Second Life.

The presentation itself dealt with 3D virtual environments more generally than the paper, which uses SL as a case study. I demonstrated screen readers and AudioQuake as an example of 3D sonification in an audio game. All presentations in the conference were arranged by theme, and my paper was included in the track called "Social and Collaborative Spaces", so the emphasis of my talk was to raise awareness of the issues for blind and visually impaired users in these environments, and also to call into question just how "social and collaborative" they are when they exclude a sector of society. Following the previous day's excellent keynote by Professor Michael Meimaris, in which he talked about "Digital Natives" and "Digital Immigrants", I coined the phrase "Digital Outcasts" in the context of rapidly evolving but inaccessible technology.

Please get in touch if you'd like a copy of the paper or presentation slides.

Friday, 4 July 2008

As referred to in a previous post, some of the IBM folk have developed an accessible game called PowerUp which is described in the following paper,

@inproceedings{1358752,
  author    = {Shari M. Trewin and Mark R. Laff and Anna Cavender and Vicki L. Hanson},
  title     = {Accessibility in virtual worlds},
  booktitle = {CHI '08: CHI '08 extended abstracts on Human factors in computing systems},
  year      = {2008},
  isbn      = {978-1-60558-012-X},
  pages     = {2727--2732},
  location  = {Florence, Italy},
  doi       = {http://doi.acm.org/10.1145/1358628.1358752},
  publisher = {ACM},
  address   = {New York, NY, USA},
  abstract  = {Virtual worlds present both an opportunity and a challenge to people with disabilities. Standard ways to make such worlds accessible to a broad set of users have yet to emerge, although some core requirements are already clear. This paper describes work in progress towards an accessible 3D multi-player game that includes a set of novel tools for orienting, searching and navigating the world.}
}

My first impressions suggest that it's quite similar to Second Life in some respects (fixed name lists, Orientation Center).

Thursday, 3 July 2008

Although our project's finished, I've just coincidentally come across some interesting papers that seem relevant to our work,

Desurvire, H. & Wiberg, C. (2007). Master of the Game: The Crucial Role of Accessibility in Future Game Design. In Wiberg, C & Wiberg, M. (eds.) Proceedings of CMID´07 - The First International Conference on Cross-Media Interaction Design, March 22-25, 2007.


Monday, 11 February 2008

[EDIT: 6th May 2008] This stage of the project is now complete; we're no longer looking for volunteers. Many thanks to those who participated.

---

For the next stage of our project we will conduct interviews with people who are blind or significantly visually impaired.

We'd like to take 30 minutes of your time for a voice chat to hear about your experiences of getting around in the real world, and any experiences you have of doing so in virtual worlds. The aim is to direct our further work developing interaction techniques for blind users in Second Life.

Briefly, the Falcon is a consumer 3D haptic device targeted at the mass gaming market. Bit-Tech recently published an in-depth review which is worth reading. The product is interesting for this project as a mobility device to complement spatial audio: think of it as a super-sensitive long cane (white stick) that could allow SL residents to reach out from the physical world and realistically feel objects in the virtual world.

Optional hardware devices are usually ignored by game developers, as the effort required to support them is not justified by the small number of users who own them; and since few games support them, few gamers are likely to buy such devices in the first place: a catch-22. However, as a reader of this blog pointed out, blind users will often spend thousands of dollars on specialist hardware such as Braille keyboards, so the $190 that the Falcon costs is a relatively small investment. Furthermore, support for the device in SL and other open-source or moddable software can be implemented by the community rather than relying on industry.

The Falcon compares favourably to other haptic devices which cost upwards of ten times the price (for example, SensAble Technologies' Phantom range runs to several thousand pounds sterling), and some other academic researchers are already investigating haptics in Second Life, and the Falcon in particular.

In Second Life the only way to navigate with a mouse is to bring up an on-screen navigation menu whose buttons you click to move the avatar. It works well enough when the avatar is flying, but otherwise you just end up using the buttons on the handle to move around. However, just in case anyone wants to work with the script, here it is.

In Linden's default client, movement is controlled using the keyboard, but in my own research I have recently been able to control walking and flying using a force-feedback joystick (Logitech WingMan Strike Force 3D). This was made possible by a free third-party tool called GlovePIE, which VanDrimmelen's team also employed. The tool works by intercepting output from the joystick and injecting the corresponding keyboard signals, so that moving the joystick left and right turns the Second Life avatar left and right, and moving the joystick forwards and backwards moves the avatar forwards and back. VanDrimmelen's team use the same technique to drive Croquet with the Novint Falcon as an input device. This approach appears to offer a very quick and easy way to prototype haptics in Second Life. VanDrimmelen continues, however,

Currently both of these drivers are "in exploration phase" with no estimated completion date. Novint's (busy!) release schedule also describes another interesting product, "Feelin' It: Blind Games™":

Novint will release a number of games that can be played entirely without sight. For example, in a bowling game, you will be able to feel the extents of the lane, feel the weight of the ball as it is thrown, and hear the pins crash down. After throwing the ball and hitting the pins, the game will bring up a touchable representation of how the ball traveled down the lane to guide the user's muscle memory for future shots, and the user will be able to feel with a 3D cursor which pins are still standing. All the information needed to play the game and become a true master, will be available without any graphics.
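Returning to the GlovePIE trick described earlier: the core of it is just turning continuous stick axes into the discrete arrow-key presses that Second Life's default client expects. Here is a minimal Python sketch of that mapping; the function names and the dead-zone value are my own illustrative choices, not GlovePIE's API.

```python
# Sketch of the GlovePIE idea: convert continuous joystick axes into the
# set of arrow keys to hold down. A dead zone stops the avatar drifting
# when the stick is near centre.

DEAD_ZONE = 0.2  # ignore small stick movements near centre (assumed value)


def axes_to_keys(x, y):
    """Map a stick position (x, y, each in -1..1) to a set of held keys.
    Pushing forward (y > 0) walks forward; left/right turns the avatar."""
    keys = set()
    if y > DEAD_ZONE:
        keys.add("up")      # walk forward
    elif y < -DEAD_ZONE:
        keys.add("down")    # walk backward
    if x > DEAD_ZONE:
        keys.add("right")   # turn right
    elif x < -DEAD_ZONE:
        keys.add("left")    # turn left
    return keys
```

In GlovePIE itself the equivalent logic is written in its own scripting language, and the returned key set would be injected as real keyboard events each frame.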

Judging from the screenshot, I would imagine that the Siena team are not using the Novint but rather a different haptic device with a stylus, perhaps one of SensAble Technologies' Phantom range, which seem popular in academic research.

"The unique design of the torch allows users to range from sighted individuals in low-light conditions to people who are both deaf and blind. The torch provides a method of alerting users to presence of potentiol hazards using non-contact measurement techniques. An subtle tactile (touch) interface conveys relevent information to the user while not interfering with other senses." [sic]

Whereas the Haptic Torch is only capable of signifying the presence of objects, the Falcon could be used to reach out and feel their shape, and this immediate physical stimulus will assist the user's construction of a mental map of the virtual space.
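For readers curious how "feeling the shape" of a virtual object works at all: the standard textbook approach in haptic rendering is a penalty (spring) model, where the device pushes back with a force proportional to how far the cursor has penetrated a surface. The sketch below shows this for a solid sphere; it's a generic illustration with an assumed stiffness constant, not the Falcon SDK's actual API.

```python
# Sketch of the standard penalty (spring) model in haptic rendering:
# when the device's cursor penetrates a surface, push back along the
# surface normal with a force proportional to penetration depth.

import math

STIFFNESS = 500.0  # N/m, assumed spring constant


def sphere_force(point, centre, radius):
    """Force vector pushing `point` out of a solid sphere, or (0, 0, 0)
    when the point is outside the surface (or exactly at the centre)."""
    dx = [p - c for p, c in zip(point, centre)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist >= radius or dist == 0.0:
        return (0.0, 0.0, 0.0)
    depth = radius - dist                # penetration depth
    scale = STIFFNESS * depth / dist     # force along the outward normal
    return tuple(scale * d for d in dx)
```

Run at the haptic loop's update rate (typically around 1 kHz), this simple model is enough to make a surface feel solid, which is exactly what a virtual cane needs.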

Of the suppliers that Novint list as selling the Falcon, Fry's looks like the best for delivery outside the USA and Canada. It should be noted, though, that in the Gamasutra podcast mentioned earlier, Novint's CEO, Tom Anderson, said that they are providing Falcons to game development studios for free, and that they have also used the Falcon as an interface for medical and dental simulators with the Harvard School of Dental Medicine. With their aggressive PR policy, perhaps they'd extend this generous offer to other academic research projects too?

Fry's: "Through DHL, we quickly ship international orders just about anywhere in the world for very reasonable rates."

SkyMall: "Many SkyMall products are available for delivery outside the United States."

CircuitCity: "Due to our manufacturer distribution agreements, we are not permitted to ship products to international addresses except APO/FPO or U.S. Territory addresses. Circuit City does not ship to Puerto Rico."

GoGamer: "GoGamer does not ship to International Destinations at this time."

JR: "At the present time, we only ship to the Continental U.S., Alaska, Hawaii, U.S. territories, Puerto Rico, and Canada. J&R proudly ships to our Armed Forces APO/FPO customers."