Test Report

How to improve the limited individual immersion of VR experiences for an interactive art project?

How to create shared awareness between the HMD wearer and other observers in a VR experience by merging existing and emerging technologies?

How to use speculative design as a tool to explore mobile-rendered mixed reality HMDs for the future?

How to enable active communication and interaction between the HMD wearer and other observers to overcome trust issues?

How to use game design methods to set the goals and constraints in shared VR experiences?

How to discuss the relationship between humans and animals by showing different perspectives in a shared VR experience?

The purpose of this user testing is to gather useful feedback on my main research question about shared awareness between the HMD wearer and other audiences, so that I can further improve the experience. On the more technical side, I want to test the wearability of my headset, the usability of the AR app, and the functionality of the IoT communication. Beyond that, I also want to test the artistic concept behind the relationship between, and the different perspectives of, humans and animals.

Prototype:

HMD wearer: A DIY VR headset with LEDs, servos and felt markers. An Android device running a VR scene with the designed rabbit’s vision is placed inside the headset.

Observer: An iOS device running an AR app that can control the components on the headset.

User flow:

The HMD wearer puts on the headset

The observer launches the AR app

The observer touches the ON button in the AR app

The LEDs on the headset turn on and the servos move

The observer touches the OFF button in the AR app

The LEDs on the headset turn off
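The flow above amounts to a simple command protocol between the AR app and the headset. The sketch below simulates the headset-side handling of those ON/OFF commands; the transport (Wi-Fi, Bluetooth, MQTT, etc.) is not specified in this report, so the class and method names here are illustrative assumptions, not the actual implementation.

```python
class HeadsetController:
    """Simulated controller for the LEDs and ear servos on the DIY headset."""

    def __init__(self):
        self.led_on = False
        self.servo_angle = 0  # ear servo position in degrees (assumed range)

    def handle_command(self, command: str) -> None:
        """React to a command sent from the observer's AR app."""
        if command == "ON":
            self.led_on = True        # LEDs on the headset turn on
            self.servo_angle = 45     # and the ear servos move
        elif command == "OFF":
            self.led_on = False       # LEDs turn off
        else:
            raise ValueError(f"Unknown command: {command}")


# Usage: the observer's ON/OFF taps in the AR app map to commands.
headset = HeadsetController()
headset.handle_command("ON")
print(headset.led_on)   # True
headset.handle_command("OFF")
print(headset.led_on)   # False
```

Keeping the command set this small is what makes the interaction legible to the observer: each tap has exactly one visible effect on the headset.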

Types of my user testers:

Self-testing

My cohorts

Professors

People I don’t know

Before the formal testing on Tuesday, I tested the prototype on myself and a friend, each of us acting as both HMD wearer and observer. I was aiming to make the VR scene move like a rabbit, but both of us felt very sick when the VR camera bounced and moved forward. As a consequence, I decided to remove the ability to move in VR for the user testing. On Tuesday, our class ran a user testing session with my cohorts and professors. After that, I also asked someone who had no idea about my thesis to test the prototype.

Methods of user-testing:

One-to-one testing

Interview

Open discussion

Test with strangers

Test with cohorts

In-session observations

The HMD wearer made large head movements once they put on the headset to look around in VR. The observer holding the iOS device had to chase the movement of the headset to trigger the AR control panel.

The HMD wearer didn’t feel any discomfort when wearing the headset, and their body movement was quite natural.

The HMD wearer tried to walk around and wondered what interactions were available in VR.

The observer with the iOS device had some difficulty touching the buttons in the AR app.

The sound in the VR scene was not loud enough.

Some people had difficulty adjusting their eye focus in VR; telling them to look at the trees far away helped a lot.

Questions and interviews

To HMD wearer:

How do you feel being a rabbit in VR?

How do you find the VR scene?

How do you feel when the ears are moving?

Do you feel any discomfort when wearing the headset?

Is the headset heavy for you?

To AR App observer:

Do you find the app difficult to use?

How do you feel when you can interact with the HMD wearer?

Feedback and Answers:

The headset is not heavy at all; it’s quite comfortable to wear.

I would prefer more instructions from you.

The menu in the AR app is too small.

I wonder why there is a campfire on the headset.

The sound is too low; I can’t hear anything.

I wonder if there is any interaction I can do in VR.

If I were a rabbit, I would prefer to see my pink nose, furry feet and paws when I look down.

Wow, I feel so small, like a rabbit.

I prefer the small movements of the ears; the big movements feel too robotic.

I can see that some of the tree shadows are pink.

I wonder if a rabbit can turn its neck at such a big angle.

I found the graphics in VR a bit rough.

It feels a bit blurry in VR.

Reflections and Revision ideas

Based on the feedback I received from the testers, I first want to write a background story that can serve as instructions for the participants while also immersing them in a cyberculture-futurist setup and introducing the idea of human-animal perspectives.

Secondly, I would like to improve both the AR and VR apps. For VR, I want to add a lab scene outside the forest to support my background story, and also improve the graphics quality. For AR, I want to replace the small buttons with a 3D rabbit model, so that instead of touching buttons, users can touch specific parts of the rabbit model to interact with the corresponding parts on the headset.
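The revised AR interaction boils down to a lookup from the touched part of the 3D rabbit model to the matching component on the headset. The sketch below shows that mapping in isolation; the part names and component names are illustrative assumptions, since the report doesn’t list the actual parts, and the real app would resolve the touched part via the AR framework’s hit testing.

```python
# Assumed mapping from touchable rabbit-model parts to headset components.
PART_TO_COMPONENT = {
    "left_ear": "left_ear_servo",
    "right_ear": "right_ear_servo",
    "eyes": "led_eyes",
}


def on_part_touched(part_name: str) -> str:
    """Return the headset component to actuate for a touched model part.

    Unknown parts are ignored rather than raising, since stray taps on the
    model are expected during handheld AR use.
    """
    return PART_TO_COMPONENT.get(part_name, "no_action")


# Usage: a tap resolved to the left ear drives the left ear servo.
print(on_part_touched("left_ear"))  # left_ear_servo
print(on_part_touched("tail"))      # no_action
```

Compared with small on-screen buttons, this makes the touch targets as large as the rabbit’s body parts themselves, which should also address the observers’ difficulty hitting the buttons noted above.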

Adding background story:

Today is 26th July 2038. My name is Yiyi Shao, and I’m the chief scientist at AniBot Lab. Welcome, welcome! It’s an exciting day: we are unveiling our new project, RabBot, to the public. We implanted two robotic ears and eyes into our test subject. It’s still the first stage of our experiment, but we have seen remarkable success. Now may I invite our guest to wake up RabBot 1.0.

Overall, I found the user testing very useful for my thesis. It helped me find a way to combine my theoretical framework of cybernetics and speculative design with my research question about shared awareness between the HMD wearer and other observers. It also helped me set the artistic style that I intend to incorporate into my design. Beyond that, I got positive results from the RTD, UCD and agile methodologies I want to use for my thesis; the design and the research contribute to each other in examining my research questions. From the user testing feedback, I understand more about how the HMD wearer and other observers expect to interact and communicate with each other. All of these results will feed into the next iteration of my prototypes.