VR VIRTUAL DOG

The Project

The VR dog is a project for a real client who wanted to include a dog in a game to be released on SteamVR. Our team was tasked with designing and developing both the behavior and the art of that character.

The project is heavily focused on finding optimal interactions in VR: we tried many "out of the box" approaches to player-dog interaction in order to make the dog believable and the experience enjoyable.

The Challenge

Interaction in VR is still in its infancy, and we were tasked with finding the best interactions between a player and a dog. We wanted to avoid button presses and traditional word recognition.

The goal was to make the dog feel alive rather than like a robot receiving commands. By using a mix of volume recognition and gesture tracking, it is possible to simulate real interactions depending on the context.

For example, if the dog is looking at the player and the player talks while extending one hand, the combination is recognized as pointing, even though no button was pressed. In that case, the dog goes where the player asks it to go. This kind of interaction makes the dog feel more real than the usual dog found in games.

Behavior Design

The dog acts as the player's legs. Since the player's movement area in VR is very limited, the dog is very handy for retrieving inaccessible objects in the scene. But the player will have to ask the dog for them! The dog reacts to objects in range and to the player's gestures and voice, following a list of priorities. Food, for example, is usually the dog's #1 priority: if there is food around, nothing else matters! If the player is holding food, the dog will listen carefully to every trick the player asks for. The dog can execute many tricks and also has its own "idle behavior" when the player is not paying attention to it.
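The priority list described above can be sketched as a simple selection loop. This is only an illustrative sketch, not the project's actual code: the names (`Stimulus`, `pick_behavior`), the priority ordering beyond "food first", and the sense range are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    kind: str        # e.g. "food", "gesture", "voice", "object"
    distance: float  # meters from the dog

# Lower number = higher priority; food outranks everything, as in the text.
# The ordering of the other entries is a guess for illustration.
PRIORITY = {"food": 0, "gesture": 1, "voice": 2, "object": 3}

def pick_behavior(stimuli, sense_range=5.0):
    """Return the behavior for the highest-priority stimulus in range,
    falling back to an idle behavior when nothing demands attention."""
    in_range = [s for s in stimuli if s.distance <= sense_range]
    if not in_range:
        return "idle"
    best = min(in_range, key=lambda s: PRIORITY.get(s.kind, 99))
    return {
        "food": "stare_at_food",
        "gesture": "perform_trick",
        "voice": "look_at_player",
        "object": "investigate",
    }[best.kind]
```

With this scheme, a voice stimulus nearby still loses to food anywhere in range, which matches the "if there is food around, nothing else matters" rule.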

Volume Recognition

Volume recognition is a key element of how the dog's behavior works. We stayed away from word recognition because of its delay and inaccuracy, and because it feels very robotic, while body language is much more natural. The game detects sound intensity in real time from the headset microphone and mixes that value with other factors, such as the orientation of the player's head and the position of their hands, to determine which action is being communicated to the dog. Then, depending on the dog's state, it will or will not react to the player's orders.
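The mixing of mic intensity with head and hand pose could look roughly like the sketch below. The function name, thresholds, and the exact combination rules are hypothetical; only the inputs (mic level, head orientation, hand position) come from the description above.

```python
def classify_intent(mic_level, head_forward, dog_dir, hand_extension,
                    talk_threshold=0.3, look_threshold=0.8):
    """Classify the player's intent for one frame.

    mic_level: 0..1 smoothed microphone intensity.
    head_forward, dog_dir: unit 3D vectors (player gaze, direction to dog).
    hand_extension: 0..1, how far a hand is stretched toward the scene.
    """
    # Dot product of unit vectors: close to 1 when facing the dog.
    looking_at_dog = sum(a * b for a, b in zip(head_forward, dog_dir)) > look_threshold
    talking = mic_level > talk_threshold

    if talking and hand_extension > 0.7:
        return "point"   # speech plus an extended hand reads as pointing
    if talking and looking_at_dog:
        return "call"    # speech while facing the dog reads as calling it
    return "none"        # no order for the dog this frame
```

Whether the dog actually obeys the classified intent would then depend on its current state, as described above.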

Simple Gestures

Simple gestures are another key element of the system. Mixed with volume recognition, they proved to be a very powerful mechanic during playtest sessions: players commented that the interaction feels natural because it is how they would interact with a real dog. We call them "simple" gestures because there is almost no learning curve. For example, understanding that the dog will react to you when you are holding food is trivial. As soon as the player picks up food, the dog will stare at it carefully until the player throws it away. The height of the player's hand influences the stance of the dog, and quick movements may trigger tricks like jumping or rolling over. Hold your hand high and the dog will stand on two legs. All these gestures are intuitive and easy because they require no button press. You just do what you would do with a real dog!
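The food-holding gestures above could be mapped to reactions with a few thresholds. This is a hedged sketch, not the shipped logic: the function name, the specific thresholds, and the trick chosen for a quick movement are assumptions for illustration.

```python
def dog_reaction(holding_food, hand_height, hand_speed):
    """Pick the dog's reaction to the player's food hand.

    holding_food: whether the player is currently holding food.
    hand_height: hand height above the floor, in meters.
    hand_speed: hand speed, in meters per second.
    """
    if not holding_food:
        return "ignore"          # no food, no special attention
    if hand_speed > 2.0:
        return "roll_over"       # a quick movement can trigger a trick
    if hand_height > 1.5:
        return "stand_up"        # hand held high: stand on two legs
    return "stare_at_food"       # otherwise, keep watching the food
```

Because each rule reads off a single natural cue (height or speed of the hand), the player never has to learn an abstract input mapping.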