Independent Game Creator and Installation Designer

Pollock Ball

Development

Having this concept in mind, I started development by creating a timeline that would keep me on track to finish the entire assignment in three weeks. During the first week, I would build the experience and playtest it myself. During week two, I would set up a playtesting event and invite friends and family to try out the game themselves. Week three was then set aside to interpret feedback, create the demonstration video (using footage from the second week), and write this report.

Audience

I built the experience with the HTC Vive headset in mind because it is the equipment I currently have access to, but I deliberately designed the game so that, with further development, it could be scaled to other hardware platforms and reach a much broader audience. The intended audience is anyone with an interest in gaming or art and access to a virtual reality headset capable of six-degrees-of-freedom interaction (position and rotation tracking for both the headset and the controllers). Today that audience is limited to owners of higher-end virtual reality headsets, but because the game only requires a single controller, it could easily be brought to future untethered, inside-out-tracking mobile headsets, allowing it to reach a significantly larger number of people.

Design

Pollock Ball was created in Unity to be experienced in virtual reality, so I needed to make sure I was designing the game in a way that would feel natural in a virtual environment. Each element of the experience needed to support the illusion that the player was actually in the room playing the game, and that the interactions were plausible. Unusual interactions are much more obvious with a relatively small set of objects in the room (the room itself, a paddle, the ball, paint, and the player), so I also needed to make sure that each element behaved the way you would expect it to in the real world. I did this by focusing on haptic, audible, and visual feedback, and by attempting to create natural physics interactions when designing each object.

The Room

I started building the project in Unity by modeling a room of a similar size and shape to the one found in Ray Wenderlich’s “HTC Vive Tutorial Unity”5 to experiment with movement in the space. I then added point lights and disabled several shadows to make sure the game could run on less powerful hardware if needed. Every surface has a box collider and rigidbody attached to ensure that the ball interacts with it. I experimented with several versions of the room, including much higher ceilings and a cube-like shape, by scaling the surfaces before settling on a room that is much wider than it is tall (approximately 10m x 10m x 5m). At this size the room felt comfortable in VR: not so large that the player could lose track of the ball, but big enough for the ball to build up velocity on its way back to the player. That said, the play area was significantly larger than a typical tracked setup, a problem I would later solve when configuring the player object.

I also attached an audio source to the room prefab with a “splat”1 sound that plays when the ball comes in contact with a surface. In conjunction with the visuals, Unity’s built-in spatial sound does a very good job of convincing the player that the splat is occurring at the point of contact.
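A minimal sketch of this behavior might look like the following. This is an illustrative reconstruction, not the project's actual script: the "Ball" tag and the Inspector-assigned clip are assumptions, and `AudioSource.PlayClipAtPoint` is used so the splat is spatialized at the exact point of contact.

```csharp
using UnityEngine;

// Hypothetical sketch: play the "splat" clip at the point of contact
// whenever the ball touches this surface. Attach to each room surface.
public class SplatOnContact : MonoBehaviour
{
    public AudioClip splatClip;   // assigned in the Inspector (assumption)

    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Ball"))
        {
            // Spawns a temporary, fully spatialized audio source exactly
            // where the ball struck the wall, so the sound and the visual
            // splatter line up.
            AudioSource.PlayClipAtPoint(splatClip, collision.contacts[0].point);
        }
    }
}
```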

Additionally, I created an AudioManager gameObject that controls environmental audio (in the form of white noise2) as well as music3. It is positioned near the center of the room, where the player spends most of their time during the game, and serves both to engage the player and to keep the experience from being filled with an eerie silence.

The Paddle

The paddle is technically a “Beach Bat”7 that I found on the Unity Asset Store. It comes with a set of custom mesh colliders, which I disabled and replaced with a box collider and rigidbody slightly larger than the paddle itself; the larger collider feels better when reflecting the ball in the game.

To make sure the paddle is oriented properly within the environment, I attached it as a child gameObject of one of the SteamVR6 controller prefabs and rotated it by 90 degrees on the X-axis so that its position and rotation are accurately represented in game. It’s worth noting that the child object’s rigidbody needed to be set to “is Kinematic” to prevent the paddle from floating away when attached in this manner. I also disabled the display of the SteamVR controller model itself so the experience felt more natural, as if you were holding the paddle itself instead of a controller with a paddle on it.
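The setup described here was done in the editor, but the same attachment could be sketched in code roughly as follows. This is an assumption-laden illustration: the `controller` reference, the exact rotation offset, and the renderer-hiding step are stand-ins for the manual Inspector work described above.

```csharp
using UnityEngine;

// Hypothetical sketch of the paddle setup: hide the controller model,
// parent the paddle to a SteamVR controller transform, rotate it 90
// degrees on X, and mark its rigidbody kinematic so it tracks the hand
// instead of being driven by the physics engine.
public class PaddleAttach : MonoBehaviour
{
    public Transform controller;   // the SteamVR controller transform (assumption)

    void Start()
    {
        // Hide the default controller model first, before the paddle
        // becomes a child of it (otherwise the paddle's own renderer
        // would be caught by this loop as well).
        foreach (var r in controller.GetComponentsInChildren<MeshRenderer>())
            r.enabled = false;

        transform.SetParent(controller, false);
        transform.localPosition = Vector3.zero;
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f);

        // Kinematic: the paddle follows the controller exactly and never
        // "floats away" under physics forces.
        GetComponent<Rigidbody>().isKinematic = true;
    }
}
```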

Like the room, the paddle has an audio source attached that plays a “slapping” sound4 when the ball comes in contact with it. I also wrote a script that sets a flag on collision, which triggers a quick haptic pulse on the controller. With accurate tracking, audio, and haptic feedback, the paddle is represented quite well in VR.
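The collision-flag approach might be sketched like this, assuming the legacy `SteamVR_Controller` input API that shipped with the SteamVR plugin of that era; the field names and pulse duration are illustrative, not taken from the project.

```csharp
using UnityEngine;

// Hypothetical sketch: on collision, play the slap sound at the contact
// point and raise a flag; the flag is consumed on the next physics step
// to fire a short haptic pulse on the controller.
public class PaddleFeedback : MonoBehaviour
{
    public SteamVR_TrackedObject trackedObject;   // the paddle's controller
    public AudioClip slapClip;
    private bool pulseRequested;

    void OnCollisionEnter(Collision collision)
    {
        AudioSource.PlayClipAtPoint(slapClip, collision.contacts[0].point);
        pulseRequested = true;   // flag checked in FixedUpdate
    }

    void FixedUpdate()
    {
        if (pulseRequested)
        {
            var device = SteamVR_Controller.Input((int)trackedObject.index);
            device.TriggerHapticPulse(1000);   // brief pulse, in microseconds
            pulseRequested = false;
        }
    }
}
```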

The Ball

The physics interaction between the ball and the paddle was my biggest and most important challenge during development because it is integral to how the experience works. I started by settling on a ball about the size of a beach ball, which felt natural and remained visible in virtual reality even at a distance. This was necessary because I was planning on having the player stay in a central location, with walls up to five meters away in every direction; if the ball were too small, it became really difficult to keep track of.

This created a new challenge with collision detection against the controller object. Because you can swing very quickly in VR, it’s quite easy for the paddle to pass through the ball in a single physics step, before the engine can compute a proper new direction vector. Additionally, even after decreasing the ball’s mass, significantly reducing its friction, and applying a very bouncy physics material, it still managed to find its way to a corner of the room and stay there. I tried adding teleportation movement and re-enabling the second controller so the player could pick the ball up and throw it, but the problem persisted, and it just wasn’t very fun to teleport around the room playing what was essentially a glorified volleyball-serving simulator.

To solve this, I turned to two different online tutorials. Jason Weimann’s “Using Vector3.reflect to Cheat Ball Bouncing Physics in Unity”12 provided an example script that makes the ball regularly return to the player by periodically adjusting its velocity toward a chosen object in the scene. This gave the ball its own unique physics that weren’t necessarily compatible with the way I was hoping to use the paddle. FusedVR’s “Hey Batter Batter! Let’s Build Baseball Practice with the HTC Vive Tracker (Unity VR 2017 Tutorial)”11 helped me isolate the collision detection so that I could build a working script combining the two. The end result is a paddle and ball that interact more like a child’s toy with a ball tethered to a paddle by an elastic string than like a tennis ball and racquet. It’s unusual at first, and has the potential to really break the sense of plausibility in the experience, but once you learn the rules of the world it ends up feeling really satisfying.
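The "return to the player" behavior described above could be sketched as follows. This is not the project's actual script: the interval, minimum speed, and the "Paddle" tag are assumptions, and the timer reset on a paddle hit stands in for the collision isolation borrowed from the FusedVR tutorial.

```csharp
using UnityEngine;

// Hypothetical sketch: at a fixed interval, redirect the ball's velocity
// toward a target (the player) while preserving its speed, so the ball
// never settles in a corner of the room. A fresh paddle hit resets the
// timer so the hit has time to play out before the redirect kicks in.
[RequireComponent(typeof(Rigidbody))]
public class ReturnToPlayer : MonoBehaviour
{
    public Transform target;            // the player's position (assumption)
    public float redirectInterval = 2f; // seconds between redirects
    public float minSpeed = 3f;         // keep slow balls moving
    private Rigidbody body;
    private float timer;

    void Awake() { body = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        timer += Time.fixedDeltaTime;
        if (timer >= redirectInterval)
        {
            timer = 0f;
            float speed = Mathf.Max(body.velocity.magnitude, minSpeed);
            // Re-aim the current speed at the player.
            body.velocity = (target.position - transform.position).normalized * speed;
        }
    }

    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Paddle"))
            timer = 0f;   // let the paddle hit play out first
    }
}
```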

Paint

My first attempt to create paint in the room used an asset from the Asset Store called “Dynamic Decals”14. Using the example scenes as a guide, I implemented a function on the ball that would splatter several paint decals in a nearby area using a particle system. To improve performance, I disabled the rendering of the particle effects themselves, leaving just the paint splatters behind. This looked great, and gave me the option of additive or multiplicative effects (where paint could mix together or accumulate to fill empty spaces).

Unfortunately, upon testing this version at home, I realized that viewing these decals was extremely nauseating in VR: they weren’t being layered properly for stereoscopic viewing with two eyes at different positions. They appeared to be instantiated based on the camera position at the moment of creation, which is fine if you stand completely still and completely broken if you move at all. I needed to make significant changes to the way the decals were being placed.

I found a solution in another project, “Splatter VR,”13 built for a hackathon and published freely on GitHub. Their approach uses a script to instantiate a quad prefab paint object with a splatter-shaped material whenever a paintbrush throws globs of paint at a wall. They also incorporated a color-picker for the left controller that I thought was interesting, though unnecessary for what I was doing with Pollock Ball. I used their material and paint prefab, and modified their script so that the color of each paint splatter is randomized when it is instantiated as the ball hits the wall. While I couldn’t easily mix colors or splatter across multiple surfaces with this method, it felt much more natural (and far less nauseating) in VR than the multiplicative and additive features of the “Dynamic Decals”14 asset.
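A rough sketch of the modified splatter placement, under stated assumptions: this is not the Splatter VR code itself, the prefab and offset names are illustrative, and `Random.ColorHSV` stands in for whatever randomization the modified script actually used.

```csharp
using UnityEngine;

// Hypothetical sketch: when the ball hits a surface, instantiate the quad
// paint prefab at the contact point, align it with the surface normal,
// and randomize its color. Attach to the ball.
public class PaintOnHit : MonoBehaviour
{
    public GameObject splatterPrefab;   // quad with the splatter material
    public float surfaceOffset = 0.01f; // lift off the wall to avoid z-fighting

    void OnCollisionEnter(Collision collision)
    {
        ContactPoint contact = collision.contacts[0];
        // Orient the quad to lie flat against the surface (flip the sign
        // on the normal if the quad's front face points the other way).
        Quaternion rotation = Quaternion.LookRotation(-contact.normal);
        Vector3 position = contact.point + contact.normal * surfaceOffset;

        GameObject splat = Instantiate(splatterPrefab, position, rotation);
        // A new saturated hue for every splatter.
        splat.GetComponent<Renderer>().material.color =
            Random.ColorHSV(0f, 1f, 0.8f, 1f, 0.8f, 1f);
    }
}
```

Because each splatter is a real object in world space (rather than a camera-dependent decal), it reads correctly from both eye positions, which is what resolved the nausea problem.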

The Player

The SteamVR camera rig prefab tracks player position and movement through the Camera (head) object. Because that object has no preset collider, and I wanted the ball to bounce off of the player rather than get stuck inside them, I added a capsule collider and rigidbody to the prefab with a little extra room, so that the ball can get close enough to let the player know they were hit, but not close enough to disorient them. Because the collision sound plays from the player’s own position when they are hit, they are immediately aware of it.
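One wrinkle with putting a collider on the rig is that the player can walk anywhere inside the tracked space, so the capsule needs to follow the tracked head rather than sit at the rig's origin. A sketch of that idea, with all field names and sizes as assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: keep the player's capsule collider centered under
// the tracked head so the ball bounces off the player wherever they stand
// in the play space. Attach to the camera rig alongside the capsule.
[RequireComponent(typeof(CapsuleCollider))]
public class FollowHeadCollider : MonoBehaviour
{
    public Transform head;         // the SteamVR Camera (head) transform
    public float padding = 0.15f;  // extra radius so hits register early
    private CapsuleCollider capsule;

    void Awake() { capsule = GetComponent<CapsuleCollider>(); }

    void Update()
    {
        // Express the head position in the rig's local space, then span
        // the capsule from the floor up to head height.
        Vector3 local = transform.InverseTransformPoint(head.position);
        capsule.height = Mathf.Max(local.y, 1f);
        capsule.center = new Vector3(local.x, capsule.height / 2f, local.z);
        capsule.radius = 0.3f + padding;
    }
}
```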

The game is also designed with natural movement in mind, but because the room is much larger than the tracked space, I needed to come up with a solution that would make sense for the game. Initially, I considered adding teleportation to chase down the ball if it got stuck, but once I playtested with the script that makes the ball follow the player, I realized that people wouldn’t have time to move much more than a few steps at a time and would generally recenter themselves in anticipation of the ball coming back. This meant that players could move naturally without needing to move around a larger space, and has the added benefit that the game could be played in smaller spaces or as a standing-only experience and still be comfortable.

Playtesting

I held a small holiday event where my friends and family playtested the game, and I collected qualitative feedback in the form of recorded video interviews. The sample size was small (five participants, most with very little experience in virtual reality), and their feedback is biased because they know me well, but I feel I received a large amount of positive feedback and useful suggestions for further development.

Processing Feedback

Most of the feedback that I received was related to the physics of the game. At first, most people swung at the ball the way that I had anticipated, and over time learned how to reflect the ball against the wall. Some needed additional instruction, but most were able to instinctively learn how it worked.

What I didn’t anticipate was that the broken collisions would have a positive effect. When the ball and the paddle don’t connect cleanly, or meet at an odd angle, the ball drifts off slowly in an unexpected direction. This ended up building a sense of tension as players waited for the ball to come flying back at them, and because of it, most players tended to stay within a small radius and didn’t need to move around the room very much.

Most players really enjoyed the movement, and some hoped for additional options. For example, one person would have loved to see a score, while another just found the experience of filling up the room to be relaxing. In general, people seemed pleasantly surprised and excited after playing the game.

Further Development

Taking the feedback from my playtesters into account, if I were to continue development on Pollock Ball, the first thing I would do is improve the collision and physics interactions between the ball and the paddle. I’d still like to keep the slowdown (for example, I might randomly decrease the velocity within a certain proximity of a wall), but most of all I would like swinging and hitting the ball to work the way you would expect. Although the current implementation works, it is easily the biggest detractor from natural interaction in virtual reality.

Additionally, I’d love to implement a screenshot camera that would allow Pollock Ball to capture images of the painted surfaces and export them into a template for printing. Players could then keep a physical version of their painting as a souvenir if the game were presented at an exhibition or installation. This would also support porting the project to other platforms for the same purpose.

Conclusion

As supported by the feedback from the playtest, Pollock Ball is very successful in creating an engaging experience that allows players to express themselves artistically in a virtual space while demonstrating several virtual reality techniques (positional and environmental audio, natural movement with six degrees of freedom, haptic feedback, and gestural interaction), with no reported simulation sickness. With further development, more natural physics could be implemented, and the game could be distributed to other platforms and hardware to reach a wider audience.