Developing a Game Using Unity3D and the Leap Motion Controller

Author

Muhammad Salihan Zaol-Kefli

Date of Issue

2016

School

School of Computer Engineering

Abstract

Games on personal computers have traditionally been played using the keyboard. However, given the popularity of mobile games and of Microsoft’s Kinect, using gestures to play games on personal computers could improve the user experience. Hence, the objective of this Final Year Project is to develop a game with the Unity3D game engine, using the Leap Motion controller to capture hand gestures and allow players to interact with the elements in the game.
This report also gives an overview of the project schedule, the work breakdown structure and the level designs, and elaborates on how the different components were put together for the game.
After the implementation of the game was completed, a user study was conducted in which ten people were approached to play and test the game and provide feedback. Useful information and data were gathered, and it was concluded that using gestures to play the game did give a higher degree of interaction and immersion than using the keyboard alone. That said, there were complaints about some of the gestures, which shows that the right gestures are needed to provide a truly positive user experience: the gestures chosen should be simple, easy to learn and use, feel natural, and cause no discomfort. In addition, most of the participants still preferred playing games with the keyboard because they were used to it, which implies that it will take time for gesture-based play on personal computers to catch on.
Finally, this project can serve as a foundation for further exploration of user experience in games. In February 2016, Leap Motion revealed a new project called Orion, which promises greater accuracy in gesture capture, and hardware is being developed to mount the controller on virtual reality headsets such as the Oculus Rift. This project has already set up the first-person perspective and several gestures to control movement, so any continuation of this work can reuse the components that have been implemented.