Gesture-based PTZ camera control

Abstract

Gesture detection is one of the most active fields of research in the computer vision community. It has gained importance because of its potential applications in indoor surveillance, object tracking, traffic surveillance and related areas. Future research in this field could lead to non-textual input devices, eliminating the need for keyboards, mice and even switchboards. Several techniques exist for detecting motion in the real world. Tracking devices such as instrumented gloves and body-mounted sensors have been used in the past, and are still used, to capture fast and subtle motions. Vision-based systems that rely on properties such as texture and colour are also in use. However, these gesture-sensing technologies involve trade-offs in accuracy, cost and comfort, and wearable devices can be cumbersome for the user. This work deals with the detection of gestures, first from a recorded video and then from a live camera feed. Image segmentation techniques, namely frame differencing and background subtraction, are first applied to capture the motion occurring between consecutive frames. Morphological operations, dilation and erosion, are then applied to filter the segmented image. Connected component analysis is performed on the resulting image, and the person making the gesture is detected. The work is then extended to detect motion in a live camera feed. Because frame differencing has observable disadvantages, such as sensitivity to noise and inaccuracy, optical flow is used there for motion detection instead. A live feed is taken from an IP PTZ camera and the optical flow technique is applied to it. Once a gesture is detected in the feed, the pan, tilt and zoom values required to focus on the person making the gesture are computed. These values are then sent to the camera, which focuses on the person in real time.
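The segmentation pipeline summarised above (frame differencing, thresholding, morphological filtering and connected component analysis) can be sketched in Python with NumPy. The frames, threshold and structuring-element size below are illustrative assumptions, not the parameters used in this work; a production system would typically use a library such as OpenCV for these operations.

```python
import numpy as np

def frame_difference(prev, curr, thresh=30):
    """Binary motion mask from the absolute difference of two grayscale frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def erode(mask, k=3):
    """Binary erosion with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(mask, pad, constant_values=0)
    out = np.ones_like(mask)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def dilate(mask, k=3):
    """Binary dilation with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(mask, pad, constant_values=0)
    out = np.zeros_like(mask)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def largest_component_bbox(mask):
    """4-connected component labelling; returns the bounding box
    (top, left, bottom, right) of the largest foreground blob, or None."""
    labels = np.zeros(mask.shape, dtype=int)
    next_label, best = 1, (0, None)
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue
        stack, pixels = [(y, x)], []
        labels[y, x] = next_label
        while stack:
            cy, cx = stack.pop()
            pixels.append((cy, cx))
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    stack.append((ny, nx))
        if len(pixels) > best[0]:
            ys, xs = zip(*pixels)
            best = (len(pixels),
                    (int(min(ys)), int(min(xs)), int(max(ys)), int(max(xs))))
        next_label += 1
    return best[1]

# Synthetic example: a bright 6x6 patch appears between two 32x32 frames.
prev = np.zeros((32, 32), dtype=np.uint8)
curr = np.zeros((32, 32), dtype=np.uint8)
curr[10:16, 12:18] = 200

mask = dilate(erode(frame_difference(prev, curr)))
print(largest_component_bbox(mask))  # → (10, 12, 15, 17)
```

Erosion followed by dilation (a morphological opening) removes isolated noise pixels from the difference mask while roughly preserving the extent of the genuine moving region.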
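Once the person's bounding box is known, the relative pan and tilt needed to centre the camera can be estimated from the pixel offset of the box centre and the camera's field of view. The field-of-view values and the zoom heuristic below are illustrative assumptions, not this work's calibration; real IP PTZ cameras accept such values through vendor-specific interfaces (e.g. ONVIF or HTTP CGI commands).

```python
def pan_tilt_zoom(bbox, frame_w, frame_h, hfov_deg=60.0, vfov_deg=34.0,
                  target_fill=0.5):
    """Relative pan/tilt (degrees) and a zoom factor that centre the
    detected person and scale them to target_fill of the frame height.

    bbox is (top, left, bottom, right) in pixels.  hfov_deg/vfov_deg are
    the camera's horizontal/vertical fields of view at the current zoom
    (assumed values here)."""
    top, left, bottom, right = bbox
    cx = (left + right) / 2.0
    cy = (top + bottom) / 2.0
    # Offset of the box centre from the image centre, as a fraction of
    # half the frame, scaled by half the field of view.
    pan = (cx - frame_w / 2.0) / (frame_w / 2.0) * (hfov_deg / 2.0)
    tilt = (cy - frame_h / 2.0) / (frame_h / 2.0) * (vfov_deg / 2.0)
    # Simple zoom heuristic: magnify until the box spans target_fill of
    # the frame height.
    box_h = max(bottom - top, 1)
    zoom = target_fill * frame_h / box_h
    return pan, tilt, zoom

# A person detected right of centre in a 640x480 frame.
pan, tilt, zoom = pan_tilt_zoom((100, 480, 300, 560), 640, 480)
print(round(pan, 1), round(tilt, 1), round(zoom, 1))  # → 18.8 -2.8 1.2
```

This linear mapping from pixel offset to angle is only an approximation of the true perspective projection, but it is adequate for iteratively re-centring the camera, since each correction shrinks the remaining offset.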