Monday, March 2

EyePhone

EyePhone Design

One question we address in this paper is how useful a cheap, ubiquitous sensor, such as the camera, is in building HPI applications. We develop eye tracking and blink detection mechanisms based on algorithms originally designed for desktop machines using USB cameras. We show the limitations of an off-the-shelf HCI technique when used to realize an HPI application on a resource-limited mobile device such as the Nokia N810. The EyePhone algorithmic design breaks down into the following pipeline phases:

1) An eye detection phase;

2) An open eye template creation phase;

3) An eye tracking phase;

4) A blink detection phase.
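The four phases can be sketched end to end. Template matching via the normalized correlation coefficient is what the Blink Detection section describes; everything else below — the darkest-window "detector", the patch size, the 0.6 threshold, and all function names — is our illustrative assumption, not the paper's implementation.

```python
import numpy as np

def ncc(patch, template):
    """Normalized correlation coefficient between two equal-size patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def detect_eye(frame, size=8):
    """Phase 1: locate the eye in a grayscale frame. Illustrative
    stand-in only: returns the top-left corner of the darkest
    size x size window, a crude proxy for a real eye detector."""
    h, w = frame.shape
    best, best_xy = None, (0, 0)
    for y in range(h - size + 1):
        for x in range(w - size + 1):
            s = frame[y:y + size, x:x + size].sum()
            if best is None or s < best:
                best, best_xy = s, (x, y)
    return best_xy

def create_template(frame, xy, size=8):
    """Phase 2: cut out the open-eye template once, at enrollment time."""
    x, y = xy
    return frame[y:y + size, x:x + size].copy()

def track_eye(frame, template):
    """Phase 3: slide the template over the frame; return the best
    position and its correlation score."""
    th, tw = template.shape
    best, best_xy = -2.0, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            score = ncc(frame[y:y + th, x:x + tw], template)
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy, best

def detect_blink(score, threshold=0.6):
    """Phase 4: baseline single-threshold test: a low correlation
    with the open-eye template suggests the eye is closed."""
    return score < threshold
```

On a synthetic frame containing a dark "eye" patch with a pupil, the pipeline recovers the enrollment position with a correlation near 1.0; the brute-force scan is only for clarity and would be replaced by an incremental search on a real device.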

Impact of Distance Between Eye and Tablet

Since in the current implementation the open eye template is created once at a fixed distance, we evaluate the eye tracking performance when the distance between the eye and the tablet is varied while using EyePhone. We carry out the measurements for the middle-center position on the display (similar results are obtained for the remaining eight positions) when the person is steady and when walking. As expected, the accuracy degrades for distances larger than 18-20 cm (the distance between the eye and the N810 we currently use during the eye template training phase). The accuracy drop becomes severe as the distance grows further (e.g., ∼45 cm). These results indicate that research is needed to design eye template training techniques that are robust against distance variations between the eyes and the phone.
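The failure mode itself suggests one such direction: as the distance changes, the eye's apparent size changes, so a template enrolled at one distance no longer fits. The sketch below illustrates multi-scale template matching — our illustration, not part of the current EyePhone implementation — which rescales the enrollment template and keeps the best-scoring scale. The scale set and the nearest-neighbor resizer are arbitrary choices.

```python
import numpy as np

def ncc(patch, template):
    """Normalized correlation coefficient between equal-size patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def resize_nn(img, scale):
    """Nearest-neighbor resize; a stand-in for a proper image scaler."""
    h, w = img.shape
    nh, nw = max(1, int(round(h * scale))), max(1, int(round(w * scale)))
    ys = np.minimum((np.arange(nh) / scale).astype(int), h - 1)
    xs = np.minimum((np.arange(nw) / scale).astype(int), w - 1)
    return img[np.ix_(ys, xs)]

def match(frame, template):
    """Best (x, y) position and correlation score for one template."""
    th, tw = template.shape
    best, best_xy = -2.0, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            score = ncc(frame[y:y + th, x:x + tw], template)
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy, best

def multiscale_track(frame, template, scales=(0.75, 1.0, 1.25, 1.5)):
    """Rescale the enrollment template to emulate different
    eye-to-phone distances and keep the best-scoring scale."""
    best = (-2.0, (0, 0), 1.0)  # (score, position, scale)
    for s in scales:
        t = resize_nn(template, s)
        if t.shape[0] > frame.shape[0] or t.shape[1] > frame.shape[1]:
            continue
        pos, score = match(frame, t)
        if score > best[0]:
            best = (score, pos, s)
    return best
```

If the eye appears 1.5x larger than at enrollment, the scaled template recovers a near-perfect correlation where the fixed-scale template would score poorly; the cost is one full search per scale, which matters on a resource-limited device.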

Blink Detection

To detect blinks we apply a thresholding technique to the normalized correlation coefficient returned by the template matching function, as suggested in prior work. However, our algorithm differs from the one proposed there, where the authors introduce a single threshold T: the eye is deemed to be open if the correlation score is greater than T, and closed otherwise. In the EyePhone system, we have two situations to deal with: the quality of the phone's camera is not the same as that of a good USB camera, and the phone's camera is generally closer to the person's face than a desktop USB camera would be. Because of this latter situation the camera can pick up iris movements, i.e., movements of the interior of the eye due to eyeball rotation.
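One plausible way to handle this — a sketch of ours, not the paper's exact rule or threshold values — is a two-level scheme: scores in an intermediate band are attributed to iris movement (eye still open), and only clearly low scores count as a closed eye, so eyeball rotation near the camera is not mistaken for a blink as it would be under a single threshold.

```python
def classify_eye(score, t_closed=0.4, t_open=0.75):
    """Hypothetical two-threshold rule: above t_open the eye is
    clearly open; below t_closed it is closed (a blink); scores in
    between are treated as iris movement, i.e. still open.
    Threshold values are illustrative assumptions."""
    if score >= t_open:
        return "open"
    if score <= t_closed:
        return "closed"
    return "open-iris-moved"

def count_blinks(scores, t_closed=0.4, t_open=0.75):
    """Count open-to-closed transitions in a stream of per-frame
    correlation scores."""
    blinks, prev_closed = 0, False
    for s in scores:
        closed = classify_eye(s, t_closed, t_open) == "closed"
        if closed and not prev_closed:
            blinks += 1
        prev_closed = closed
    return blinks
```

With a single threshold at 0.75, the mid-band score produced by an iris movement would register as a spurious blink; the two-level rule absorbs it.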

EyePhone Overview

As smartphones evolve, researchers are studying new techniques to ease human-mobile interaction. We propose EyePhone, a novel “hand-free” interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., a wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810 which is capable of tracking the position of the eye on the display and mapping these positions to an application that is activated by a wink. At no time does the user have to physically touch the phone display.

Artificial Light Exposure for a Stationary Subject

In this experiment, the person is again not moving but is in an artificially lit environment (i.e., a room with very low daylight penetration from the windows). We want to verify whether different lighting conditions impact the system's performance. The results, shown in Table 1, are comparable to the daylight scenario in a number of cases. However, the accuracy drops: given the poorer lighting conditions, the eye tracking algorithm fails to locate the eyes more frequently.

Daylight Exposure for a Person Walking

We carried out an experiment where a person walks outdoors in a bright environment to quantify the impact of the phone's natural movement, that is, shaking of the phone in the hand induced by the person's gait. We anticipate a drop in the accuracy of the eye tracking algorithm because of the phone movement. This is confirmed by the results shown in Table 1, column 4. Further research is required to make the eye tracking algorithm more robust when a person uses the system on the move.

Conclusion

In this paper, we have focused on developing an HPI technology using solely one of the phone's growing number of onboard sensors, i.e., the front-facing camera. We presented the implementation and evaluation of the EyePhone prototype. EyePhone relies on eye tracking and blink detection to drive a mobile phone user interface and activate different applications or functions on the phone. Although preliminary, our results indicate that EyePhone is a promising approach to driving mobile applications in a hand-free manner.