Improving our capacity to measure the sightlines and paths of pedestrians can deepen our understanding of complex spatial environments, which in turn can inform better urban design. In practice, however, the gaze direction of pedestrians is difficult to estimate because of high noise levels in video recordings and the subtlety of gaze dynamics. In this paper, robots are used to develop a pedestrian gaze analysis system suitable for urban environments. A model street scene was set up in a laboratory, where autonomous humanoid robots approximately 55 cm in height were programmed to model the behaviour of human pedestrians. Experiments were conducted to collect data from overhead cameras, video cameras, and each robot's internal camera as it traversed the model street, where it could become distracted by "visually attractive objects". Overhead recordings were processed to obtain distracted and undistracted gaze signals, which were filtered and then analysed. Pilot experiments were performed to extract 3D gaze vectors from video sequences using a manifold learning approach. An outline of the temporal behaviour analysis technique applied to the obtained signals is also presented. This approach aims to improve the precision of pedestrian gaze analysis in real urban environments.
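The abstract mentions extracting gaze vectors with a manifold learning approach. As an illustrative sketch only (the paper's actual pipeline and features are not given here), the idea can be demonstrated on synthetic data: image features that vary smoothly with a latent gaze angle lie near a low-dimensional manifold, and an embedding method such as Isomap (used here purely as an example, via scikit-learn) can recover a coordinate that tracks that angle. All variable names and parameters below are assumptions for the demonstration.

```python
import numpy as np
from sklearn.manifold import Isomap

# Hypothetical illustration: embed synthetic "head image" feature
# vectors into a 1-D manifold coordinate that tracks gaze direction.
rng = np.random.default_rng(0)

# Synthetic data: 200 frames, each a 50-D feature vector that varies
# smoothly with a latent gaze angle, plus a small amount of noise.
angles = np.linspace(-np.pi / 2, np.pi / 2, 200)   # latent gaze angles
basis = rng.normal(size=(2, 50))                   # fixed random projection
features = np.column_stack([np.cos(angles), np.sin(angles)]) @ basis
features += 0.01 * rng.normal(size=features.shape)

# Learn a 1-D embedding; its single coordinate should vary
# monotonically with the underlying gaze angle.
embedding = Isomap(n_neighbors=10, n_components=1).fit_transform(features)

# Correlation between the recovered coordinate and the true angle
# should be close to 1 in magnitude for this smooth synthetic manifold.
corr = np.corrcoef(embedding[:, 0], angles)[0, 1]
print(abs(corr))
```

In a real system the feature vectors would come from cropped head images in the video frames, and the recovered manifold coordinates would be mapped to calibrated 3D gaze vectors; this sketch only shows the dimensionality-reduction step.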

Relation

45th Annual Conference of the Australian and New Zealand Architectural Science Association (ANZAScA 2011): From Principles to Practice in Architectural Design, Sydney, 17-19 November 2011.