C. T. Neth, J. L. Souman, D. Engel, U. Kloos, H. H. Bülthoff, B. J. Mohler (2012). IEEE Transactions on Visualization and Computer Graphics 18(7), 1041–1052.
Abstract: Redirected walking techniques allow people to walk in a larger virtual space than the physical extents of the laboratory. We describe two experiments conducted to investigate human sensitivity to walking on a curved path and to validate a new redirected walking technique. In a psychophysical experiment, we found that sensitivity to walking on a curved path was significantly lower for slower walking speeds (radius of 10 m versus 22 m). In an applied study, we investigated the influence of a velocity-dependent dynamic gain controller and an avatar controller on the average distance that participants were able to freely walk before needing to be reoriented. The mean walked distance was significantly greater in the dynamic gain controller condition, as compared to the static controller (22 m versus 15 m). Our results demonstrate that perceptually motivated dynamic redirected walking techniques, in combination with reorientation techniques, allow for unaided exploration of a large virtual city model.
URL: http://www.kyb.tuebingen.mpg.de/fileadmin/user_upload/files/publications/2011/TVCG_Neth_Manuscript_revised.pdf
Title: Velocity-Dependent Dynamic Curvature Gain for Redirected Walking

J. L. Souman, P. Robuffo Giordano, M. Schwaiger, I. Frissen, T. Thümmel, H. Ulbrich, A. De Luca, H. H. Bülthoff, M. Ernst (2011). ACM Transactions on Applied Perception 8(4), Article 25.
Abstract: Despite many recent developments in Virtual Reality, an effective locomotion interface which allows for normal walking through large virtual environments was still lacking until recently. Here, we describe the new CyberWalk omnidirectional treadmill system, which makes it possible for users to walk endlessly in any direction, while never leaving the confines of the limited walking surface. The treadmill system improves on previous designs, both in its mechanical features and in the control system employed to keep users close to the centre of the treadmill. As a result, users are able to start walking, vary their walking speed and direction, and stop walking as they would on a normal, stationary surface. The treadmill system was validated in two experiments, in which both walking behaviour and performance in a basic spatial updating task were compared to those during normal overground walking. The results suggest that walking on the CyberWalk treadmill is very close to normal walking, especially after some initial familiarization. Moreover, we did not find a detrimental effect of treadmill walking on spatial updating. The CyberWalk system constitutes a significant step toward bringing the real world into the laboratory or workplace.
URL: http://www.kyb.tuebingen.mpg.de/
Title: CyberWalk: Enabling unconstrained omnidirectional walking through virtual environments

I. Frissen, J. L. Campos, J. L. Souman, M. O. Ernst (2011). Experimental Brain Research 212, 163–176.
Abstract: Spatial updating during self-motion typically involves the appropriate integration of both visual and non-visual cues, including vestibular and proprioceptive information. Here, we investigated how human observers combine these two non-visual cues during full-stride curvilinear walking. To obtain a continuous, real-time estimate of perceived position, observers were asked to continuously point toward a previously viewed target in the absence of vision. They did so while moving on a large circular treadmill under various movement conditions. Two conditions were designed to evaluate spatial updating when information was largely limited to either proprioceptive information (walking in place) or vestibular information (passive movement).
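The spatial-updating study above interprets its results in terms of maximum-likelihood cue combination, in which each cue is weighted by its reliability (inverse variance) and the combined estimate has reduced variance. A minimal sketch of that combination rule; the function name and all numeric values are illustrative assumptions, not data from the experiments:

```python
# Inverse-variance (maximum-likelihood) combination of two noisy
# self-motion cues. All numeric values are illustrative assumptions,
# not measurements from the study.

def mle_combine(est_a, var_a, est_b, var_b):
    """Combine two cue estimates by inverse-variance weighting.

    Returns the combined estimate and its variance; the combined
    variance is never larger than either input variance.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    combined = w_a * est_a + w_b * est_b
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return combined, combined_var

# Example: a vestibular estimate of 30 deg of self-rotation (variance 16)
# and a proprioceptive estimate of 20 deg (variance 4); the combined
# estimate is pulled toward the more reliable proprioceptive cue.
estimate, variance = mle_combine(30.0, 16.0, 20.0, 4.0)
```

Under cue conflict, the same weighted average describes the compromise between the two inputs that the study reports.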
A third condition evaluated updating when both sources of information were available (walking through space) and were either congruent or in conflict. During both the passive movement condition and walking through space, the pattern of pointing behavior demonstrated evidence of accurate egocentric updating. In contrast, when walking in place, perceived self-motion was underestimated and participants always adjusted the pointer at a constant rate, irrespective of changes in the rate at which they moved relative to the target. The results are discussed in relation to the maximum likelihood estimation model of sensory integration. They show that when the two cues were congruent, estimates were combined such that the variance of the adjustments was generally reduced. The results also suggest that when conflicts were introduced between the vestibular and proprioceptive cues, spatial updating was based on a weighted average of the two inputs.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Integration of vestibular and proprioceptive signals for spatial updating

J. L. Souman, T. C. A. Freeman, V. Eikmeier, M. O. Ernst (2010). Journal of Vision 10(11):14, 1–12.
Abstract: Perceived visual speed has been reported to be reduced during walking. This reduction has been attributed to a partial subtraction of walking speed from visual speed (F. H. Durgin & K. Gigone, 2007; F. H. Durgin, K. Gigone, & R. Scott, 2005). We tested whether observers still have access to the retinal flow before subtraction takes place. Observers performed a 2IFC visual speed discrimination task while walking on a treadmill. In one condition, walking speed was identical in the two intervals, while in a second condition walking speed differed between intervals. If observers have access to the retinal flow before subtraction, any changes in walking speed across intervals should not affect their ability to discriminate retinal flow speed.
Contrary to this direct access hypothesis, we found that observers were worse at discrimination when walking speed differed between intervals. The results therefore suggest that observers do not have access to retinal flow before subtraction. We also found that the amount of subtraction depended on the visual speed presented, suggesting that the interaction between the processing of visual input and of self-motion is more complex than previously proposed.nonotspecifiedhttp://www.kyb.tuebingen.mpg.de/published11Humans do not have direct access to retinal flow during walking150171882449023JLSoumanPRobuffo GiordanoIFrissenADLucaMOErnst2010-02-002:117114Transactions on Applied PerceptionFor us humans, walking is our most natural way of moving through the world. One of the major challenges in present research on navigation in virtual reality is to enable users to physically walk through virtual environments. Although treadmills, in principle, allow users to walk for extended periods of time through large virtual environments, existing setups largely fail to produce a truly immersive sense of navigation. Partially, this is because of inadequate control of treadmill speed as a function of walking behavior. Here, we present a new control algorithm that allows users to walk naturally on a treadmill, including starting to walk from standstill, stopping, and varying walking speed. The treadmill speed control consists of a feedback loop based on the measured user position relative to a given reference position, plus a feed-forward term based on online estimation of the user's walking velocity. The purpose of this design is to make the treadmill compensate fully for any persistent walker motion, while keeping the accelerations exerted on the user as low as possible.
We evaluated the performance of the algorithm in a behavioral experiment in which we varied its most important parameters. Participants walked at normal walking speed and then, on an auditory cue, abruptly stopped. After being brought back to the center of the treadmill by the control algorithm, they rated how smoothly the treadmill had changed its velocity in response to the change in walking speed. Ratings were generally quite high, indicating good control performance. Moreover, ratings clearly depended on the control algorithm parameters that were varied, and were especially affected by the way the treadmill reversed its direction of motion. In conclusion, controlling treadmill speed in such a way that speed changes are unobtrusive and do not disturb VR immersiveness is feasible on a normal treadmill with a straightforward control algorithm.
URL: http://www.kyb.tuebingen.mpg.de//fileadmin/user_upload/files/publications/ACM_TAP_2010_4902[0].pdf
Title: Making virtual walking real: perceptual evaluation of a new treadmill control algorithm

J. L. Souman, I. Frissen, M. N. Sreenivasa, M. O. Ernst (2009). Current Biology 19(18), 1538–1542.
Abstract: Common belief has it that people who get lost in unfamiliar terrain often end up walking in circles. Although uncorroborated by empirical data, this belief has widely permeated popular culture. Here, we tested the ability of humans to walk on a straight course through unfamiliar terrain in two different environments: a large forest area and the Sahara desert. Walking trajectories of several hours were captured via global positioning system, showing that participants repeatedly walked in circles when they could not see the sun. Conversely, when the sun was visible, participants sometimes veered from a straight course but did not walk in circles. We tested various explanations for this walking behavior by assessing the ability of people to maintain a fixed course while blindfolded. Under these conditions, participants walked in often surprisingly small circles (diameter < 20 m), though rarely in a systematic direction. These results rule out a general explanation in terms of biomechanical asymmetries or other general biases [1–6]. Instead, they suggest that veering from a straight course is the result of accumulating noise in the sensorimotor system, which, without an external directional reference to recalibrate the subjective straight ahead, may cause people to walk in circles.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Walking Straight into Circles

J. L. Souman, T. C. A. Freeman (2008). Journal of Vision 8(14):10, 1–14.
Abstract: Smooth pursuit eye movements add motion to the retinal image. To compensate, the
visual system can combine estimates of pursuit velocity and retinal motion to recover motion with respect to the head. Little attention has been paid to the temporal characteristics of this compensation process. Here, we describe how the latency difference between the eye movement signal and the retinal signal can be measured for motion perception during sinusoidal pursuit. In two experiments, observers compared the peak velocity of a motion stimulus presented in pursuit and fixation intervals. Both the pursuit target and the motion stimulus moved with a sinusoidal profile. The phase and amplitude of the motion stimulus were varied systematically in different conditions, along with the amplitude of pursuit. The latency difference between the eye movement signal and the retinal signal was measured by fitting the standard linear model and a non-linear variant to the observed velocity matches. We found that the eye movement signal lagged the retinal signal by a small amount. The non-linear model fitted the velocity matches better than the linear one, and this difference increased with pursuit amplitude. The results support previous claims that the visual system estimates eye movement velocity and retinal velocity in a non-linear fashion and that the latency difference between the two signals is small.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Motion perception during sinusoidal smooth pursuit eye movements: signal latencies and non-linearities

M. N. Sreenivasa, I. Frissen, J. L. Souman, M. O. Ernst (2008). Experimental Brain Research 191, 313–320.
Abstract: Walking along a curved path requires coordinated motor actions of the entire body. Here, we investigate the relationship between head and trunk movements during walking. Previous studies have found that the head systematically turns into an upcoming turn before the trunk does, and that it does so at a constant distance rather than at a constant time before the turn. We tested whether this anticipatory head behavior is spatially invariant for turns of different angles. Head and trunk positions and orientations were measured while participants walked around obstacles in 45°, 90°, 135°, or 180° turns. The radius of the turns was either imposed or left free. We found that the head started to turn into the direction of the turn at a constant distance before the obstacle (~1.1 m) for turn angles up to 135°. During turns, the head was consistently oriented more into the direction of the turn than the trunk. This difference increased for larger turning angles and reached its maximum later in the turn for larger turns. Walking speeds decreased monotonically with increasing turn angle. Imposing fixed turn radii only affected the point at which the trunk started to turn. Our results support the view that anticipatory head movements during turns serve to gather advance visual information about the trajectory and potential obstacles.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Walking along curved paths of different angles: the relationship between head and trunk turning

J. L. Souman, I. T. C. Hooge, A. H. Wertheim (2006). Experimental Brain Research 171, 448–458.
Abstract: We investigated the relationship between
compensation for the effects of smooth pursuit eye movements in localization and motion perception. Participants had to indicate the perceived motion direction, the starting point, and the end point of a vertically moving stimulus dot presented during horizontal smooth pursuit. The presentation duration of the stimulus was varied. From the indicated starting and end points, the motion direction was predicted and compared with the actually indicated directions. Both the directions predicted from localization and the indicated directions deviated from the physical directions, but the errors in the predicted directions were larger than those in the indicated directions. The results of a control experiment, in which the same tasks were performed during fixation, suggest that this difference reflects different transformations from a retinocentric to a head-centric frame of reference. The difference appears to be mainly due to an asymmetry in the effect of retinal image motion direction on localization during smooth pursuit.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Localization and motion perception during smooth pursuit eye movements

J. L. Souman, I. T. C. Hooge, A. H. Wertheim (2006). Journal of Computational Neuroscience 20, 61–76.
Abstract: Smooth pursuit eye movements change the retinal image velocity of objects in the visual field. In order to change from a retinocentric frame of reference into a head-centric one, the visual system has to take the eye movements into account. Studies on motion perception during smooth pursuit eye movements have measured either perceived speed or perceived direction during smooth pursuit to investigate this frame of reference transformation, but never both at the same time. We devised a new velocity matching task, in which participants matched both perceived speed and direction during fixation to those during pursuit. In Experiment 1, the velocity matches were determined for a range of stimulus directions, with the head-centric stimulus speed kept constant. In Experiment 2, the retinal stimulus speed was kept approximately constant, with the same range of stimulus directions. In both experiments, the velocity matches for all directions were shifted against the pursuit direction, suggesting an incomplete transformation of the frame of reference. The degree of compensation was approximately constant across stimulus directions. We fitted the classical linear model, the model of Turano and Massof (2001), and that of Freeman (2001) to the velocity matches. The model of Turano and Massof fitted the velocity matches best, but the differences between the model fits were quite small. Evaluation of the models and comparison with a few alternatives suggests that further specification of the potential effect of retinal image characteristics on the eye movement signal is needed.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Frame of reference transformations in motion perception during smooth pursuit eye movements

J. Souman, I. Hooge, A. Wertheim (2005). Experimental Brain Research 164, 376–386.
Abstract: Although many studies have been devoted to
motion perception during smooth pursuit eye movements, relatively little attention has been paid to the question of whether the compensation for the effects of these eye movements is the same across different stimulus directions. The few studies that have addressed this issue provide conflicting conclusions. We measured the perceived motion direction of a stimulus dot during horizontal ocular pursuit for stimulus directions spanning the entire range of 360°. The stimulus moved at either 3 or 8°/s. Constancy of the degree of compensation was assessed by fitting the classical linear model of motion perception during pursuit. According to this model, the perceived velocity is the result of adding an eye movement signal that estimates the eye velocity to the retinal signal that estimates the retinal image velocity for a given stimulus object. The perceived direction depends on the gain ratio of the two signals, which is assumed to be constant across stimulus directions. The model provided a good fit to the data, suggesting that compensation is indeed constant across stimulus direction.
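In symbols, the classical linear model described above can be written as follows (the notation is chosen here for illustration and is not taken from the paper):

```latex
% Classical linear model of motion perception during pursuit:
% the head-centric velocity estimate is the sum of a gained retinal
% signal and a gained eye-movement signal. The perceived direction is
% then determined by the gain ratio e/r, which the model assumes to be
% constant across stimulus directions.
\hat{\mathbf{v}}_{\mathrm{head}}
  = r\,\mathbf{v}_{\mathrm{ret}} + e\,\mathbf{v}_{\mathrm{eye}}
```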
Moreover, the gain ratio was lower for the higher stimulus speed, explaining differences between results in the literature.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Perceived motion direction during smooth pursuit eye movements

J. L. Souman, I. T. C. Hooge, A. H. Wertheim (2005). Vision Research 45, 845–853.
Abstract: Smooth pursuit eye movements change the retinal image motion of objects in the visual field. To enable an observer to perceive the motion of these objects veridically, the visual system has to compensate for the effects of the eye movements. The occurrence of the Filehne illusion (illusory motion of a stationary object during smooth pursuit) shows that this compensation is not always perfect. The amplitude of the illusion appears to decrease with increasing presentation duration of the stationary object. In this study we investigated whether presentation duration has the same effect when an observer views a vertically moving object during horizontal pursuit. In this case, the pursuit eye movements cause the perceived motion path to be oblique instead of vertical; this error in perceived motion direction should decrease with longer presentation durations. In Experiment 1, we found that the error in perceived motion direction indeed decreased with increasing presentation duration, especially for higher pursuit velocities. The results of Experiment 2 showed that the error in perceived motion direction did not depend on the moment during pursuit at which the stimulus was presented, suggesting that the degree of compensation for eye movements is constant throughout pursuit. Together, the results suggest that longer presentation durations cause the eye movement signal used by the visual system to increase more than the retinal signal.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Vertical object motion during horizontal ocular pursuit: compensation for eye movements increases with presentation duration

C. Neth, J. L. Souman, D. Engel, U. Kloos, H. H. Bülthoff, B. J. Mohler (2011). IEEE Virtual Reality Conference (VR 2011), Singapore, 151–158.
Abstract: The aim of Redirected Walking (RDW) is to redirect a person along their path of travel in a Virtual Environment (VE) in order to increase the virtual space that can be explored in a given tracked area. Among other techniques, the user can be redirected on a curved real-world path while visually walking straight in the VE (curvature gain). In this paper, we describe two experiments conducted to test and extend RDW techniques. In Experiment 1, we measured the effect of walking speed on the detection threshold for curvature of the walking path. In a head-mounted display (HMD) VE, we found decreased sensitivity to curvature at the slowest walking speed. When participants walked at 0.75 m/s, their detection threshold was approximately 0.1 m⁻¹ (radius of approximately 10 m). In contrast, for faster walking speeds (> 1.0 m/s), we found a significantly lower detection threshold of approximately 0.036 m⁻¹ (radius of approximately 27 m). In Experiment 2, we implemented many well-known redirection techniques in one dynamic RDW application. We integrated a large virtual city model and investigated RDW for free exploration. Furthermore, we implemented a dynamic RDW controller which made use of the results from Experiment 1 by dynamically adjusting the applied curvature gain depending on the actual walking velocity of the user. In addition, we investigated the possible role of avatars in slowing users down or making them rotate their heads while exploring. Both the dynamic curvature gain controller and the avatar controller were evaluated in Experiment 2. We measured the average distance that was walked before reaching the boundaries of the tracked area. The mean walked distance was significantly larger in the condition where the dynamic gain controller was applied.
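A velocity-dependent curvature-gain controller of the kind evaluated here can be sketched as follows. The two threshold values come from Experiment 1 above (≈0.1 m⁻¹ at 0.75 m/s, ≈0.036 m⁻¹ above 1.0 m/s); the linear interpolation between them and the function itself are illustrative assumptions, not the published controller:

```python
# Sketch of a velocity-dependent curvature gain for redirected walking.
# Threshold values (~0.1 1/m at slow speeds, ~0.036 1/m at fast speeds)
# come from the experiment described above; the linear interpolation
# between them is an illustrative assumption.

SLOW_SPEED = 0.75   # m/s: at or below this, the higher curvature is undetectable
FAST_SPEED = 1.0    # m/s: at or above this, only the lower curvature is undetectable
CURV_SLOW = 0.1     # 1/m (radius ~10 m), detection threshold at slow walking
CURV_FAST = 0.036   # 1/m (radius ~27 m), detection threshold at fast walking

def curvature_gain(walking_speed):
    """Maximum curvature (1/m) to apply at the given walking speed (m/s)."""
    if walking_speed <= SLOW_SPEED:
        return CURV_SLOW
    if walking_speed >= FAST_SPEED:
        return CURV_FAST
    # Linearly interpolate in the intermediate speed range (assumption).
    t = (walking_speed - SLOW_SPEED) / (FAST_SPEED - SLOW_SPEED)
    return CURV_SLOW + t * (CURV_FAST - CURV_SLOW)
```

In each frame, such a controller would apply at most `curvature_gain(v)` of path curvature for the user's current speed `v`, keeping the redirection below the speed-dependent detection threshold.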
This distance increased from approximately 15 m for static gains to approximately 22 m for dynamic gains, and the increase did not come at the cost of greater simulator sickness. Applying the avatar controller did not reveal an effect on walking distance or simulator sickness.
URL: http://www.kyb.tuebingen.mpg.de/fileadmin/user_upload/files/publications/2011/VR-2011-Neth.pdf
Title: Velocity-Dependent Dynamic Curvature Gain for Redirected Walking

J. Souman (2004). 2D motion perception during smooth pursuit eye movements.

L. L. Chuang, J. L. Souman (2011). Bernstein Cluster D Symposium: Multisensory Perception and Action, Tübingen, Germany.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Object speed estimation during walking does not add up

L. Chuang, H. H. Bülthoff, J. Souman (2011). 34th European Conference on Visual Perception, Toulouse, France, 129.
Abstract: Walking reduces visual speed estimates of optic flow (Souman et al., 2010, Journal of Vision 10(11):14). Simultaneously, visual background motion can influence the perceived speed of moving objects (Tynan and Sekuler, 1975, Vision Research 15, 1231–1238; Baker and Graf, 2010, Vision Research 50, 193–201). These two effects have been attributed to different subtractive processes, which may help in segregating object motion from self-motion-induced optic flow. Here, we investigate how both factors jointly contribute to the perceived visual speed of objects. Participants compared the speed of two central Gabor patches on a ground plane, presented in consecutive intervals, either while standing still or while walking on a treadmill. In half the trials, one of the Gabors was surrounded by a moving random dot pattern, the speed of which matched walking speed. Our results replicated previous findings.
A moving surround as well as walking can independently induce a subtractive effect on the perceived speed of the moving center, with the effect size increasing with center speed. However, walking does not affect visual speed estimates of the center when a visual surround is present. These results suggest that the visual input dominates the segregation of object motion from background optic flow.
URL: http://www.kyb.tuebingen.mpg.de/
Title: The center-surround effect in visual speed estimation during walking

C. T. Neth, J. L. Souman, D. Engel, U. Kloos, H. H. Bülthoff, B. J. Mohler (2010). 2010 Joint Virtual Reality Conference of EuroVR - EGVE - VEC (JVRC 2010), Stuttgart, Germany.
Abstract: We investigated whether humans' sensitivity to curved walking is affected by their walking velocity. Redirecting users of an immersive virtual environment onto a curved path is one of the techniques collectively known as 'Redirected Walking'. We conducted an experiment in which 12 participants walked specific curvatures at given speeds in VR. We found that people are significantly less sensitive to walking on a curve when walking more slowly. Moreover, since Llobera et al. ([LSRS10]) showed that proxemics also holds for avatars in virtual environments, we see potential for using avatars to support redirection algorithms. In this work, we describe three possible ways in which avatars could be used to achieve better redirection.
URL: http://www.kyb.tuebingen.mpg.de//fileadmin/user_upload/files/publications/JVRC_Manuscript_[0].pdf
Title: Velocity-dependent curvature gain and avatar use for Redirected Walking

C. Neth, J. L. Souman, H. H. Bülthoff, U. Kloos, B. J. Mohler (2010). 33rd European Conference on Visual Perception, Lausanne, Switzerland, 96.
Abstract: People are relatively insensitive to the curvature of their walking trajectory [Kallie et al., 2007, JEP:HPP, 33(1), 183-200]. This is exploited in the "Redirected Walking" technique, which is used in Virtual Reality to extend the borders of Virtual Environments (VE) beyond the size of the physical walking area [Steinicke et al., 2009, Journal of Virtual Reality and Broadcasting, 6(2009)]. One method is to slowly rotate the VE while the user is aiming to walk a straight path, inducing him/her to unknowingly walk on a curved trajectory. We tested whether the sensitivity to curvature depends on walking speed. Participants followed a virtual sphere in a VE, which moved on a straight path. During walking, the entire visual scene was rotated, creating a curved real-world trajectory (radius 20-200 m). Walking speed was 0.75, 1.0, or 1.25 m/s. Participants indicated whether their physical walking path curved to the left or right. Discrimination thresholds were estimated by fitting a psychometric function to the proportion of trials in which the trajectory was reported to curve to the left. Curvature thresholds were found to be higher for slow walking. This suggests that the effectiveness of the redirected walking technique depends on walking speed.
URL: http://www.kyb.tuebingen.mpg.de//fileadmin/user_upload/files/publications/Abstract%20ECVP10_[0].pdf
Title: The effect of walking speed on the sensitivity to curved walking in an immersive Virtual Environment

T. Meilinger, J. L. Souman, H. H. Bülthoff (2010). 52. Tagung Experimentell Arbeitender Psychologen (TeaP 2010), Saarbrücken, Germany, 77.
Abstract (translated from German): To point to distant locations in a city or a building, the impressions experienced during navigation must be integrated within a common reference frame. To investigate this process, participants walked a route through a virtual city at least six times on an omnidirectional treadmill. Once they could reproduce the route without errors several times, they were teleported to locations in the city, localized their position, and pointed to a series of locations: either in sequence from their current location to the start or the goal of the route, or from the start/goal back to their current location. They completed the former faster, which is consistent with the construction of a mental model, or a mental journey, starting from the current location. In addition, participants consistently pointed more accurately either toward the goal or toward the start, depending on the participant. This argues for an asymmetric encoding of spatial information in local, interconnected reference frames and against automatic integration into a global mental map.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Asymmetrien und die Konstruktion von Überblickswissen (Asymmetries and the construction of survey knowledge)

J. L. Souman, V. Eikmeier, M. O. Ernst, T. C. A. Freeman (2009). 10th International Multisensory Research Forum (IMRF 2009), 288.
URL: http://www.kyb.tuebingen.mpg.de/
Title: Walking changes perceived visual speed of both expanding and contracting optic flow fields

J. L. Souman, I. Frissen, M. N. Sreenivasa, M. O. Ernst (2008). 31st European Conference on Visual Perception, Utrecht, Netherlands, 41.
Abstract: Common myth has it that people who get lost in unfamiliar terrain end up walking in circles. We tested whether this is true and what role visual information plays. Participants walked for several hours under various conditions of visual information. Their task was to walk as straight as possible in the direction indicated at the beginning of the experiment. GPS was used to register their walking paths. Participants often walked in circles when blindfolded, although only a few exhibited a consistent bias in one direction. In a forest, with ample visual information at short distance but few distant landmarks, participants walked in circles under an overcast sky; in sunny weather, however, they walked perfectly straight. In the Sahara desert, finally, participants walked in circles only at night when the moon was not visible, but not when either the moon or the sun was visible. The results suggest that visual information is critical for walking straight.
Furthermore, the mere availability of optic flow is not sufficient; participants needed distant landmarks to walk straight.nonotspecifiedhttp://www.kyb.tuebingen.mpg.de/published-41Walking in circles: the role of visual information in navigation1501718824150171542254777IFrissenJLSoumanMOErnstHamburg, Germany2008-07-002669th International Multisensory Research Forum (IMRF 2008)A variety of sources of sensory information (e.g., visual, inertial and proprioceptive) are available for the estimation of walking speed. However, little is known about how they are integrated. We present a series of experiments, using a 2-IFC walking speed judgment task, investigating the relative contributions of the inertial and proprioceptive information. We used a circular treadmill equipped with a motorized handlebar, to manipulate inertial and proprioceptive inputs independently. In one experiment we directly compared walking-in-place (WIP) and walking-through-space (WTS). We found that WIP is perceived as slower than WTS. The WIP condition creates a special conflict situation because the proprioceptive cue indicates motion whereas the inertial cue indicates an absence of motion through space. In another experiment we presented a range of conflicts by combining a single proprioceptive input with different inertial inputs. We found that the inertial input is weighted more heavily when it indicates a faster walking speed than proprioception. Conversely, it receives less weight if it indicates a lower speed. This suggests that the inertial cue becomes more reliable with increasing velocity. 
Our findings show a more important role for inertial information in the perception of walking speed than has previously been suggested in the literature.nonotspecifiedhttp://www.kyb.tuebingen.mpg.de/published-266Multisensory integration of non-visual sensory information for the perceptual estimation of walking speed150171542251497JLSoumanIFrissenMOErnstNaples, FL, USA2008-06-0011468th Annual Meeting of the Vision Sciences Society (VSS 2008)Perceived visual speed has been reported to be reduced during walking compared to standing still. This so-called ‘subtraction effect’ has been attributed to an automatic subtraction of part of the walking speed from the visual speed (Durgin et al., 2005). We investigated how general this subtraction effect is, by varying both visual speed and walking speed in a series of experiments. Observers judged the visual speed of a simulated ground plane (presented through a HMD) in a 2IFC task. In one interval, they walked in place on a treadmill, in the other they stood still. In different experiments, the interval with the visual standard speed, the order of the intervals, and the walking speed were varied. In all experiments, observers consistently reported the perceived visual speed for the lowest standard speed to be lower during walking than during standing still. However, most observers also perceived the highest standard speed as faster during walking than during standing still, which is clearly incompatible with the subtraction effect. We tested the apparent interaction between visual speed and walking in another experiment by presenting the exact same visual speed in the two intervals and asking the observers again to judge which of the two appeared to be faster. As in the previous experiments, the visual speed was reported to be faster during standing for slow visual speeds; this gradually changed into the opposite for faster visual speeds. 
Taken together, the results question the generality of the subtraction effect and raise doubts regarding the hypothesized functional role of this effect.

The effect of walking on perceived visual speed depends on visual speed (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, I. Frissen, M. O. Ernst. Perceived visual speed while walking: more than subtraction. ESF-EMBO Symposium on Three Dimensional Sensory and Motor Space: Perceptual Consequences of Motor Action, Sant Feliu de Guixols, Spain, October 2007. (No abstract available; published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, I. Frissen, M. O. Ernst. 30th European Conference on Visual Perception (ECVP 2007), Arezzo, Italy, August 2007:
Perceived visual speed has been reported to be reduced during walking compared to standing
still. This effect has been attributed to an automatic subtraction of part of the walking speed
from the visual speed (Durgin et al., 2005, Journal of Experimental Psychology: Human Perception
and Performance, 31, 339-353). Here, we show that both the magnitude and the direction of this
‘reduction’ depend on visual speed. Observers compared the visual speed of a simulated ground plane
(presented through an HMD) while standing and walking (1.1 m/s). PSEs, estimated for three
standard speeds during walking (1.0, 2.0, 3.0 m/s simulated speed), increased approximately
linearly with the standard speed, with a slope > 1. For the lowest standard speed, the PSEs were
lower than the standard speed, whereas they were higher for the highest standard speed. The latter
is clearly incompatible with an automatic subtraction effect. The results suggest that, contrary to
what Durgin et al (2005) claim, the effect of walking on perceived visual speed is not independent
of the visual speed and raise questions regarding the functional role of the subtraction effect.

Perceived visual speed while walking: Adding to subtraction (published; http://www.kyb.tuebingen.mpg.de/)

I. Frissen, J. L. Souman. 10th Tübinger Wahrnehmungskonferenz (TWK 2007), Tübingen, Germany, July 2007:
Although the vestibular system clearly plays an important role in the control of locomotion,
it is not clear to what extent it is also involved in the perception of our own locomotion. We
investigated whether vestibular information is used for the perceptual estimation of one’s own
walking speed. If vestibular information is used, perceived walking speed would be expected
to be lower during walking in place on a treadmill than when walking at the same speed across
the ground, as the forward acceleration of the head during walking is largely absent. To experimentally
address this hypothesis, we used a circular treadmill setup, consisting of a large
turn table (diameter 3.5m) and a motorized handlebar. Both could be actuated independently
from each other. In this setup, walking behind the moving handlebar on the stationary treadmill
stimulates both the otoliths and the semicircular canals, whereas this vestibular stimulation is
much reduced when walking in place on the rotating treadmill. The biomechanical information
is largely equal in these two conditions. Subjects had to judge their walking speed in a 2IFC
task. In one interval, they walked around the stationary treadmill behind the moving handlebar
at one of three standard speeds (31.7, 42.3, and 52.8 deg/s at a radius of 1.28 m, corresponding
to tangential speeds of 0.71, 0.94, and 1.18 m/s, respectively). In the other interval, they
walked in place at one of nine test speeds. Their task was to indicate in which of the two intervals
they walked faster. Accelerations (20–30 deg/s²) as well as the duration of the walking
period (3–4 sec) were randomly set for individual intervals. A psychometric curve was fitted to
the speed judgments for each standard speed, from which the PSEs were estimated. The results
were in general agreement with the hypothesis. Subjects tended to underestimate their walking
speed when walking in place relative to actually walking around. This underestimation, however,
seems to be dependent on walking speed and varies considerably across participants. In
conclusion, vestibular information is used for estimating perceived walking speed.

Contribution of Vestibular Information to Perceived Walking Speed (published; http://www.kyb.tuebingen.mpg.de/)

M. Sreenivasa, I. Frissen, J. L. Souman, M. O. Ernst. 10th Tübinger Wahrnehmungskonferenz (TWK 2007), Tübingen, Germany, July 2007:
During walking, the behavior of the head and trunk is closely coupled. This becomes particularly
clear while taking a turn. Here, we investigate this coupling during two phases of turning:
before and during the turn. Before a turn, people make anticipatory orientations of the head into the
direction of the turn. Previous research suggests that this anticipation occurs at a constant
distance before the curve for different walking speeds. However, in most studies participants
only performed 90° turns. We tested whether anticipation distance is invariant across different
turn angles. As the turn progresses, the head continues to look further into the turn than the
trunk, slowly converging again towards the end of the turn. An additional question here is the dependence
of relative yaw between head and trunk on the turn angle. To answer these questions
we measured head-trunk angles across a range of different turn angles. Participants followed
predefined paths around obstacles with the radius of turn indicated by circles drawn on the
floor. Turning angles ranged from 45° to 180° in steps of 45°. The position and orientation of
both the head and trunk were measured using an optical tracking system. Two parameters were
calculated from the data: head anticipation and maximum relative yaw. Head anticipation is the
distance in space where the head starts to look into the upcoming turn. Maximum relative yaw
is the maximum difference occurring between the yaw angle of the head and the trunk during
a turn. Both head anticipation and maximum relative yaw increased with turn angle, although
maximum relative yaw leveled off after 135°. In a second experiment, participants followed
the same paths as in Experiment 1, but were not constrained in the turn radius. Results showed
that turn radius decreased with increasing turn angle. Nevertheless, we found the same pattern
of results as in Experiment 1. In conclusion, the relation between head and trunk both before
and during a turn depends on the angle of the turn one is about to make.

Head-Trunk Relation Before and During a Turn: the Effect of Turn Angle (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, I. Frissen, M. O. Ernst. 10th Tübinger Wahrnehmungskonferenz (TWK 2007), Tübingen, Germany, July 2007:
Perceived visual speed has been reported to be reduced during walking compared to standing
still. This so-called ‘subtraction effect’ has been attributed to an automatic subtraction of part
of the walking speed from the visual speed [1]. In this study, we investigated how general this
subtraction effect is, by varying visual speed, walking speed and the order of the intervals in
which observers walked or stood still. Five observers judged the visual speed of a simulated
ground plane that was presented on an HMD in a 2IFC task. In one interval, they judged the
visual speed while walking in place on a treadmill (0.6, 1.0, or 1.4 m/s), and they did the same
while standing still in the other interval. Simulated visual standard speed, presented during
walking, was 1.0, 2.0, or 3.0 m/s. All observers compared the three visual standard speeds
during the three walking speeds against a range of visual test speeds during standing still and
indicated in which of the two intervals the visual speed appeared to be higher. For three of the
observers the order of the intervals was standing—walking, while it was reversed for the other
two observers. From the speed judgments, the PSEs in the nine conditions were estimated
by fitting psychometric functions. Surprisingly, the PSEs were hardly affected by walking
speed. Visual standard speed strongly affected visual speed judgments for the observers who
first stood still and then walked. The lowest standard speed was reported to be perceived as
slower during walking than during standing still, while the opposite was true for the highest
standard speed. When observers first walked and then stood still, this effect did not occur.
Taken together, the results question the generality of the subtraction effect and raise doubts
regarding the hypothesized functional role of this effect.

Perceived Visual Speed while Walking: More than Subtraction (published; http://www.kyb.tuebingen.mpg.de/)

I. Frissen, J. L. Souman, M. O. Ernst. 7th International Multisensory Research Forum (IMRF 2006), Dublin, Ireland, June 2006:
Vestibular activity, motor command efference copies, and proprioception, among others, contribute to self-motion perception. According to Durgin et al. (2005), these sources are recalibrated when they are in conflict with the global self-motion percept. We tested this hypothesis by having participants walk blindfolded on a circular treadmill under different conditions, which varied the speed and direction of treadmill rotation independently of the participants’ walking speed. Recalibration was assessed with two tasks. Participants either stood in place and judged when the treadmill had rotated 360° (passive task), or walked 360° on a stationary treadmill (active task). Durgin’s results indicate that participants should undershoot relative to pretest performance in the active task when the treadmill had rotated in the walking direction, and that they should overshoot when it was moving against the walking direction. For the passive task the opposite pattern was predicted. However, we obtained an overshoot in both tasks, increasing with the duration of adaptation. One possible source for the difference between Durgin’s results and ours might be the availability of visual information that his participants had at the start of the pre/posttests about their location in space. In our study, disorientation might have accumulated, leading to an increasing overshoot.

Aftereffects of prolonged locomotion on a circular treadmill (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, T. C. A. Freeman. 6th Annual Meeting of the Vision Sciences Society (VSS 2006), Sarasota, FL, USA, June 2006:
During everyday viewing we rarely keep our eyes still.
Our visual system has to take these eye movements into account in order to create a veridical percept of object motion. When we make smooth pursuit eye movements, the perceived velocity of a moving object can be obtained by summing two signals, one estimating retinal image velocity and the other estimating eye velocity. Previous studies have shown that the gains of these two signals differ. Here we investigate whether they also differ in their latencies. Observers compared the peak velocity of sinusoidally moving dot patterns viewed during sinusoidal smooth pursuit eye movements and during fixation. The relative gains and phases of the two signals were estimated from the amplitude matches by fitting a simple linear model. At VSS 2005, we showed that the model described the data well for most observers, but the estimated signal gains and phases showed considerable variability. Also, the gain ratio was very low for most observers, suggesting they ignored eye-velocity information and judged instead the relative motion in the display. Here, we tested whether removing the vertical edges in the stimulus window, using a large-field cylindrical screen, promoted head-centred judgements. Using this display, observers seem more able to judge head-centred sinusoidal motion consistently during smooth eye pursuit. Relative signal gain was comparable to that previously reported in the literature. Moreover, the results suggest that retinal motion signals lag eye-movement signals by a small amount.

Phase lags and gain ratios in motion perception during smooth pursuit eye movements (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, T. C. A. Freeman. 9th Tübingen Perception Conference (TWK 2006), Tübingen, Germany, March 2006:
During everyday viewing we rarely keep our eyes still. Our visual system has to take these eye movements into account in order to create a veridical percept of object motion.
When we make smooth pursuit eye movements, the perceived velocity of a moving object can be obtained by summing two signals, one estimating retinal image velocity and the other estimating eye velocity. Previous studies have shown that the gains of these two signals differ. Here we investigate whether they also differ in their latencies. Observers compared the peak velocity of sinusoidally moving dot patterns viewed during sinusoidal smooth pursuit eye movements and during fixation. The relative gains and phases of the two signals were estimated from the amplitude matches by fitting a simple linear model. At VSS 2005, we showed that the model described the data well for most observers, but the estimated signal gains and phases showed considerable variability. Also, the gain ratio was very low for most observers, suggesting they ignored eye-velocity information and judged instead the relative motion in the display. Here, we tested whether removing the vertical edges in the stimulus window, using a large-field cylindrical screen, promoted head-centred judgements. Using this display, observers seem more able to judge head-centred sinusoidal motion consistently during smooth eye pursuit. Relative signal gain was comparable to that previously reported in the literature. Moreover, the results suggest that retinal motion signals lag eye-movement signals by a small amount.

Phase Lags and Gain Ratios in Motion Perception During Smooth Pursuit Eye Movements (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, T. C. A. Freeman. Fifth Annual Meeting of the Vision Sciences Society (VSS 2005), Sarasota, FL, USA, September 2005:
Smooth pursuit eye movements change the retinal image motion of objects in the visual field. The visual system therefore has to take the eye movements into account to produce a veridical motion percept.
According to the classical linear model of motion perception during smooth pursuit, the perceived velocity depends on the sum of a retinal motion signal, estimating the retinal image velocity for a given object, and an eye movement signal that estimates the eye velocity. Errors in motion perception during smooth pursuit, such as the Filehne illusion and the Aubert-Fleischl phenomenon, can be explained in terms of the relative size of these signals. However, little attention has been paid to the temporal relationship between the two signals. If the eye velocity is not constant, differences between the latencies of the two signals will also produce perceptual errors. We therefore tested whether the signal latencies differ and what their perceptual consequences are. Participants judged the velocity of a sinusoidally moving random dot pattern, viewed during smooth pursuit of a sinusoidally moving target. In Experiment 1, the phase relationship between the dot pattern and the pursuit target was manipulated, and in Experiment 2 we varied the motion amplitude of the dot pattern. In addition, we examined whether positional cues affected performance by including a condition containing limited-lifetime dots. The relative signal size and phase difference of the eye movement signal and the retinal motion signal were estimated by fitting the classical linear model to the data. The model described the data well for most observers.
The phase difference between the two signals turned out to be quite small, with perceptual errors mainly caused by differences in signal size.

Signal latencies in motion perception during sinusoidal smooth pursuit (published; http://www.kyb.tuebingen.mpg.de/)

J. Souman, I. Hooge, A. Wertheim (2004). How smooth pursuit eye movements affect the perceived direction and speed of moving objects. Perception. (published; http://www.kyb.tuebingen.mpg.de/)

J. L. Souman, A. H. Wertheim, I. T. C. Hooge. Third Annual Meeting of the Vision Sciences Society (VSS 2003), Sarasota, FL, USA, October 2003:
When a vertically moving dot is perceived during ocular pursuit of a horizontally moving pursuit target, it appears to move in a slanted instead of a vertical direction, which is thought to reflect incomplete compensation for the eye movement (Becklen, Wallach, & Nitzberg, 1984). We investigated the influence of three factors on this misperception: stimulus duration, pursuit velocity, and the moment during pursuit at which the stimulus was presented.
While following a horizontally moving pursuit target with their eyes, participants were presented with a vertically moving dot (the stimulus), crossing the pursuit path. The task of the participants was to indicate either the perceived motion direction, or the (horizontal) position where the stimulus appeared or disappeared. Stimulus presentation duration varied from 200 ms to 1400 ms, and the stimulus was presented halfway through the pursuit, or shortly before or after this. In a second experiment, pursuit target velocity was varied from 6°/s to 14°/s.
Decreasing the stimulus presentation duration increased the perceived slant and reduced the horizontal distance between the perceived begin and end points of the stimulus path. Perceived slant could be predicted fairly well from this horizontal distance, except for the shortest presentation durations. Increasing the pursuit velocity also caused the perceived slant to increase. The moment of stimulus presentation, however, did not have an effect on perceived slant.
An additional finding was that the whole slanted stimulus path was mislocalized in the direction of the pursuit. This mislocalization did not depend on stimulus presentation duration or pursuit velocity, but it was larger when the stimulus presentation occurred earlier in the pursuit path.
Our results show that the perceived path of a vertically moving dot presented during horizontal ocular pursuit is not only slanted, but also displaced. This might have implications for theories of motion perception during ocular pursuit.

Motion perception and localization during smooth pursuit eye movements (published; http://www.kyb.tuebingen.mpg.de/)

J. Souman, A. Wertheim. 26th European Conference on Visual Perception (ECVP 2003), Paris, France, September 2003:
During ocular pursuit of a moving target, the retinal image velocity of other objects in the visual field differs from their physical velocity. To produce veridical perception of the motion of these objects, the visual system has to compensate for the eye movements. When an object is moving collinearly with the pursuit target, compensation occurs, although incompletely. This partial compensation produces illusions like the Filehne illusion and the Aubert-Fleischl phenomenon (Wertheim, 1994, Behavioral and Brain Sciences, 17, 293-355). However, whether the visual system compensates for eye movements when an object moves noncollinearly with the pursuit target is still a matter of debate. We investigated whether compensation occurs in the noncollinear case and whether it is the same as for collinear motion. We measured the perceived motion direction of a moving dot during ocular pursuit (10 deg/s) of a horizontally moving pursuit target. The angle of the physical motion path of the dot relative to the pursuit path was varied from 0° to 360°. We found that observers made systematic errors in indicating the motion direction. The pattern of these errors was different for a dot moving at a lower (3 deg/s) than at a higher (8 deg/s) speed, with larger errors for the lower speed. The data can be explained by a model that assumes that compensation for eye movements is independent of the speed and direction of the moving dot.
Compensation is assumed to be normally distributed, with mean and standard deviation varying between observers.

Compensation for eye movements is partial in perception of both collinear and non-collinear object motion during smooth pursuit (published; http://www.kyb.tuebingen.mpg.de/)

[Truncated record stubs, author lists only: P. Robuffo Giordano, J. L. Souman, R. Mattone, A. De Luca, M. O. Ernst, H. H. Bülthoff; M. Ernst, J. L. Souman; J. L. Souman (five further entries).]
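The partial-compensation account in the ECVP 2003 abstract above amounts to a small piece of arithmetic: perceived velocity is the retinal image velocity plus a down-weighted eye-velocity signal. The sketch below is a minimal numerical illustration, not the authors' model code; the function name and the compensation gain of 0.7 are illustrative assumptions. It reproduces the qualitative finding that direction errors are larger for slower dots.

```python
import math

def perceived_direction(theta_deg, dot_speed, pursuit_speed, compensation=0.7):
    """Perceived direction (deg) of a dot moving at angle theta_deg during
    horizontal pursuit, under a linear partial-compensation model.
    The compensation gain of 0.7 is an assumed, illustrative value."""
    th = math.radians(theta_deg)
    # Retinal image velocity = physical dot velocity minus eye velocity.
    rx = dot_speed * math.cos(th) - pursuit_speed
    ry = dot_speed * math.sin(th)
    # Perceived velocity = retinal velocity + compensated eye-velocity signal.
    # With compensation < 1, part of the pursuit velocity stays uncorrected.
    return math.degrees(math.atan2(ry, rx + compensation * pursuit_speed))

# A dot moving straight up (90°) during 10 deg/s horizontal pursuit:
err_slow = abs(perceived_direction(90, 3.0, 10.0) - 90)  # slower dot
err_fast = abs(perceived_direction(90, 8.0, 10.0) - 90)  # faster dot
```

With a gain of 1.0 the perceived direction is veridical; lowering the gain tilts the perceived path against the pursuit direction, more strongly for the slower dot, matching the pattern of errors described in the abstract.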