Integration of visual input signals along the motion trajectory is widely recognized as a basic mechanism of motion detection. It is, however, not widely recognized that the same computation is potentially useful for shape and color perception of moving objects: trajectory integration can improve the signal-to-noise ratio of moving-feature extraction without introducing motion blur. Indeed, trajectory integration of shape information is indicated by several phenomena, including multiple-slit viewing (e.g., Nishida, 2004). Trajectory integration of color information is also indicated by two phenomena: motion-induced color mixing (Nishida et al., 2007) and motion-induced color segregation (Watanabe & Nishida, 2007). In motion-induced color segregation, for instance, temporal alternations of two colors on the retina are perceptually segregated more veridically when they are presented as a moving pattern rather than as a stationary alternation at the same rate. This improvement in temporal resolution can be explained by a difference in the motion trajectory along which color signals are integrated. Furthermore, we recently found that the improvement in temporal resolution is enhanced when an observer views a stationary object while making a pursuit eye movement, in comparison with when an observer views a moving object without moving the eyes (Terao et al., 2008, VSS). This finding further strengthens the link between motion-induced color segregation and subjective motion deblurring.
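The computational advantage claimed above (noise reduction without motion blur) can be illustrated with a toy simulation; this is a minimal sketch of the general idea, not the authors' model, and all parameters (object profile, speed, noise level, frame count) are arbitrary assumptions. A sharp luminance profile drifts across noisy frames. Averaging at fixed positions (stationary integration) smears the object across its path, whereas shifting each frame back along the known trajectory before averaging preserves the sharp profile while attenuating noise by roughly the square root of the number of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pos, n_frames, v = 200, 20, 3   # assumed: positions, frames, speed (px/frame)
sigma_noise = 0.5                 # assumed noise standard deviation

# Sharp "object": a narrow Gaussian luminance profile, amplitude 2
x = np.arange(n_pos)
profile = 2.0 * np.exp(-0.5 * ((x - 50) / 2.0) ** 2)

# Each frame: the object shifted by v*t, plus independent noise
frames = np.stack([np.roll(profile, v * t) + rng.normal(0, sigma_noise, n_pos)
                   for t in range(n_frames)])

# (a) Stationary integration: average at fixed positions -> motion blur,
#     the peak is smeared over v*(n_frames-1) positions
stationary = frames.mean(axis=0)

# (b) Trajectory integration: undo the known shift before averaging,
#     so the profile stays aligned and only the noise averages out
trajectory = np.stack([np.roll(frames[t], -v * t)
                       for t in range(n_frames)]).mean(axis=0)

print(f"stationary peak: {stationary.max():.2f}")
print(f"trajectory peak: {trajectory.max():.2f}")
```

Here trajectory integration recovers a peak close to the original amplitude with noise reduced about sqrt(20)-fold, while stationary integration yields a low, blurred peak, mirroring the argument that integration along the motion trajectory avoids the blur that temporal integration would otherwise cause.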