Cognitive Sciences Stack Exchange is a question and answer site for practitioners, researchers, and students in cognitive science, psychology, neuroscience, and psychiatry.

I've edited this quite a bit to clarify what I think you meant to ask; please fix it if I've changed your meaning too much. +1 BTW! I'd love to know more about this myself...and welcome to cogsci.SE!
– Nick Stauner, Jan 28 '14 at 0:33

2 Answers

There is no true frame rate of the eyes, but there are limits. The brain uses motion blur to create an impression of continuity. Films are shot at 24 frames per second; go much lower than that and the film seems choppy, because the blur from one frame has fully decayed before the next frame arrives, so you perceive a series of discrete images. At a slightly higher rate, the blur persists on roughly the same timescale as the frame changes, and the motion looks smooth.
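The arithmetic behind this can be sketched with a toy comparison: each frame's duration is checked against an assumed visual integration ("blur") window. The ~50 ms window below is an illustrative assumption, not a measured constant.

```python
# Sketch: does a new frame arrive before the blur from the previous
# one has decayed? INTEGRATION_WINDOW_MS is an assumed ballpark value.
INTEGRATION_WINDOW_MS = 50.0  # assumed temporal blur window

def frame_period_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def looks_smooth(fps: float) -> bool:
    """Motion appears smooth when the frame period fits inside
    the assumed blur window."""
    return frame_period_ms(fps) <= INTEGRATION_WINDOW_MS

for fps in (12, 24, 60):
    print(fps, round(frame_period_ms(fps), 1), looks_smooth(fps))
```

On these assumptions, 12 fps (an ~83 ms frame period) falls outside the window and looks choppy, while 24 fps (~42 ms) and 60 fps (~17 ms) fit inside it.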

Similarly, audio doesn't really have a frame rate, but there are limits on the frequencies the auditory system can transduce into neural code: roughly 20 Hz to 20 kHz.
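That 20 kHz ceiling is why digital audio samples at the rates it does: by the Nyquist criterion, capturing a frequency requires sampling at least twice as fast. A minimal sketch, taking the upper hearing limit from the text as given:

```python
# Sketch: minimum sampling rate needed to capture the ~20 Hz - 20 kHz
# range of human hearing cited above (Nyquist criterion).
HEARING_MAX_HZ = 20_000  # upper limit from the text

def nyquist_rate(max_freq_hz: float) -> float:
    """Sampling must run at least twice the highest frequency present."""
    return 2 * max_freq_hz

print(nyquist_rate(HEARING_MAX_HZ))  # 40000
```

CD audio's 44,100 Hz sampling rate sits comfortably above this 40 kHz minimum.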

So some kind of frame rate for perception does seem plausible, and it turns out that higher-level perception rates are not fixed but depend on the modality of the percepts themselves:

Recanzone has recently demonstrated that the perceived rate at which people judge a light to be flickering on and off can also be modulated by the rate at which a concurrent stream of auditory stimuli are presented.

Even if there were no frame rate, differences in processing time between the two systems would still require a synchronization process:

Light travels through air far more rapidly than sound: 300,000,000 versus 330 metres per second. Differences in arrival time also occur for events that occur much closer to us, and yet we are rarely aware of them. Part of the reason for this is that the mechanical transduction of sound waves at the ear takes less time than that required for the chemical transduction of light at the retina. These physical and biophysical differences in the arrival time of light and sound cancel each other out when stimuli are approximately 10 metres from us, at the so-called ‘horizon of simultaneity’. Given most audiovisual events are not perceived at this ‘optimal’ distance, however, psychologists, neuroscientists and even philosophers have long puzzled over why the perception of multisensory synchrony should be such a pervasive aspect of our everyday phenomenology.
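The cancellation described in that passage can be worked through numerically. The transduction latencies below are illustrative assumptions (the passage gives only the speeds and the ~10 m result), chosen so that the eye's chemical transduction lags the ear's mechanical transduction by 30 ms:

```python
# Sketch: find the distance at which sound's slower travel time is
# cancelled by the eye's slower transduction. T_EAR and T_EYE are
# assumed values, not figures from the quoted passage.
SPEED_OF_LIGHT = 300_000_000.0  # m/s, from the text
SPEED_OF_SOUND = 330.0          # m/s, from the text
T_EAR = 0.001  # s, assumed mechanical transduction latency at the ear
T_EYE = 0.031  # s, assumed chemical transduction latency at the retina

def total_latency(distance_m: float) -> tuple[float, float]:
    """(light, sound) latency in seconds: travel time + transduction."""
    light = distance_m / SPEED_OF_LIGHT + T_EYE
    sound = distance_m / SPEED_OF_SOUND + T_EAR
    return light, sound

# Crossover distance where both signals register together
# (light's travel time is negligible at these distances):
horizon = (T_EYE - T_EAR) * SPEED_OF_SOUND
print(round(horizon, 1))  # ~9.9 m
```

With a 30 ms transduction gap, the two latencies cross over at about 9.9 m, matching the "horizon of simultaneity" of roughly 10 metres in the quote.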

No, that is not the current model of human vision. The refresh rate of our eyes varies: the upper limit is about 300 Hz, with a baseline around 60 Hz (Deering, 1998). Yet no experiment has definitively established the eye's maximum refresh rate.

The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually. The threshold of human visual perception varies with what is being measured. When looking at a lighted display, people begin to notice a brief interruption of darkness if it lasts about 16 milliseconds or longer. When given a very short, single-millisecond visual stimulus, people report a perceived duration of between 100 ms and 400 ms, due to persistence of vision in the visual cortex. This can cause stimuli falling within that window to merge into one percept: a 10 ms green flash of light immediately followed by a 10 ms red flash is perceived as a single yellow flash. Persistence of vision can also create an illusion of continuity, allowing a sequence of still images to give the impression of motion.
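The green-plus-red-equals-yellow fusion can be mimicked with a crude model: treat temporal fusion as additive averaging of the two flashes' RGB values. This is a simplification for illustration, not a model of actual retinal or cortical processing.

```python
# Sketch: two brief flashes fusing into one percept, modeled
# (as a simplification) as per-channel RGB averaging.
def fuse(color_a: tuple, color_b: tuple) -> tuple:
    """Average two RGB colors, mimicking temporal fusion."""
    return tuple((a + b) // 2 for a, b in zip(color_a, color_b))

GREEN = (0, 255, 0)
RED = (255, 0, 0)
print(fuse(GREEN, RED))  # (127, 127, 0)
```

Equal parts red and green, with no blue, is exactly the additive recipe for yellow, which is why the fused percept reads as a yellow flash.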

Motion detection in the brain is performed by direction-sensitive units in the primary visual cortex (which operates in grayscale; Smith & Wall, 2008). How this part of the brain works, like many others, remains largely a mystery. We know that direction-sensitive units (individual neurons) are mapped to specific locations in the visual field and respond to changes there (Bigun, 2006). And that is without even beginning to discuss blindsight.
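A standard textbook idealization of such direction-sensitive units is the correlation-type (Reichardt-style) detector: each receptor's signal is multiplied by a delayed copy of its neighbour's, so motion in one direction yields a positive response and motion the other way a negative one. This is a toy model for illustration, not the specific mechanism from the cited papers.

```python
# Sketch: a toy Reichardt-style direction-sensitive unit over two
# adjacent receptors sampled at discrete time steps.
def reichardt(left: list, right: list) -> float:
    """Correlate each receptor with the delayed signal of its neighbour.
    Positive output = rightward motion, negative = leftward."""
    out = 0.0
    for t in range(1, len(left)):
        out += left[t - 1] * right[t]   # left fired first -> rightward
        out -= right[t - 1] * left[t]   # right fired first -> leftward
    return out

# A bright spot moving rightward: it hits the left receptor at t=0,
# then the right receptor at t=1.
print(reichardt([1, 0, 0], [0, 1, 0]))  # 1.0 (rightward)
print(reichardt([0, 1, 0], [1, 0, 0]))  # -1.0 (leftward)
```

The delay-and-correlate structure is what gives the unit its direction selectivity: only stimuli arriving in the right spatial order at the right interval produce a strong response.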