The voice of him that crieth in the wilderness. --Old Testament, Isaiah, XL, 3

Sound signal. 1. An incoming sign received through the ears, causing the brain to hear. 2. An outgoing sign produced by the vibration of physical objects (e.g., drum heads, reeds, and strings) or body parts (e.g., the hands in clapping, and the larynx in speaking).

Usage I: Like touch cues, auditory cues are psychologically "real" (i.e., tangible) to human beings. Because hearing evolved as a specialized form of touch, sounds share some properties of tactile signals. (N.B.: The telephone company's commercial jingle, "Reach out and touch someone," carries more than a figurative ring of truth.)

Usage II: Auditory cues may be used both a. linguistically (in speech) and b. emotionally (to transmit information about attitudes, feelings, and moods; see TONE OF VOICE).

Courtship. In the speaking phase of courtship, auditory cues play a tactile role as they pave the way toward touching itself (see LOVE SIGNALS III).

Anatomy. Auditory cues are received, as vibrations, by specialized hair cells in the inner ear's cochlea. There, the vibrations are transduced into electrical signals carried by the auditory nerve, which links to auditory modules of the midbrain (i.e., the inferior colliculi) and the forebrain (e.g., the primary auditory cortex).

Evolution I. 1. "The visceral skeleton (splanchnocranium) of vertebrates consists of a series of cartilages or bones arising in the embryonic visceral (pharyngeal) arches" (Kent 1969:155). 2. "In lung-breathing tetrapods the visceral skeleton has been modified for transmission of sound (malleus, incus, and stapes), for attachment of the muscles of the modified tongue, and for support of the larynx (cricoid, thyroid, and arytenoid cartilages)" (Kent 1969:162).

Evolution II. "When the first amphibia left the Silurian seas two or three hundred million years ago, with their heads resting on the ground, they relied entirely on bone conduction of vibration for hearing. The vibrations in the earth were transmitted from the bones of their lower jaws to the bone surrounding the inner ear. In order to hear, they probably kept their lower jaws touching the ground" (Nathan 1988:34).

Right brain, left brain I. Regarding auditory signals, the right-brain hemisphere is superior to the left when dealing with music, metaphorical and figurative speech, sequences of verbalized events, verbal stress and intonation patterns, and human non-speech sounds. The left-brain hemisphere is superior in processing spoken words, numbers, and nonsense syllables. (See HUMAN BRAIN, Right brain, left brain.)

Right brain, left brain II. As reported by Reuters Health (July 4, 2001), "If you want to tell someone you love them you should tell them through their left ear, research suggests. People are more likely to remember emotional words, such as 'love,' if they are spoken into their left ear, according to a study by psychologists at Sam Houston State University in Huntsville, Texas." Words heard through the right ear are more likely to be forgotten, according to Dr. Teow-Chong Sim and his colleagues, who presented the study at the European Congress of Psychology in London. Recall accuracy for emotional words measured 64.43% through the left ear and 58.15% through the right.

Neuro-notes I. The amphibian brain's inferior colliculi receive auditory cues from the lateral lemniscus and control such auditory reflexes as flinching in response to, e.g., a karate master's yell (see STARTLE REFLEX). Postural reflexes to loud sounds are triggered by the inferior and superior colliculi, acting through brain-stem and cervical-cord interneurons on anterior-horn motor neurons linked to the spinal nerves that control muscle spindles.

Neuro-notes II. As in the visual neocortex, modules of auditory neocortex in the temporal lobe have specialized functions, e.g., to decode information about the frequency, intensity, and timing of sounds.
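The three sound attributes named above can be illustrated with a loose computational analogy (not a model of the auditory cortex): given a sampled waveform, recover its frequency, intensity, and onset timing. The sketch below assumes a hypothetical test signal, a 440 Hz tone that begins a quarter-second into a one-second recording.

```python
import math

fs = 8000                                   # sample rate, Hz (an assumed value)
onset_n = fs // 4                           # the tone begins at sample 2000
samples = [0.0] * onset_n + [
    math.sin(2 * math.pi * 440 * n / fs + 0.1) for n in range(onset_n, fs)
]

# Frequency: compare the signal's energy at a few candidate pitches
# (a one-bin discrete Fourier transform per candidate).
def bin_power(x, f):
    re = sum(v * math.cos(2 * math.pi * f * n / fs) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * f * n / fs) for n, v in enumerate(x))
    return re * re + im * im

dominant_hz = max([220, 440, 880], key=lambda f: bin_power(samples, f))

# Intensity: root-mean-square amplitude of the sounding portion.
sounding = samples[onset_n:]
rms = math.sqrt(sum(v * v for v in sounding) / len(sounding))

# Timing: first sample whose amplitude clears a small threshold.
onset_s = next(n for n, v in enumerate(samples) if abs(v) > 0.01) / fs

print(dominant_hz, round(rms, 2), onset_s)  # 440 0.71 0.25
```

Each measurement is independent of the others, echoing the entry's point that separate specialized modules can decode frequency, intensity, and timing from the same incoming signal.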