When one sensory input, hearing, is blocked altogether or reduced to some degree, a greater communicative load is placed on vision. Not surprisingly, the deaf and hearing-impaired have long relied on two visual substitutes for speech: lip reading and sign language. To make these skills easier to learn, two contestants in the Johns Hopkins University Search for Applications of Personal Computing to Aid the Handicapped have devised ways of simulating lip positions and hand signs on a display. In both cases the main intent of the software packages is to train not only the deaf and hearing-impaired but also those who want to communicate with them.