Automatic Animated Sign Language Generation System

The NHK Science & Technology Research Laboratories are conducting basic research into technology for automatically generating animated sign language, with the aim of expanding sign language broadcasts. At Technology Open House 2011, they introduced an automatic sign language translation system that converts input Japanese into strings of sign language words using parallel Japanese and sign language sample texts.

"Subtitles are fine for people who understand Japanese, and who lost their hearing at some point. Meanwhile, people who are deaf from birth learn sign language first, naturally they study Japanese after that, but they find that sign language is easier to understand than subtitles, so we are conducting research in sign language."

"Currently we are able to translate at the text level, but the text that can be translated is extremely limited. The technology must convert the Japanese language that is input into a string of sign language words. We use samples that have the same content in both Japanese and sign language, compare them to what is actually input, and then replace words that differ to achieve the translation. Meanwhile, to create the sign language CG we need to make automatic transitions between words, and to a degree we are able to create that. However, there are parts that we are not able to express well yet, and we have made an interface in which a human can correct those parts."

The goal for the time being is to apply the system to news coverage when a disaster occurs. When there is a sudden newsflash, it can be difficult to obtain a human sign language interpreter, so a system like this would be invaluable. Disaster reporting also contains many sentences of a pre-determined nature, so they are relatively easy to translate.

"We asked a number of deaf people to watch the animations, and while they certainly could understand it word by word, they pointed out it still lacks fluency as sign language. In the future we also want to improve that level of fluency."