Posted
by
Unknown Lamer
on Wednesday July 11, 2012 @12:14PM
from the babel-gloves dept.

Zothecula writes about some pretty cool sensor gloves. From the article: "Since beginning in 2003, the Microsoft Imagine Cup has tasked students the world over with developing technology aimed at solving real-world problems. In this, its 10th year, students were asked to build their project around a specific Millennium Development Goal ... The winners have just been announced ... [and winning] first place (and US$25,000) in the Software Design category was the Ukraine's quadSquad with their EnableTalk gloves that translate sign language into speech in real time."

Wrong translation direction: this goes from signs to speech, so a deaf person doesn't have to carry a text-to-speech device or a notepad and pen, or learn to speak (yes, deaf people can learn to speak, like one of my friends did; it confuses the hell out of people who assume that being able to speak means being able to hear).

Another thing: one of my kids' former school teachers worked her way through school going in the opposite direction, translating speech to signs. The general impression I got was that the pay was much closer to fry-cook level than the $20 claimed above. You can get $20 if you have deep technical knowledge and translate tech docs from English to Chinese, or if you have a security clearance and know Arabic or other Middle Eastern languages, but...

Well, that all depends on the size of the "ship," now doesn't it? Methinks thou art giving AC a wee bit too much credit regarding the size of their dinghy.

ps - My wife is an interpreter for the deaf

That's pretty awesome man, no sarc. FWIW, one of my closest cousins is deaf, so I've learned quite a bit of sign language by proxy, although most of what he has taught me cannot be repeated in polite conversation...

Wrong translation direction: this goes from signs to speech, so a deaf person doesn't have to carry a text-to-speech device or a notepad and pen, or learn to speak

If you read TFA, it was designed for people "with hearing and speech disabilities".

Learning to speak is a big hurdle for many deaf people, but it is an insurmountable hurdle for those who are mute, even if they can hear.

Anyway. If you're going to wear a computerized glove that can speak for you, it seems that a chorded keyboard would be a much better choice. Faster, more accurate, and more expressive than an ASL translator.
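The chorded-keyboard idea can be illustrated with a toy sketch. A chord is a set of keys pressed simultaneously that maps to one character; the chord assignments below are invented for illustration and do not reflect any real chording layout:

```python
# Toy chorded-keyboard decoder: each "chord" is a set of simultaneously
# pressed keys, mapped to a single character. The assignments here are
# made up; real chording devices use their own layouts.

CHORD_MAP = {
    frozenset({"thumb"}): "a",
    frozenset({"index"}): "e",
    frozenset({"thumb", "index"}): "t",
    frozenset({"index", "middle"}): "h",
    frozenset({"middle"}): " ",
}

def decode(chords):
    """Turn a sequence of chords (sets of pressed keys) into text."""
    return "".join(CHORD_MAP.get(frozenset(c), "?") for c in chords)

print(decode([{"thumb", "index"}, {"index", "middle"}, {"index"}]))  # the
```

With one chord per character rather than one handshape per letter, a practiced user can reach typing-like speeds, which is the basis of the "faster and more accurate" claim.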

That is not something Ayn Rand was the first to state. It's even older than basic capitalism. For example, the ancient Greeks and Romans knew that while slave labour was bad for the slave, it was essential for their society, and thus a net gain for it.

But as you can quite clearly see, whether a net gain for society is morally right or wrong depends on what kind of society you are talking about.

And then of course, the broken-window fallacy is also not very far away. Just because you break a window and give a glazier work, that doesn't make society any richer.

Well, considering deaf people are the large majority of those who need this technology, and a good portion of them are particularly good lip readers (especially ones in the habit of being around non-deaf people), telling you what they want is a tenfold larger challenge for them than understanding your response.

"Although the software was developed under Windows Phone 7, the team was forced to turn to the older Windows Mobile platform for their entry because Windows Phone 7 doesn’t provide developers access to the Bluetooth stack, which is how the gloves communicate wirelessly with a mobile device running the translation software."
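Since WP7 blocked Bluetooth access, the gloves presumably stream sensor readings over a serial-style Bluetooth link to the phone. A minimal sketch of the receiving side's parsing step, with an entirely hypothetical frame format (the actual EnableTalk protocol is not documented here):

```python
# Hypothetical glove frame: "FLEX:<f1>,<f2>,<f3>,<f4>,<f5>" with one
# bend reading (0-1023) per finger. This only illustrates parsing a
# line received over the wireless link; the real format is unknown.

def parse_frame(line):
    """Parse one sensor frame into a list of five finger readings."""
    if not line.startswith("FLEX:"):
        raise ValueError("unrecognized frame: %r" % line)
    values = [int(v) for v in line[len("FLEX:"):].split(",")]
    if len(values) != 5:
        raise ValueError("expected 5 finger readings")
    return values

print(parse_frame("FLEX:812,40,55,47,790"))  # [812, 40, 55, 47, 790]
```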

After watching the video, it seems that what they've done is create gloves which recognize the various fingerspelling signs. If somebody wants to sign "I need to withdraw money" (like, at a bank), what this allows them to do is to make the sign for "I", then "N, E, E, D", then "T,O", and so forth. Then the gloves feed that output into a TTS system. This works (because ASL users and English speakers share a writing system), but is horribly inefficient, and would be equivalent to a translation module that makes you speak every letter of the written words individually before putting the words into Spanish.
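The letter-by-letter pipeline described above can be sketched roughly as follows. The recognizer and TTS engine are stand-in stubs, and the "SPACE" token is illustrative, not EnableTalk's actual output:

```python
# Sketch of the fingerspelling-to-speech pipeline: the gloves emit one
# recognized letter at a time (or a word-break token), the software
# buffers letters into words, and the finished text goes to TTS.

def assemble(tokens):
    """Buffer recognized fingerspelling tokens into a sentence.

    Each token is a single letter, or "SPACE" to end the current word
    (token names are illustrative).
    """
    words, current = [], []
    for t in tokens:
        if t == "SPACE":
            if current:
                words.append("".join(current))
                current = []
        else:
            current.append(t)
    if current:
        words.append("".join(current))
    return " ".join(words)

def speak(text):
    # Stand-in for a text-to-speech engine call.
    print("TTS:", text)

tokens = list("I") + ["SPACE"] + list("NEED") + ["SPACE"] + list("TO")
speak(assemble(tokens))  # TTS: I NEED TO
```

Note that every word must be spelled out sign by sign before anything is spoken, which is exactly the inefficiency described above.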

This is fundamentally different from "translating sign language", where the gloves would recognize the (much more complex and spatially oriented) sign for "I", for "need", for "withdraw" and for "money", and then translate that into "I need to withdraw money" and speak it aloud. Add in the fact that ASL syntax is fundamentally different from English syntax, and it's quite a tall order. Interpreters need not fear.

This is cool, nobody's denying that, and for some jobs, this might be great, but at the moment, I don't see it working much faster than taking out the requisite smartphone and writing down what you're trying to get across.

I came here to say what you said. There's quite a ways to go on this. Sign language is conceptual: they sign the concepts, not the literal words. To complete it, they need to add a display that will do speech-to-text for the other person. At that point, the interpreters might start to worry.

Mod parent up! Finger Spelling is *not* Sign Language. If all this does is translate finger spelling into synthesized speech, the same thing could be done much faster and cheaper by just typing the words on a standard smartphone.

This is not even cool. It is just plain wrong in so many ways. All of the money and hype spent making and marketing this device would reap 10X as much benefit if the same money were spent educating people about the real nature of deafness and sign language. The developers of this waste of time could start by taking a class about deafness themselves.

The fact that Slashdot perpetuates the inaccurate headline equating finger spelling with sign language just demonstrates how ignorant we all are.

And the next question is... which finger spelling language are they translating? American? Irish? British? Australian? International (a mish-mash of English finger-spelling systems)... every language uses a different set.

They are not translating anything. They are transcribing within the same language. In the demo clip I saw, they are transcribing from English fingerspelling to English speech. There is no translation involved. You are right, even within a language, there can be regional variations. American fingerspelling differs from British. But they are transcribing, not translating.

After watching the video, it seems that what they've done is create gloves which recognize the various fingerspelling signs. [...] I don't see it working much faster than taking out the requisite smartphone and writing down what you're trying to get across.

I work with a deaf person, and believe me, they can sign incredibly fast. I reckon he could sign something simple like "do you want to get lunch" fast enough with finger spelling that it'd come across at completely normal speed to me. Sure, adding memory for all the words of various languages would be neat (very hard, though), but for now, it's quite useful. Keep in mind that they don't always carry paper/phones with them, so sometimes the only option is to sign.

You are right in every aspect, but you also have to factor in that this was merely a student project. It serves extraordinarily well as a proof of concept. Now someone (perhaps MS?) needs to focus on it and improve it by making it able to translate actual sign language gestures. If fingerspelling is possible, the jump to sign language should not be that hard.

I just want to know the use case. What if it were perfect and could precisely translate sign language into spoken or printed English - what advantage would it hold over a keyboard or smartphone or even a pen and paper? Plus, many (most?) deaf people can speak.

Maybe you're mute and can't speak? Sign language (real sign language) is much faster than typing or writing, from what I understand. Wear the gloves, leave your phone in your pocket, wear a small speaker around your neck. You look fairly normal standing at the bank, instead of like a freak waving a clipboard.

Keep in mind that to do sign language, you don't need to be able to see your hands, unlike with typing. You don't have to fish a device out of your pocket every time you want to talk...

You are right I suppose - if they could get it to do more than just simple letter signs (which are slower than typing). And if they could get it to translate sign to English. And if you could still use your hands while wearing the gloves.

But I gotta tell ya, when stuff like this is happening [msn.com], the uses for something that depends on exaggerated hand motions seem to diminish. :)

Maybe you're mute and can't speak? Sign language (real sign language) is much faster than typing or writing, from what I understand.

If true, this is the really interesting part. If signs could be interpreted like typing, but faster, the gloves would replace keyboards in a way that voice-to-text can never do, even if done perfectly.

Presumably, but it's not. These gloves transcribe finger spelling. Finger spelling is not a separate language, it does not require translation, and no deaf people use it as their primary form of communication. Any deaf person who knows how to type can easily type faster than they can finger spell.

'Real' sign languages (like ASL) are much harder to translate because they are somewhat non-linear. A single gesture can describe several things at once: size, direction, emotional state, etc. There's no way you can translate it without fully understanding the context of the speech. And we all know how good computers are at such tasks...

Then I guess it doesn't translate sign language. Finger-spelling is just an alphabet. I don't know about other sign languages, but ASL is a full language with its own grammar and conventions, and it would take a lot more than a glove to interpret it. Positions of hands with respect to the body are important, as are facial expressions, and ASL's pronoun system is largely spatial, with the handshape only indicating the type of pronoun (e.g. personal vs. possessive). Even if a piece of technology could reliably capture all of that, translating it into English would be another matter entirely.

While I suspect that was a joke, in theory Kinect would actually be a better platform upon which to build an ASL-to-English translator, if not for the fact that the skeletal data it provides does not include fingers.