Posted
by
samzenpus
on Wednesday October 04, 2006 @10:50PM
from the your-cat-wants-an-upgrade dept.

An AC writes, "NewScientistTech has a story about robotic whiskers capable of sensing shape and texture in a similar way to the whiskers of rats and seals. The 'bending moment,' or torque, exerted at the base of each whisker is used to extract feature information. The artificial whiskers could be used on interplanetary rovers, or could allow underwater vehicles to track moving objects by their wakes. Check out the slightly creepy video of them stroking a sculpted face."

Well thank fuck someone came out and said it. I'm a Catholic but for anyone to shove it down another's throat (even in flaming) is unacceptable and does the Church more harm than good. Fucking zealous pricks.

(Deist) Thomas Paine's The Age of Reason argued that Nature is the only reliable scripture!

Actually, there's a whole field of "biomimetics" that recognizes that evolution has already solved a lot of engineering problems, giving us clues to things like ideal shapes for sails (birds' wings) and durable macro-scale materials (beehives). The trick to that field is figuring out which aspects of nature to imitate; for instance, the Wright Brothers studied birds but didn't feel compelled to build a true "ornithopter."

Basically, Jewish law (Yoreh De'ah, Hilchos Sefer Torah) forbids erasing G-d's name. So, many have accustomed themselves not to write G-d's name in Hebrew outright. This means leaving out a letter, using the wrong letter (e.g. daleth instead of hay), or using a dash between the letters, which is very common when the intent is not to write G-d's name, but to use a name that includes one. This is

I'm a Jew, and Rabbi Telushkin points out that placing a dash in God is based on a misreading of Torah. See "A Code of Jewish Ethics: Volume 1: You Shall Be Holy." I can't remember the exact passage, as it was from a library book. Although the reason you give is a different reason, partly.

Either he has it correctly, or he doesn't, and I do not believe that he does, as I have mentioned the reason. I can quote chapter and verse, Siman and S'if, for the relevant laws; I know why people do it in Hebrew, and I know that it has carried over into English.

Dragging a model head across the feelers without any feedback wasn't exactly a convincing demo.

I agree.

Plus they should have waited until that Chia-pet was fully developed before subjecting it to the rigors of a lab environment. I suspect that it will never be able to grow a uniform coat of sprouts now, thanks in large part to the creepy whiskers dragged across its face for (no doubt) hours at a time.

I think what you are suggesting is that there should be simultaneous video of (1) the whiskers sweeping over the sculpted head, and (2) the computer drawing the image generated by the whiskers. Is that what you mean by "feedback"? If so, you're right, that would be a more convincing video. However, the system doesn't yet operate in real time. Real-time operation wasn't our goal. Our goal was to illustrate the basic mechanical principle (bending moment alone gives you all the info you need, even in the presence of significant slip), and to demonstrate that this principle could work for both robots and rats (and seals, underwater).

The video posted here was intended to give an intuitive impression of the size of the whiskers compared to the head, the speed of the whiskers (currently slow, but that could be changed), and the extent to which the whiskers "slip" when they hit the head. The fact that the whiskers slip so much makes feature extraction really difficult, especially with no force sensors.

Thus, while I understand that you're disappointed that we didn't have real-time image extraction, I take issue with the epithet "lame" as applied to our video. ;-) Real-time extraction wasn't the point of the paper. But thanks for your comments -- always interesting to hear different perspectives.
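For anyone curious how contact location could fall out of bending moment alone, here's a toy back-of-the-envelope sketch (my own model, not the authors' code; material constants are made up): for a flexible cantilever whisker rotated through a small angle theta against a rigid point at radial distance r from its base, Euler-Bernoulli beam theory gives M = 3*E*I*theta/r, so r can be recovered from the slope of moment versus sweep angle.

```python
import numpy as np

# Assumed (illustrative) whisker properties -- not from the paper:
E = 3.0e9        # Young's modulus of the whisker material (Pa)
I = 7.85e-17     # second moment of area for a ~100 um radius rod (m^4)
r_true = 0.05    # actual radial contact distance from the base (m)

# Simulate the base bending moment over a small-angle sweep:
#   M = 3*E*I*theta / r   (small-deflection cantilever, point contact)
theta = np.linspace(0.0, 0.05, 50)        # sweep angles (rad)
moment = 3.0 * E * I * theta / r_true     # base bending moment (N*m)

# Recover the contact distance from the measured moment-vs-angle slope
slope = np.polyfit(theta, moment, 1)[0]   # dM/dtheta
r_est = 3.0 * E * I / slope               # r = 3*E*I / (dM/dtheta)

print(round(r_est, 4))                    # recovers ~0.05 m
```

Real whiskers taper and deflect beyond the small-angle regime, so this linear relation is only a first approximation of the sensing problem the video illustrates.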

I don't think realtime data analysis is necessary to have a good demo. But when you're showing off the technology, it's got to be more than just dragging a couple wires across a dummy. I recommend taking down the "demo" as it serves no purpose except to make people question the project and the people behind it. Better to have a good paper and let the audience imagine how terrible the technology is than to give a demo that removes all doubt.

Remember that the media puts a very different spin on things than scientists and engineers might if they were presenting the work. We have virtually no control of the spin that any given website chooses to put on the article. Apparently this website has given the impression that the video is a "demonstration" of the technology. If I were presenting this video to an audience, I would not say that this was a demo, but rather that it is an illustration of how difficult the sensing problem is. Then I would explain our algorithm in some detail, describing how we overcame the obstacles exemplified in the video.

So when you say that the video has no purpose -- I think that is a result of the website not explaining the video correctly. The primary purpose of the video is to illustrate how hard the sensing problem is. How would you cope with the whiskers slipping and sliding all over the object, if you only had a sensor at the base of the whisker? It's a hard problem!

If the video is interpreted as a "demo," then it is a better demo if you look at the computer-generated image that the whiskers were able to extract. See the figure in the original article. The head on the left is the original sculpture. The image on the right is the sculpture as reconstructed by the whiskers.

You are absolutely correct that it's important to have a good paper when showing any scientific work. The peer-reviewed paper will appear in Nature tomorrow.

Yes, the video should be viewed in the context of the final image it reconstructs. Once you see the final image - shown on the article page for the 99% of readers who clicked on the video but didn't RTFA (a hyperlink labeled "video" being the /. equivalent of "ooh, shiny button") - it is pretty damn impressive that the information can be extracted from such "crude"-looking stroking. Although it's somewhat eerie that the original sculpture looks vaguely "female", whereas the reconstructed image looks vaguely "male".

Thanks for posting - it is a really interesting concept and I'm glad you're doing the work (though if you need another test-dummy, count me out please). I hope you'll post another article once you've got a shiny real-time working model that maps onto a monitor - this is slashdot after all. :-P

I have a question though. Is the length of the whiskers pre-defined in constructing the image? Mammal whiskers are always growing, falling out, getting clipped (burned, in my cat's case - "ooh, that candle looks s

God, this is the most interesting and intelligent discussion I've ever seen on Slashdot! More articles by researchers please, Editors!

The change in cross-section hadn't occurred to me - which just goes to show how long it's been since I did Civil Engineering! Biologically speaking I suspect the shape of whiskers is a combination of the fact that they're specially adapted hairs, and the importance of keeping whiskers light and flexible. My structural mechanics is rusty, but I think a rigid, fixed-cross-sect

These whiskers tie in with existing research into artificial skin that can "feel." This 2005 NASA article [nasa.gov] describes mecha-skin that uses IR sensors to detect touch. Japanese researchers [bbc.co.uk] (2005) reported having a type that senses temperature and pressure through actual touching.

The skin research should be useful both for robotics and for replacement parts for humans, as an alternative to the clunky biological hand transplants that have been carried out. (I think I'd rather have a Luke Skywalker robot hand than a mismatched corpse's!) These artificial hand researchers [lucs.lu.se] will probably be interested as well, because having a prosthesis that can be sensed as well as controlled is necessary for it to be as good as the original. The big issue is how easy it will be to get these touch signals into the human nervous system in a useful way. For robots, the data can be built into existing software for making maps of a robot's surroundings. I picture a robot rat running a maze with a set of these whiskers. Won't whiskers serve as a low-energy-cost alternative to sonar and other sensing systems?
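To make the "robot rat running a maze" idea concrete, here's a toy sketch (entirely my own assumption, not from the article) of how whisker contact points could feed a standard occupancy-grid map, the same way sonar or lidar returns usually do:

```python
import numpy as np

# Toy occupancy grid: 20x20 cells, 5 cm per cell (arbitrary choices)
GRID = 20
CELL = 0.05

grid = np.zeros((GRID, GRID), dtype=bool)

def record_contact(x, y):
    """Mark the cell containing a whisker contact point (in metres) occupied."""
    i, j = int(round(y / CELL)), int(round(x / CELL))
    if 0 <= i < GRID and 0 <= j < GRID:
        grid[i, j] = True

# A robot rat brushing its whiskers along a wall at y = 0.30 m
for x in np.arange(0.10, 0.50, 0.05):
    record_contact(x, 0.30)

print(int(grid.sum()))  # 8 occupied cells trace out the wall
```

Since each whisker contact is just an (x, y) point in the robot's frame, the same map-building code that consumes rangefinder data could consume whisker data, at far lower energy cost.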

The odd thing is that here, the research is not into copying human abilities, but those of (nonhuman) animals. I wrote a silly article [anthrozine.com] arguing that future robots will be made to resemble animals, not humans, and Charles Van Doren (in A History of Knowledge) predicted "warm and fuzzy" robotics. Is that where we're headed?

Why don't you try posting some constructive comments and putting the link to that page in your signature (where it may actually be regarded as a useful link and not spam)? Anyone with any morals would now not use that site just because of your flagrant advertising, and anyone who actually needs it hopefully knows how to use Google to find a storage site, or even just set up their own hosting.

Man this is old stuff. I created whiskers for an Atari 400 computer way back in the early 80s, when I was a senior in high school. The whiskers could detect a ball, box, wrench, dog bone, and pencil. Basically the whiskers were conductive and protruded through a conductive metal plate with circular holes for the whiskers to poke through. Normally the circuit was open. When an object brushed against the sensors it would close the circuit and I would detect a pattern. I had many different patterns stored for each object.
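In modern terms that scheme sounds like template matching on binary contact patterns. Here's a rough sketch of the idea as I read it (my guess, not the original Atari code; the patterns are invented): each whisker is an on/off switch, an object yields a bit pattern, and the nearest stored template by Hamming distance wins:

```python
# Invented example templates for a 12-switch whisker array (3 rows of 4)
TEMPLATES = {
    "ball":   0b0110_1111_0110,
    "pencil": 0b0000_1111_0000,
    "wrench": 0b0110_0110_1111,
}

def hamming(a, b):
    """Number of switches that differ between two contact patterns."""
    return bin(a ^ b).count("1")

def classify(pattern):
    """Return the stored object whose contact pattern is closest."""
    return min(TEMPLATES, key=lambda name: hamming(TEMPLATES[name], pattern))

# A slightly noisy "pencil" reading (one switch failed to close)
print(classify(0b0000_1101_0000))  # pencil
```

Storing several patterns per object, as the poster describes, would just mean taking the minimum distance over all templates for each object.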

The system described isn't quite the same... the innovation is that they are using 2-D torque sensors on the whiskers, and apparently are able to reconstruct the 3-D surface of the object whiskered using the data... that's a rather major improvement over just detecting a profile with on/off switches! ;-)

They should have improved on it, man, I did this 27 years ago! I wasn't going after 3D surface reconstruction, I was going after object recognition and a self-learning application. That is much more than just detecting a profile.