Engadget RSS Feed
http://www.engadget.com
Copyright 2015 AOL Inc. The contents of this feed are available for non-commercial use only.

http://www.engadget.com/2014/10/31/scientists-can-make-your-inner-monologue-audible/

When you hear someone else speak, specific neurons in your brain fire. Brian Pasley and his colleagues at the University of California, Berkeley discovered this, and found that those neurons each appear to be tuned to specific sound frequencies. So Pasley had a thought: "If you're reading text in a newspaper or a book, you hear a voice in your own head" -- so why couldn't we decode that internal voice simply by monitoring brain activity? It's similar to the idea that led to the creation of BrainPort, which lets you "see" with your tongue. Your eyes, ears or vocal cords don't really do the heavy lifting; your brain does. And if you can give the brain another source of input or output, you might be able to train it to approximate a lost ability like speech.

Engadgeteers spend a lot of their day staring at a screen, so it's no surprise that nearly all of us are blind without glasses or contact lenses. But wouldn't it be great if we could give our eyes a break and just stare at the screen without the aid of corrective lenses? That's the idea behind an experimental display that automatically adjusts itself to compensate for your lack of ocular prowess, enabling you to sit back and relax without eyewear. It works by placing a light-filtering screen in front of a regular LCD display that breaks down the picture in such a way that, when it reaches your eye, the light rays are reconstructed as a sharp image. The prototype and lots more details about the method will be shown off at SIGGRAPH next month, after which, its creators, a team from Berkeley, MIT and Microsoft, plan to develop a version that'll work in the home and, further down the line, with more than one person at a time.

Feeling smug about those brand-name cans you just bought? A pair of researchers from Berkeley just made 'em obsolete with some graphene. Conventional gear needs an oscillator that has to be damped to produce a constant sound between 20Hz and 20kHz. Graphene, on the other hand, can be tailored to do the same job without any complicated, power-draining over-engineering. Qin Zhou and Alex Zettl found their power-sipping setup to be as good as, if not better than, the pair of Sennheisers they tested it against. We're hopeful that graphene headphones aren't too far away, assuming Fiddy doesn't get to the pair first and shut 'em down.

Viruses are the swarming bullies of biology, but it turns out their alarming self-replication could one day power your iPod. We've seen them in batteries before, but researchers at Berkeley Lab have now coated electrodes with modified M13 bacteriophage, a harmless bacteria-eating virus, to create the first organic piezoelectric material -- one that can convert force into electricity. The team explained that such a substance would be non-toxic, organize naturally into thin layers and self-regenerate, giving it a possible advantage over chemical options. In theory, a thin film of it attached to your shoes could generate power as you walk, lending volts to the myriad electronics we pack around nowadays. To see a finger-powered video demo of our frequent enemies making themselves useful for a change, stroll on past the break.
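For a sense of scale on that shoe-film idea, here's a back-of-the-envelope estimate. Every number below is our own illustrative assumption, not a figure from the research:

```python
def step_energy_joules(force_n=700.0, deflection_m=0.001, efficiency=0.05):
    """Energy harvested per footstep: mechanical work (force x deflection)
    times an assumed piezoelectric conversion efficiency.

    Assumes a ~70 kg walker, 1 mm of film compression, 5% conversion.
    """
    return force_n * deflection_m * efficiency

def walking_power_watts(steps_per_second=1.5):
    """Average harvested power at a steady walking cadence."""
    return step_energy_joules() * steps_per_second

# Roughly 0.035 J per step and ~0.05 W while walking: trickle-charge
# territory, which is why self-assembly and low cost matter more here
# than raw output.
```

Even under generous assumptions, the point is that a viral piezo film is a trickle charger, not a power plant.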

Tue, 15 May 2012 17:41:00 -0400

http://www.engadget.com/2011/05/16/paralyzed-student-uses-robotic-exoskeleton-to-walk-at-college-gr/
Austin Whitney hasn't been able to walk since a 2007 car crash left him paralyzed, but on Saturday the 22-year-old triumphantly strode across the stage to accept his degree from UC Berkeley. He had a little help, in the form of a specially crafted robotic exoskeleton developed by Berkeley engineering professor Homayoon Kazerooni. Kazerooni and his team designed the exoskeleton with lightness and affordability in mind, resisting the urge to load it up with expensive hardware and instead tethering the mechanized walker to a backpack housing a computer and a rechargeable, eight-hour battery. As a result, the Austin walker won't enable the kind of acrobatic leaps that would make Lt. Rasczak proud, but its reduced mobility comes at a reduced cost of just $15,000. That's certainly not an impulse buy, though it's a welcome alternative to other exoskeletons that retail for $100,000 or more. Walk past the break for a video of Whitney's momentous steps, along with a clip of Kazerooni describing his creation.

Mon, 16 May 2011 08:45:00 -0400

http://www.engadget.com/2011/02/07/scientists-grow-nanolasers-on-silicon-chips-prove-microscopic-b/
What you see above may look like a nanoscale Obelisk of Light, ready to protect the tiny forces of Nod, but that's not it at all. It's a nanolaser, grown directly on a field of silicon by scientists at Berkeley. The idea is to use light to transmit data inside computers, rather than physical connections, but until now there was no way to generate that light on a small enough scale to work inside circuitry without damaging it. These indium gallium arsenide nanopillars could solve that problem, grown on and integrated within silicon without doing harm. Once embedded, they emit light at a wavelength of 950nm, as shown in the video below.

Mon, 07 Feb 2011 08:53:00 -0500

http://www.engadget.com/2011/01/14/worlds-first-room-temperature-semiconductor-plasmon-nanolaser-c/

We're big proponents of the idea that everything is better with lasers, and a team of researchers at UC Berkeley has created a new type of semiconductor plasmon nanolaser, or spaser, that could eventually find a home in many of your favorite devices. The big breakthrough is that Berkeley's spaser operates at room temperature -- previous spasers could only sustain lasing at temperatures below -250°C -- enabling its use in commercial products. Plasmon lasers work by amplifying surface plasmons, which can be confined to a much smaller area than the light waves amplified by conventional lasers. This allows for extreme miniaturization of optical devices for ultra-high-resolution imaging, high-sensitivity biological sensors, and optical circuits 100 times faster than the electronic variety. There's no word on how soon the technology will be commercially available, so you'll have to wait a bit longer for your first laser computer.

Fri, 14 Jan 2011 01:33:00 -0500

http://www.engadget.com/2010/09/13/uc-berkeley-researchers-craft-ultra-sensitive-artificial-skin-r/

Researchers and engineers have been toiling on synthetic skins for years now, but most of 'em have run into one major problem: organic materials are poor semiconductors. In other words, older skins have required high levels of power to operate, and those using inorganic materials have traditionally been too fragile for use on prosthetics. Thanks to a team of researchers at UC Berkeley, though, we're looking at a new "pressure-sensitive electronic material from semiconductor nanowires." The new 'e-skin' is supposedly the first such material made from inorganic single-crystalline semiconductors, and at least in theory, it could be widely used in at least two applications. First off, robots could use this skin to accurately determine how much force should be applied (or not applied, as the case may be) to hold a given object. Secondly, this skin could give touch back to those with artificial hands and limbs, though that would first require "significant advances in the integration of electronic sensors with the human nervous system." Dollars to donuts this gets tested on the gridiron when UCLA and / or Stanford comes to town.

]]>
Mon, 13 Sep 2010 12:37:00 -0400

http://www.engadget.com/2010/08/11/laser-backpack-creates-instant-3d-maps-venkman-reminds-you-to-n/
Total protonic reversal? Small price to pay for an instantaneous 3D scan of a building's interior. That's what the backpack pictured above delivers -- a project from UC Berkeley students and faculty Matthew Carlberg, Avideh Zakhor, John Kua, and George Chen. The pack contains a suite of laser scanners and positional sensors that let it capture images of building interiors as a fleshy assistant roams the halls. Those images can then be automatically pieced back together into a 3D representation. We're having visions of instant Doom II WADs, but the real boon here could be an extension to Google Maps where you could get not only a Street View but also an interior view. You know, really scope out that little Thai joint before you schlep yourself all the way downtown.
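The core of that reconstruction -- using the pose sensors to drop each laser sweep into a shared coordinate frame -- can be sketched in a few lines. This is our own simplified 2D illustration, not the Berkeley team's code; the function names and sample poses are invented:

```python
import numpy as np

def scan_to_points(ranges, angles):
    """Convert one laser sweep (polar ranges/angles) to x,y points
    in the sensor's own frame."""
    return np.stack([ranges * np.cos(angles), ranges * np.sin(angles)], axis=1)

def to_world(points, pose):
    """Rigidly transform sensor-frame points into the world frame.

    pose = (x, y, theta) as reported by the backpack's positional sensors.
    """
    x, y, theta = pose
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return points @ rot.T + np.array([x, y])

# Two wall hits seen from a pose 5 m down the x-axis; merging many such
# sweeps builds up the full interior point cloud.
scan = scan_to_points(np.array([1.0, 2.0]), np.array([0.0, np.pi / 2]))
cloud = to_world(scan, (5.0, 0.0, 0.0))
```

In practice successive scans also get registered against each other (the "automatically pieced back together" step), but the pose-driven transform above is what turns raw range data into a common 3D map.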

We've already seen Willow Garage's PR2 robot learn to roam offices in search of a power outlet, and it looks like some researchers at UC Berkeley have now helped it pull off its most impressive feat yet: folding towels. That may not sound like too hard a task, but it's actually proven quite a conundrum for robotic laundry researchers, since a robot must first pick up a towel from a pile and then somehow determine that this previously unseen shape is, in fact, a towel that can be folded. While it's still a long way from being the Roomba of laundry, the PR2 bot can now fold at the blistering speed of 25 minutes per towel, and the researchers are hopeful that the same computer vision-based approach can be applied to a range of other tasks that have previously stumped robots. Head on past the break for the video -- don't worry, it's sped up.

Mon, 05 Apr 2010 22:21:00 -0400

http://www.engadget.com/2009/07/25/monkeys-and-scientists-develop-persistent-plug-and-play-contro/
While we've seen some pretty amazing things from computers jacked into human and monkey brains, such systems have so far had to be re-learned by their subjects each session. In a new development, researchers at Berkeley have managed to get their monkeys to develop a "memory" for the controls and recall it instantly each day. To do this, the scientists tracked the same specific neurons from day to day -- a little tough to do, but obviously worth the hassle. It's good news for future brain-to-computer interfaces that will enable the disabled (and the truly lazy) to perform tasks and kick ass through the mere power of thought, but we're a little afraid of giving these monkeys too much in the way of internet access: the world doesn't need another 4chan.

Sat, 25 Jul 2009 16:07:00 -0400

http://www.engadget.com/2009/07/22/cellscope-the-cellphone-microscope-gets-uv-upgrade-to-spot-tin/
It was over a year ago that UC Berkeley introduced the world to CellScope, the 60x microscope for cellphones made from cheap, off-the-shelf components (like a re-purposed belt clip). Now, even though we're disappointingly still not seeing this thing in stores, there's an upgraded version able to take pictures of even smaller nasties. Using a filter, the scope can now spot microscopic critters tagged with fluorescent dye -- things like Mycobacterium tuberculosis (that's the cause of TB if you, like us, lack a med degree). A software app can then count the number of cells within a given sample and tell you whether to worry about that annoying cough. There's still no word on whether this product will ever actually start scoping out such things in the wild, but we certainly hope it will -- if only so that we can keep our vast collection of cellphone accessories complete. Video after the break.
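That counting step boils down to finding connected bright blobs in the image: dyed cells fluoresce, so each cluster of bright pixels is one candidate cell. Here's a rough sketch of the idea -- our own illustrative code, not the actual CellScope app:

```python
def count_cells(image, threshold):
    """Count bright blobs (candidate cells) in a 2-D grayscale image.

    Flood-fills 4-connected regions of pixels above `threshold`;
    each region counts as one cell.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if image[i][j] > threshold and not seen[i][j]:
                count += 1              # found a new blob
                stack = [(i, j)]
                seen[i][j] = True
                while stack:            # flood-fill the rest of it
                    y, x = stack.pop()
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] > threshold and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# Two bright clusters -> two "cells".
sample = [
    [0, 1, 1, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
]
```

A real pipeline would add noise filtering and size limits so dust and debris don't get counted, but connected-component labeling is the heart of it.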

Wed, 22 Jul 2009 09:23:00 -0400

http://www.engadget.com/2009/04/30/quest-for-invisibility-cloaks-revisited-by-two-research-groups/
http://www.engadget.com/2009/04/30/quest-for-invisibility-cloaks-revisited-by-two-research-groups/http://www.engadget.com/2009/04/30/quest-for-invisibility-cloaks-revisited-by-two-research-groups/?utm_source=Feed_Classic&utm_medium=feed&utm_campaign=Engadget#comments
After a brief period of no news, it's time to revisit the world of invisible cloaks. Inspired by the ideas of theoretical physicist John Pendry at Imperial College London, two separate groups of researchers, from Cornell University and UC Berkeley, claim to have prototyped their own cloaking devices. Both work essentially the same way: the object is hidden by mirrors that look entirely flat thanks to tiny silicon nanopillars that steer reflected light in such a way as to create the illusion. It gets a bit technical, sure, but hopefully at least one of these projects will yield a video presentation that's sure to make us downright giddy.

Thu, 30 Apr 2009 05:26:00 -0400

http://www.engadget.com/2009/01/29/cyborg-beetles-commandeered-for-test-flight-laser-beams-not-ye/
Remember that DARPA initiative from a few years back to create cyborg insects? With funding from the agency, researchers at the University of California, Berkeley have managed to control a rhinoceros beetle via radio signals, demonstrated in a flight test shown on video at this week's IEEE MEMS 2009 conference. A module placed on the arthropod uses six electrodes affixed to the brain and muscles to commandeer its free will. The device weighs 1.3g -- much less than the 3g payload these guys can handle, leaving enough wiggle room to attach sensors for surveillance. Ultimately, scientists say they want to use the beetle's own sensors -- namely, its eyes -- to capture intel, and its own body energy to power the apparatus. Keep an eye on this one; we expect it to play a major role in the impending robots vs. humans war.

Thu, 29 Jan 2009 03:02:00 -0500

http://www.engadget.com/2008/07/31/researchers-find-ways-to-squeeze-light-into-spaces-never-thought/

It looks like a team of UC Berkeley researchers led by mechanical engineering professor Xiang Zhang (pictured) has found a way to squeeze light into tighter spaces than ever thought possible, which they say could lead to breakthroughs in optical communications, miniature lasers, and optical computers. The key to the new technique is a "hybrid" optical fiber consisting of a very thin semiconductor wire placed close to a smooth sheet of silver, which effectively acts as a capacitor, trapping the light waves in the gap between the wire and the metal sheet and letting them slip through spaces as tiny as 10 nanometers (more than 100 times thinner than current optical fibers). That's as opposed to previous attempts that relied on surface plasmonics, in which light binds to electrons so it can travel along the surface of metal -- an approach that only proved effective over short distances. While all of this is still at the theoretical stage, the researchers seem to think they're on to something big, with research associate Rupert Oulton saying the new development "means we can potentially do some things we have never done before."

It's no surprise that more displays are always better, but when it comes to mimicking the act of reading a book, dual displays are a clear step forward. Researchers at the University of Maryland and UC Berkeley developed a prototype dual-face, modular e-book reader that lets readers fan the pages to advance through a book, or flip ahead via trackball. If you're doing some serious research, the displays separate from one another, allowing one to run in landscape mode while the other runs in portrait. To complete the book metaphor, the device can be folded over for more compact use, and a simple flip changes the page. The possibilities for future e-book readers are endless here, so we applaud Maryland and Berkeley for putting those research dollars to good use.

Thu, 26 Jun 2008 12:53:00 -0400

http://www.engadget.com/2008/03/25/intel-and-microsoft-fund-20m-grant-to-reinvent-computing-where/

Although both Microsoft's and Intel's R&D departments have been responsible for some nifty futuristic tech, the two companies got together last week and announced a $20M grant to two universities to "start over" and develop next-gen computing systems based around parallel processing. The grant will fund Universal Parallel Computing Research Centers at UC Berkeley, which is kicking in another $7M, and the University of Illinois at Urbana-Champaign, which is donating $8M of its own. According to Marc Snir, head of the UIUC lab, the goal is to make "parallelism so easy to use that parallel programming becomes synonymous with programming" -- an increasingly important priority as current multi-core processors aren't necessarily being fully utilized, and 100-core processors aren't far off. That leads us to wonder: what to do with all that newly unlocked processing power? Virtual-reality Facebook? Real-time visual augmentation? Finally being able to run Crysis? We know you've got ideas -- sound off in the comments!
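Snir's goal is easiest to see in the basic data-parallel pattern this research targets: split the work into independent chunks, process them concurrently, combine the partial results. A minimal Python sketch (our own toy example; a real CPU-bound workload would use processes or a parallel runtime rather than threads, since Python threads don't run bytecode in parallel):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    # Worker: each chunk is processed independently -- no shared state.
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    # Split the input, farm the chunks out, combine the partial results.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))
```

The same decomposition scales from 4 workers to 100 cores precisely because the chunks share no state -- and making that property routine rather than heroic is what the UPCRC money is chasing.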

Tue, 25 Mar 2008 20:02:00 -0400

http://www.engadget.com/2008/03/22/cellphone-as-microscope-on-the-cheap-bugs-beware/

Go tech students! This handy idea, brought to you by the minds at the University of California, Berkeley, adds up to a 60x microscope to your cellphone for roughly $75. The 60x attachment is useful for diagnosing things like malaria while in the field, while its weaker 5x sibling can be used to look at skin conditions. The prototype was apparently made from off-the-shelf components -- including some low-power LEDs that illuminate the subject -- and snaps in place with a modified belt clip. It's a pretty handy piece of kit when you consider how much easier it may be to snap a pic of something and fire it off to a lab over a data connection instead of having to physically deliver a sample. We're sure the poor soul featured in the pic we have here agrees, as it looks like he / she may have a tiny shrimp infestation.

Sat, 22 Mar 2008 23:58:00 -0400

http://www.engadget.com/2007/08/25/study-finds-no-link-between-car-accidents-and-yapping-whilst-dri/

If you're ready for a healthy dose of unconventional wisdom, you've come to the right place: a couple of confident graduate student economists at UC Berkeley are purporting that there is "no match in the evening cellphone use spike and crash data." Basically, the duo is suggesting that although we've been on the mobile horn a lot more these days, the number of fatal vehicular accidents over the past 18 years hasn't experienced the same leap. Weird logic, we know, so take it for whatever it is (or isn't) worth.

Sat, 25 Aug 2007 19:16:00 -0400

http://www.engadget.com/2007/05/05/berkeleys-respectful-surveillance-cameras-disregard-faces/

While blatantly spying on us is one thing, attempting to freshen it up by suggesting a venerating alternative is bordering on preposterous. As we've seen at the Sky Harbor airport, officials are trying nearly anything they can to make forthright invasions of privacy seem a bit less offensive, and a CCTV camera developed by researchers at the University of California, Berkeley is next up to bat. The so-called "respectful cameras" are aimed at places of employment, where specified workers would wear a given marker that could be recognized by the camera. After identifying a worker, the camera would blot out the face of the individual to provide some sort of false assurance that their identity is magically safe. The best -- er, worst -- part, however, is that the system doesn't actually delete the face beneath the oval; it "allows for the privacy oval to be removed from a given set of footage in the event of an investigation." So much for dodging Big Brother.

Sat, 05 May 2007 20:01:00 -0400

http://www.engadget.com/2007/04/20/craigslist-founder-hosts-webcam-for-mmo-birdwatching/
Those nights of jubilation spent wasting time on City of Heroes, World of Warcraft, and Ultima Online just aren't what they used to be, but in just three days, the wait for the world's next incredible MMO will be over. Alright, so maybe online birdwatching won't take the globe by storm, but researchers at UC Berkeley and Texas A&M will be watching intently as the MMO goes live from the back porch of Craig Newmark. The Craigslist founder will be hosting a "remotely controllable robotic video camera" on the back deck of his San Francisco domicile, and interested users can log on to discover and classify wild birds in the Sutro Forest. By utilizing a "collaborative control interface," dozens of users can reportedly share the webcam simultaneously, which uses "highly responsive algorithms that automatically compute the optimal camera viewpoint." Gamers can rack up points by snapping shots of rare birds and then seeing just how many users can correctly classify them, so we'd highly recommend brushing up on your aves knowledge in preparation for April 23rd.