neuroscience · http://www.scienceandentertainmentexchange.org/taxonomy/term/47/all

Creative Science · http://www.scienceandentertainmentexchange.org/blog/creative-science
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Imagine you’ve been working on a problem for days, maybe even weeks, but you can’t seem to figure it out. Your brain is working over solutions constantly but you feel stumped. So, you take a break. You walk down the street to the nearest coffee shop but as you’re walking home, sipping your drink and watching cars drive by, the solution rushes into your brain. <em>That’s it!</em></p>
<h2>Spontaneously Creative</h2>
<p><img class="imgRight" src="/sites/default/files/ah-ha-moments.jpg" border="0" alt="You can thank your basal ganglia for all those &quot;Ah-hah!&quot; moments." title="You can thank your basal ganglia for all those &quot;Ah-hah!&quot; moments." width="160" height="160" />In the above scenario, you aren’t thinking of the problem. You’re thinking of the coffee in your hand, the cars driving by, or the directions back to your house. That problem, that horribly frustrating problem, is solved almost out of thin air. What you’ve experienced is known as spontaneous and cognitive creativity. Your basal ganglia, a part of the brain involved in unconscious functions, took over your conscious brain’s efforts to solve the problem. Even while you were walking to the coffee shop and consciously paying attention to other stimuli (the coffee, the cars), your basal ganglia was working through the problem until it arrived at a solution. That’s the spontaneous part; the cognitive part is the need for prior knowledge. You can’t have a spontaneous solution to a biology problem, for example, if you know nothing about biology.</p>
<p>This type of creativity brings up an interesting framework for understanding why scientists and engineers experience a different type of creativity from those in the entertainment industry. It’s not that the old myth of analytical, non-creative scientists and engineers is true – it’s just a different way of being creative (and we’ve got the science to back up that claim!).</p>
<h2>Four Ways to Be Creative</h2>
<p><img class="imgRight" src="/sites/default/files/Thomas_Edison.jpg" border="0" alt="Thomas Edison is a great example of a deliberate and cognitive thinker." title="Thomas Edison is a great example of a deliberate and cognitive thinker" width="145" height="184" />Neuropsychologist Arne Dietrich studies the neuroscience of creativity, among other topics like consciousness. His research segments creativity into four types: deliberate and cognitive, deliberate and emotional, spontaneous and cognitive, and spontaneous and emotional. The type most associated with scientists and engineers is deliberate and cognitive, which equates to sustained focus in an area, such as a neuroscientist studying the brain. </p>
<p>Deliberate and cognitive creativity comes from the prefrontal cortex, the part of the brain located behind your forehead. It is the area responsible for focused attention and for forming connections between information stored in the brain. You can see how this might relate to a scientist or engineer: deliberate focus, cognitive knowledge of a subject area, and the ability to form connections between information.</p>
<p>On the other hand, artists, musicians, writers, and those in other traditionally creative careers are associated with spontaneous and emotional creativity. This type of creativity stems from the amygdala, the part of the brain that processes basic emotions. In between are spontaneous and cognitive creativity (explained above) and deliberate and emotional creativity, which is akin to working through emotional thoughts or issues. Working in one mode of creativity does not preclude you from the others. A scientist might conduct research through deliberate and cognitive creativity, but experience spontaneous and emotional creativity while engaged in other creative endeavors. (In fact, we know quite a few scientists and engineers who are also artists or musicians.)</p>
<p>Or, when a problem can’t be solved by deliberate and cognitive creativity, the brain can use spontaneous and cognitive creativity to find a solution. <em>The Big Bang Theory</em>’s Sheldon Cooper found this out during the episode "The Einstein Approximation." Everyone’s favorite fictional theoretical physicist found himself stumped on a problem, and nearly drove himself mad by trying to solve it with deliberate and cognitive creativity. But a short stint as a waiter at The Cheesecake Factory, where he was able to focus on other tasks, freed his basal ganglia to work toward a solution.</p>
<p>Sheldon also discovered another way to solve a problem in the episode: sleep. When you can’t think of a solution, studies have shown that REM sleep can help you figure it out!</p>
<p> </p>
<p>Homepage image credit: <a href="http://www.flickr.com/photos/opensourceway/4371001192/" target="_blank">opensourceway</a></p>
<p>Top image credit: <a href="http://qpemfacilitators.wordpress.com" target="_blank">qpemfacilitators.wordpress.com</a></p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/creativity" typeof="skos:Concept" property="rdfs:label skos:prefLabel">creativity</a></li><li class="taxonomy-term-reference-1"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li></ul></div>
Thu, 22 Dec 2011 19:13:32 +0000 · rummykhan · http://www.scienceandentertainmentexchange.org/blog/creative-science#comments

Memories: It’s All in Your Head · http://www.scienceandentertainmentexchange.org/blog/memories-it%E2%80%99s-all-your-head
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p><img class="imgRight" src="/sites/default/files/60573_0.jpg" border="0" alt="Memories are consolidated from short-term to long-term in the hippocampus." title="Memories are consolidated from short-term to long-term in the hippocampus." width="200" height="205" />Forgetting is as simple as walking through a doorway – that is the finding of a new study testing how moving through a home affects memory. Researchers asked participants to complete a simple task (exchanging one object for another) either in the same room or after walking through a doorway into another room. The result: people asked to complete the task in the other room were two to three times more likely to forget what they were supposed to do. “Entering or exiting through a doorway serves as an ‘event boundary’ in the mind, which separates episodes of activity and files them away,” <a href="http://www.lifeslittlemysteries.com/walking-doorways-forgetting-2192/" target="_blank">said</a> lead researcher Gabriel Radvansky, a psychologist at the University of Notre Dame. “Recalling the decision or activity that was made in a different room is difficult because it has been compartmentalized.”</p>
<p>As the doorway study suggests, memories can be fleeting. One moment, you are walking into the kitchen to grab a coffee mug. The next, you are standing in the kitchen, wondering why you walked into the room. But even if you recall a memory, there is no guarantee it is accurate. “Human memory is not like a computer where information is stored and retrieved in an identical fashion,” said Mary Spiers, a neuropsychologist at Drexel University. “Our memories are more constructive and changeable.” In fact, Spiers told us that when we recall a memory over and over again, it can change the original memory.</p>
<h2>Making Memories</h2>
<p>To understand why memories are fragile and changeable, you need to know how they are stored in the brain. It is a common misconception that memories are stored as single units, like files in a computer. “Memories do not exist as packaged, isolated things stored somewhere in the brain,” explained Ricardo Gil da Costa, a neuroscientist at the Salk Institute. Instead, he told us, a memory is “represented by a neural network that includes multiple brain areas.” For example, a memory of a cat would be distributed among multiple areas of the brain: the visual of the cat is processed in the visual cortex, the sound of its meow in the auditory cortex, the emotional associations in the limbic system, and so on.</p>
<p>The processed information is then sent to the hippocampus, a part of the brain belonging to the limbic system, where it is consolidated into memory. Normally, memories are stored in the areas of the brain where they were processed (so the visual of a cat is stored in the visual cortex, and so on). But consolidation can be a lengthy process. “Consolidation depends on neuronal changes,” explained Spiers, “and this may take hours, days, weeks, or months.”</p>
<h2>The Man Without a Hippocampus</h2>
<p><img class="imgRight" src="/sites/default/files/220px-Henry_Gustav_1.jpg" border="0" alt="Henry Molaison, known as H.M., had two-thirds of his hippocampus removed in 1953." title="Henry Molaison, known as H.M., had two-thirds of his hippocampus removed in 1953." width="220" height="167" />During this consolidation, memories can be lost or damaged by neurological injury. Or, in the case of Henry Molaison, famously known as H.M., memories can be lost by the removal of the hippocampus. In an attempt to lessen debilitating seizures, doctors removed two-thirds of Molaison’s hippocampus. The surgery did alleviate Molaison’s seizures, but it had another curious side effect: he could not form new memories. Molaison could remember everything up until a year or two before the surgery, but afterward his memories lasted only a short time.</p>
<p>His condition might remind you of Dory from <em>Finding Nemo</em>, the animated regal tang fish with short-term memory loss. “She has a problem in remembering anything for more than a minute or so,” said Spiers. “Neuropsychologists term this ‘anterograde amnesia.’ It is a difficulty in encoding, and therefore learning new things.” Popular films, like <em>The Bourne Identity</em>, rely on a different type of amnesia: retrograde amnesia, where information before an event (like a head injury) is lost. Some films take it to the extreme, where a character cannot remember who he is, like Jason Bourne, but this is not an accurate depiction of retrograde amnesia. A person with amnesia would not lose memories of his name or identity and retrograde amnesia can usually be explained, Spiers said, as “a problem of incomplete consolidation of recently learned memory.”</p>
<p>Even though Molaison lost his ability to consolidate memories from short term to long term, he could still develop new motor skills. In one experiment, he was asked to draw within the lines of a star shape while looking in a mirror. It is a difficult task, and researchers believed his anterograde amnesia would make him lose this newly gained skill by the next day. But he did not. As you can see, there is more to memory than we can explain in an article (or two or three), but if you have any questions, leave them in the comments and we will try to follow up!</p>
<p>Scientists are still experimenting with memory, and even though Molaison died in 2008, his contribution to science lives on: his brain, cut into slices, resides at the University of California, San Diego, for further study.</p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/memory" typeof="skos:Concept" property="rdfs:label skos:prefLabel">memory</a></li><li class="taxonomy-term-reference-1"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li><li class="taxonomy-term-reference-2"><a href="/blog-tags/science" typeof="skos:Concept" property="rdfs:label skos:prefLabel">science of</a></li></ul></div>
Fri, 16 Dec 2011 05:00:00 +0000 · rummykhan · http://www.scienceandentertainmentexchange.org/blog/memories-it%E2%80%99s-all-your-head#comments

Light-Up Neurons Are Fireworks in Your Brain · http://www.scienceandentertainmentexchange.org/blog/light-neurons-are-fireworks-your-brain
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>File this under “science we’d love to see onscreen.” Researchers at Harvard University <a href="http://news.harvard.edu/gazette/story/2011/11/guiding-lights/" target="_blank">genetically altered neurons to light up </a>as they fire. Imagine, for a minute, your brain covered in bursts of light, like a fireworks show under your skull. </p>
<p>The researchers altered brain cells with a virus containing a gene from a Dead Sea organism. The gene produces a protein that, when exposed to an electrical signal, fluoresces. The virus introduces the gene to the brain cells, which are cultured in a lab, causing the cells to produce the protein, which lights up as the neurons fire.</p>
<p><img src="/sites/default/files/light_up_neurons_605.jpg" border="0" title="Light-up neuron" width="400" height="266" style="display: block; margin-left: auto; margin-right: auto;" /></p>
<p>“The way a neuron works is it has a membrane around the whole cell, sort of like a wire and insulation, except in a neuron the membrane is an active substance. Normally, the inside of the cell is negatively charged relative to the outside,” explained Adam Cohen, the study’s lead researcher. “When a neuron fires, the voltage reverses for a very short time, about one-one-thousandth of a second. This brief spike in voltage travels down the neuron and then activates other neurons downstream. Our protein is sitting in the membrane of the neurons, so as that pulse washes over the proteins, they light up, giving us an image of the neurons as they fire.”</p>
<p>The research has implications for observing how signals spread through the brain, how drugs affect neural pathways, and the study of conditions with genetic components, such as heart disease or depression. Researchers also hope to be able to trace these signals in living animals.</p>
<p>But all we can imagine is a futuristic lab where doctors diagnose disease by measuring bursts of light in a brain. You have to admit, it is much more visually arresting than x-rays or CT scans.</p>
<p> </p>
<p>Image credit: <a href="http://news.harvard.edu/gazette/story/2011/11/guiding-lights/" target="_blank">Adam Cohen</a></p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li></ul></div>
Thu, 08 Dec 2011 11:05:49 +0000 · rummykhan · http://www.scienceandentertainmentexchange.org/blog/light-neurons-are-fireworks-your-brain#comments

Annoyed? Blame Your Brain · http://www.scienceandentertainmentexchange.org/blog/annoyed-blame-your-brain
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Hey, wanna hear the <a href="http://www.youtube.com/watch?v=0cVlTeIATBs&amp;feature=related" target="_blank">most annoying sound in the world</a>? Harry and Lloyd from <em>Dumb &amp; Dumber</em> are clueless about almost everything around them – except a surprising expertise in how to be annoying. Lloyd’s “most annoying sound in the world” hits all the right, aggravating notes; it’s unpleasant, distracting, difficult to ignore, and you do not know when it will stop – it’s pure obnoxiousness. How could it not be annoying?</p>
<p><img class="imgRight" src="/sites/default/files/images/annoying-cs.jpg" border="0" width="101" height="149" />Well, since you asked, a new book has the exact answer you need. Written by NPR Science Correspondent <a href="http://www.npr.org/people/2101004/joe-palca" target="_blank">Joe Palca</a> and Science Friday’s <a href="http://www.sciencefriday.com/blog/author/FloraLichtman/" target="_blank">Flora Lichtman</a>, <em>Annoying: The Science of What Bugs Us</em> explores the scientific findings behind what irks us and why. From loud cell phone conversations to your spouse’s grating habits, Palca and Lichtman uncover the fascinating science behind being annoyed.</p>
<p>So, how could Lloyd’s screaming not annoy you? If you were missing a specific part of your brain, the cingulate cortex, the “most annoying sound in the world” might not bother you at all. The cingulate cortex is involved in forming and processing emotion, so without it you cannot form an emotional response to Lloyd’s “most annoying sound in the world.” How do we know this? Patients who underwent a cingulotomy (removal of the cingulate cortex) to treat an obsession were able to ignore the obsession post-surgery. In addition, a University of Southern California study that examined the brains of annoyed participants in an MRI machine found a positive correlation between blood flow to the cingulate cortex and level of anger.</p>
<p><img class="imgLeft" src="/sites/default/files/images/cingul%20cs.gif" border="0" alt="Say hello to the reason you feel annoyed: the cingulate cortex (highlighted in red)." width="200" height="150" />Those studies suggest the cingulate cortex plays an important role in annoyance, but definitive research has yet to be done. So, for fun, let’s keep your cingulate cortex intact and start removing two other portions of your brain: the hippocampus, an area of the brain crucial for consolidating memories from short- to long-term, and the amygdala, the section of the brain responsible for forming and retaining emotional memories. Say you are sitting in a quiet café reading a book and the person next to you sucks on their straw loudly. At first, it does not bother you – you assume the person will stop – but then the person keeps doing it and the sound becomes more and more annoying until you cannot even attempt to read your book. Without the hippocampus, you would not remember that the straw sound even annoyed you. But, and here’s the interesting part, you would feel annoyed if you still had your amygdala intact. With your cingulate cortex in place, you would become annoyed and your amygdala would translate your annoyance into an emotional memory. So, even if you believed it was the first time you heard that sound, you would feel annoyed. Cutting out the hippocampus and amygdala might even stop you from getting annoyed altogether – as long as you cannot form an emotional memory and you forget the straw sound before you become annoyed, you can read your book in peace.</p>
<p>The brain is a small focus of <em>Annoying: The Science of What Bugs Us</em> – there’s more to what irritates us. Annoyance is a complex mix of biology, psychology, chemistry, and other science disciplines – and even if understanding it does not solve those daily annoyances, at least you will understand why.</p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li><li class="taxonomy-term-reference-1"><a href="/blog-tags/annoyance" typeof="skos:Concept" property="rdfs:label skos:prefLabel">annoyance</a></li><li class="taxonomy-term-reference-2"><a href="/blog-tags/science" typeof="skos:Concept" property="rdfs:label skos:prefLabel">science of</a></li><li class="taxonomy-term-reference-3"><a href="/blog-tags/biology" typeof="skos:Concept" property="rdfs:label skos:prefLabel">biology</a></li></ul></div>
Thu, 05 May 2011 18:52:38 +0000 · rummykhan · http://www.scienceandentertainmentexchange.org/blog/annoyed-blame-your-brain#comments

Science of Cyborgs · http://www.scienceandentertainmentexchange.org/blog/science-cyborgs
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p><img class="imgLeft" src="/sites/default/files/images/Cyborg_41.jpg" border="0" alt="Michel Maharbiz answers an audience member's question during the Q&amp;A session. " />A beetle is flying through the air, wings buzzing as it moves forward, and then – suddenly – it falls to the ground. Then the wings start up again, the beetle is back in the air – then again, the wings halt and the beetle lands on the floor. It’s almost as though it’s being controlled by a remote, flying and dropping out of the air as if someone were pushing the “Start” and “Stop” buttons over and over again.</p>
<p>But wait – that beetle is being controlled by a remote. How? It’s a <a href="http://www.eecs.berkeley.edu/~maharbiz/Cyborg.html" target="_blank">cyborg</a>, of course – half insect, half machine. As the beetle’s creator <a href="http://www.eecs.berkeley.edu/Faculty/Homepages/maharbiz.html" target="_blank">Michel Maharbiz</a> (associate professor with the Department of Electrical Engineering and Computer Science at the University of California, Berkeley) explained to the audience of The Exchange’s latest event “The Science of Cyborgs,” the beetles (June bugs and other varieties) have been implanted with neural stimulating electrodes in their brains and muscles. Researchers fly the beetle via radio signals between the electrodes and a hacked Wii remote.</p>
<p>It’s an amazing technological advancement – but so are artificial retinas and robotic fish, the focus of two other presentations by <a href="http://www.usc.edu/programs/neuroscience/faculty/profile.php?fid=45" target="_blank">Mark Humayun</a> (Cornelius J. Pings Chair in Biomedical Sciences at the Doheny Eye Institute at the University of Southern California) and <a href="http://www.neuromech.northwestern.edu/" target="_blank">Malcolm MacIver</a> (associate professor of mechanical and biomedical engineering at Northwestern University), respectively. “The Science of Cyborgs,” held March 1 at the Directors Guild of America, also featured <a href="http://www.imdb.com/name/nm0609236/" target="_blank">Jonathan Mostow</a> (director of <em>Terminator 3: Rise of the Machines</em> and <em>Surrogates</em>) and moderators Chuck Bryant and Josh Clark from <a href="http://science.howstuffworks.com/stuff-you-should-know-podcast.htm" target="_blank">How Stuff Works’ Stuff You Should Know Podcast</a>.</p>
<p>Before the cyborg beetles, artificial retinas, and robotic fish, Bryant and Clark opened the evening with a look at transhumanism – the interface between humans and machines. What did they want the audience to know? “If you think this is in the future, if you think this is all something in the movies, then we’ve got some news for you, the future is now,” said Bryant.</p>
<p><img class="imgRight" src="/sites/default/files/images/Cyborg_7.jpg" border="0" alt="From left to right: Josh Clark, Jonathan Mostow, Mark Humayun, Michel Maharbiz, Malcolm MacIver, and Chuck Bryant" width="230" height="160" />In television and film, though, that future seems, well, dark. Mostow offered his two cents on why Hollywood gravitates toward a negative view of technology. “The idea that I think is behind [films with evil technology] is it’s servicing some kind of generalized anxiety we have about how we relate to technology. In the sense that we are now more connected to each other than ever before, and yet with all of this connectedness, there’s less human contact than there’s ever been before,” he said. “Why do we go to the dark place? I think we go to the dark place because these movies allow us to explore our anxieties.”</p>
<p>But for every film with evil cyborgs, there is a real-life example of technology being part of the solution, not the problem. Take, for instance, the U.S. Department of Energy’s <a href="http://artificialretina.energy.gov/" target="_blank">Artificial Retina Project</a>. The project (led by Mark Humayun, a member of both the National Academy of Sciences and the Institute of Medicine) aims to restore sight to the blind. “If you look at your five senses, the sensory input through your eye is the greatest. It’s about 200 megabytes through each one of your eyes. So, how are we going to restore sight to the blind?” Humayun said. Previously, patients were dissuaded from experimental operations to restore sight because the operations required implanting devices in the brain. “I started thinking, ‘If the optic nerve from the eye to the brain was still intact, could we jumpstart the blind eye?’ And that’s where this work began,” explained Humayun.</p>
<p> The artificial retina is a system of a retinal implant with a receiver and a camera mounted in eyeglasses. “[The camera] captures the image and then converts it – all in real time – and sends it to the implant in the eye. When this information is sent in, the receiver decodes it … and the information is then sent by these very delicate, Saran wrap–thin electrodes that ‘jumpstart’ the otherwise blind eye,” Humayun explained. The artificial retina does not automatically restore vision – it provides only 60 pixels of black and white visuals at the moment. However, most patients relearned visual cues in two months. “[Patients] tell us they could detect lights on the Christmas tree for the first time. For the first time they could see fireworks on Fourth of July,” said Humayun.</p>
<p><iframe style="display: block; margin-left: auto; margin-right: auto;" src="http://www.youtube.com/embed/uQf-1V2JmoE" width="465" height="349"></iframe></p>
<p>Another technology meant to benefit humanity is Malcolm MacIver’s robotic fish. A team of researchers (led by MacIver) developed the robot by studying the black ghost knifefish, which emits a weak electric signal to detect prey. Why study and then make a robot of the knifefish? MacIver showed the audience a video of the fish swimming by its prey. The knifefish automatically reversed its body and attacked the prey. “It’s really interesting to me to sort of highlight what it is that animals can do that we can’t really touch yet with machines. And really, what that’s about is we’re good as engineers at devising systems that can go very fast in a straight line,” he said. “What animals can do very well is moving with extraordinary agility around cluttered spaces. What exactly that requires is sucking in this vast amount of sensory data, doing a quick little bit of processing in a few milliseconds and spitting that out.”</p>
<p style="text-align: center;"><iframe style="display: block; margin-left: auto; margin-right: auto;" src="http://player.vimeo.com/video/19073262" frameborder="0" width="465" height="465"></iframe><a href="http://vimeo.com/19073262" style="color: #1744b7; text-decoration: none;">Black Ghost Knifefish Swimming</a> from <a href="http://vimeo.com/user5810940" style="color: #1744b7; text-decoration: none;">Malcolm MacIver</a> on <a href="http://vimeo.com" style="color: #1744b7; text-decoration: none;">Vimeo</a>.</p>
<p>The robotic fish, dubbed GhostBot, mimics the knifefish with 32 independently controlled motors, 32 very thin rods with a lycra fin, and an electrosensory system. Built from computer simulations of the knifefish, GhostBot can move like its animal counterpart: forward, backward, vertically, instantly. What’s the immediate application of the GhostBot’s technology? “It’s a system that has exceptional agility. One of the problems we saw with contemporary underwater vehicles … was that current underwater vehicles are about as maneuverable as a submerged bathtub,” explained MacIver. “What we’d like to do is develop this alternative approach which gives you very high agility and some autonomy to that system so that it senses when it’s about to collide. We’d like to give the people who maintain these submerged structures a technology for doing up-close work and avoiding collisions.”</p>
<p style="text-align: center;"><iframe style="display: block; margin-left: auto; margin-right: auto;" src="http://player.vimeo.com/video/19073651" frameborder="0" width="465" height="349"></iframe><a href="http://vimeo.com/19073651" style="color: #1744b7; text-decoration: none;">GhostBot Swimming</a> from <a href="http://vimeo.com/user5810940" style="color: #1744b7; text-decoration: none;">Malcolm MacIver</a> on <a href="http://vimeo.com" style="color: #1744b7; text-decoration: none;">Vimeo</a>.</p>
<p>Beetles hardwired for remote-controlled flying, restoring sight to the blind, and studying fish to create better underwater vehicles; there’s never a dull moment at an Exchange event. The audience members flocked to the speakers at the end of the event, wanting just a little more of the amazing ideas presented. As audience member Kath Lingenfelter (writer/producer on <em>House</em>) remarked, “Whenever I receive an invitation from The Exchange, I always say yes. Every event, be it screening, tour, or lecture, is an adventure. I walk away energized and full of ideas. Some people thrill at the prospect of getting Spielberg on the phone, but that's nothing compared to being able to get Neil deGrasse Tyson or Sean Carroll.”</p>
<p> </p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/event-recap" typeof="skos:Concept" property="rdfs:label skos:prefLabel">event recap</a></li><li class="taxonomy-term-reference-1"><a href="/blog-tags/cyborgs" typeof="skos:Concept" property="rdfs:label skos:prefLabel">cyborgs</a></li><li class="taxonomy-term-reference-2"><a href="/blog-tags/transhumanism" typeof="skos:Concept" property="rdfs:label skos:prefLabel">transhumanism</a></li><li class="taxonomy-term-reference-3"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li></ul></div>
Fri, 04 Mar 2011 23:31:40 +0000 · rummykhan · http://www.scienceandentertainmentexchange.org/blog/science-cyborgs#comments

This Is Your Brain On Lies · http://www.scienceandentertainmentexchange.org/blog/your-brain-lies
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>What would the world be like if nobody could lie -- not even a harmless little white lie? It would probably be like the world envisioned by British comic actor Ricky Gervais in <em>The Invention of Lying</em>, where brutal honesty is the order of the day, until Gervais' hapless character suddenly develops the ability to lie, or in his words, "I said something... that <em>wasn't</em>!" We are treated to an image of neurons in his brain firing in new ways at that pivotal evolutionary moment.</p>
<p>That image seems appropriate, given that scientists have been looking into using functional magnetic resonance imaging (fMRI) to achieve a kind of "<a href="http://www.physorg.com/news163156190.html" target="_blank">brain fingerprinting</a>" as a means of lie detection. In fMRI, when certain parts of the brain are engaged during a specific cognitive activity, those areas light up in the brain scan -- and if a person happens to be "dissembling," it should be possible to tell that they are lying just by looking at the scan.</p>
<p><span class="Apple-style-span" style="font-family: 'courier new', courier, sans-serif; font-size: 15px; color: #000000; line-height: normal;"><a href="http://1.bp.blogspot.com/_AFetE4zuDe4/SsvxM0Ig1QI/AAAAAAAAAHo/hP9RD08CYrE/s1600-h/FF_143_lying2_f.jpg" style="text-decoration: underline; color: #000000;"><img src="http://1.bp.blogspot.com/_AFetE4zuDe4/SsvxM0Ig1QI/AAAAAAAAAHo/hP9RD08CYrE/s320/FF_143_lying2_f.jpg" border="0" alt="" style="border-width: initial; border-color: initial; float: right; margin-top: 0px; margin-right: 0px; margin-bottom: 10px; margin-left: 10px; cursor: pointer; width: 300px; height: 300px; border-style: none;" /></a></span>Traditional polygraph tests, which measure physical responses such as respiration, heart rate, pulse, and electrical skin conductance to determine if a subject is lying, are notoriously unreliable, and scientists are still looking for the equivalent of Wonder Woman's magical lasso of truth. Brain fingerprinting seems to offer something closer to an objective analysis of whether or not someone is lying. How can a brain scan lie, after all?</p>
<p>Well, maybe the scan doesn't lie, but how we interpret those images is prone to human error, particularly since we don't fully understand how this complicated organ called the brain actually works. Chief among the naysayers of this new "mind reading" technology is Melissa Littlefield, who argues that the technique is based on fundamentally wrong assumptions, most notably that "truth" is the baseline, the natural state of being, and that lying is adding "a story on top of the truth." That might be true in Gervais-Land, but the real world is far more complicated.</p>
<p>An fMRI scan might reveal a lie if the person knew he or she was lying -- if it were a conscious decision. But "some people don't actually know that they're lying, or have told a lie for so long that it becomes their subjective interpretation of reality," Littlefield explains. And just as with the polygraph test, it's possible to cheat and <a href="http://io9.com/5277492/how-to-beat-your-futuristic-lie-detector" target="_blank">beat the machine</a>: just clench your teeth or move your head slightly. An fMRI scan requires the subject to hold perfectly still to get a usable image.</p>
<p>There are defenders of the technique's potential for lie detection as well. The <a href="http://io9.com/5313825/your-brain-will-eventually-be-used-against-you" target="_blank">most recent fMRI </a>work on truthfulness comes to us via Joshua Greene of Harvard University, who <a href="http://www.harvardscience.harvard.edu/culture-society/articles/neuroimaging-suggests-truthfulness-requires-no-act-will-honest-people" target="_blank">published his results</a> recently in the <em>Proceedings of the National Academy of Sciences</em>. He found that honest subjects showed almost no additional brain activity when telling the truth, as might be expected -- you're not inventing a lie, after all. But dishonest subjects <em>did</em> show extra brain activity... even when they were telling the truth. Greene's conclusion: "Being honest is not so much a matter of exercising willpower as it is being disposed to behave honestly in a more effortless kind of way."</p>
</div></div></div><div class="field field-name-field-taxonomy field-type-taxonomy-term-reference field-label-above clearfix"><h3 class="field-label">Tags: </h3><ul class="links"><li class="taxonomy-term-reference-0"><a href="/blog-tags/science" typeof="skos:Concept" property="rdfs:label skos:prefLabel">science of</a></li><li class="taxonomy-term-reference-1"><a href="/blog-tags/lying" typeof="skos:Concept" property="rdfs:label skos:prefLabel">lying</a></li><li class="taxonomy-term-reference-2"><a href="/blog-tags/neuroscience" typeof="skos:Concept" property="rdfs:label skos:prefLabel">neuroscience</a></li></ul></div>Tue, 06 Oct 2009 05:00:00 +0000rummykhan133 at http://www.scienceandentertainmentexchange.orghttp://www.scienceandentertainmentexchange.org/blog/your-brain-lies#comments