Control of Cybersecurity Becomes Divisive Issue New York Times (04/17/09) P. A16; Risen, James; Lichtblau, Eric

The U.S. National Security Agency (NSA) is lobbying to head the government's cybersecurity programs, and some officials are concerned that such a maneuver would grant the spy agency too much sway over government computer networks. Rod Beckstrom, who resigned in March as director of the Homeland Security Department's National Cyber Security Center, said in an interview that if the NSA gained that much power, it would have the authority to collect and analyze every email message, text message, and Google search performed by every worker in every federal agency. "Power over information is so important, and it is so difficult to monitor, that we need to have checks and balances," he stressed. Beckstrom argued that an intelligence agency designed to contend with outside threats should not be given so much influence over information traffic within the U.S. government. Detecting threats against the computer infrastructure requires cybersecurity guardians to have virtually unlimited access to networks, which is why Beckstrom argues for dividing those responsibilities among agencies. National intelligence director Dennis C. Blair recently told Congress that the NSA should oversee federal cybersecurity, claiming that the agency possesses the computer "wizards" with the necessary skills. "The NSA's expertise, which is impressive and very, very deep, is focused primarily on the needs of the military and the intelligence community," said University of Pennsylvania computer security expert Matt Blaze. "Their track record in dealing with civilian communications security is mixed at best."

A team at the Tokyo Institute of Technology is trying to enable humans and robots to communicate nonverbally. Yoichi Yamazaki and colleagues have developed an "eye robot" that can convey a wide range of nonverbal signals. For the robot to choose an appropriate expression, a computer must know what message each expression conveys, so the researchers had the synthetic-eyeball system produce expressions at random and asked viewers to evaluate each one. Using questionnaires, they mapped the expressions into a "mentality space." As users talk to the eye robot, it uses a speech recognition program to evaluate the conversation and then selects the appropriate eye expression from the mentality space. "The proposed system provides a user-friendly interface so that humans and robots communicate in natural fashion," says the team. Still, developing an eye robot that can hold its own in a nonverbal conversation will be more of a challenge than making the synthetic eyeballs look happy or sad.
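The selection step described above can be sketched as a nearest-neighbor lookup in the mentality space. This is a hypothetical illustration, not the team's actual system: the two axes (pleasure and arousal) and the viewer-rated coordinates below are invented for the example.

```python
# Hypothetical mentality-space lookup: map a mood estimate from speech
# recognition to the nearest pre-rated eye expression. The axes and the
# coordinates (from imagined viewer questionnaires) are illustrative only.
MENTALITY_SPACE = {
    "happy":     (0.8,  0.6),
    "sad":       (-0.7, -0.4),
    "surprised": (0.2,  0.9),
    "calm":      (0.3, -0.6),
    "angry":     (-0.6,  0.7),
}

def select_expression(pleasure: float, arousal: float) -> str:
    """Return the expression whose rated coordinates lie nearest
    to the estimated mood of the current utterance."""
    return min(
        MENTALITY_SPACE,
        key=lambda name: (MENTALITY_SPACE[name][0] - pleasure) ** 2
                       + (MENTALITY_SPACE[name][1] - arousal) ** 2,
    )

print(select_expression(0.7, 0.5))    # a cheerful utterance -> "happy"
print(select_expression(-0.5, -0.3))  # a gloomy one -> "sad"
```

The key point is that the hard perceptual work (rating what each expression conveys) is done once by human viewers; at run time the robot only needs a cheap distance comparison.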

Researchers working on the European Union-funded Psychologically Augmented Social Interaction Over Networks (PASION) project have tested several approaches for bringing non-verbal communication and context information into information and communication technology. One research partner examined cues for proximity and localization, another explored social feedback, and a third examined physiology and emotion among online poker players. Other efforts have explored different ways of assessing mood or context, including investigating facial expression analysis and physiological sensors that provide real-time feedback on a user's physiology. The Helsinki School of Economics developed tools to augment voice, text, and instant messaging with non-verbal communication for use by knowledge workers. Meanwhile, the University of Lincoln developed Familiars, an online social game that incorporates facial expression analysis, psycho-physiological data, and social indicators based on user interactions. The technology developed by PASION researchers could be used in a variety of applications that the researchers are still exploring. "How this will be used we can't say just yet, but we can say there are many applications, and while older people are perhaps a little bit hesitant about revealing their state of mind, young people and people who use social networking sites are very keen," says PASION's Richard Walker.

Biological Sensors Are the Future of Personalized Treatment Universidad Politecnica de Madrid (04/15/09)

Madrid recently hosted the International Symposium on Research in Grid/Nano/Bio/Medical Informatics (Bioinforsalud 2009). Organized by ACTION-Grid, Bioinforsalud 2009 gave 20 scientists from around the world an opportunity to discuss their views on nanotechnology and the personalization of medicine. Peter Ghazal, chair of Edinburgh University's Department of Molecular Genetics and Biomedicine, said biological sensors could be used to detect infections and to devise personalized treatments. Before patients could be treated with a given drug, each would need a personal profile. Martin Fritts of the U.S. National Cancer Institute's Nanotechnology Characterization Lab said he is working to accelerate the use of nanotechnology concepts to treat cancer in clinical research. Informatics is important for knowledge discovery and transfer in clinical research, he said. Fritts added that researchers would need to understand how nanoparticles interact with their environment at the molecular level for nanomedicine-based treatments and diagnoses to succeed. ACTION-Grid is funded by the European Commission and encourages cooperation on biomedical informatics, grid technologies, and nanotechnology among scientists in Latin America, the Balkans, and North Africa.

Massachusetts Institute of Technology (MIT) researchers have developed a new method for etching extremely fine lines onto a microchip, a breakthrough that could lead to new processes for microchip manufacturing and nano-scale technologies. The method uses a photochromic material that can be switched from transparent to opaque and back by exposing it to certain light wavelengths. Although the material is not new, the researchers discovered a new way of using it to create a mask with exceptionally narrow lines of transparency, which can be used to create a correspondingly narrow line on an underlying material. The key is using interference patterns, in which different wavelengths of light can either reinforce each other or negate each other. The researchers exposed the photochromic material to a pair of interference patterns, each of a different wavelength, simultaneously. Where the bright lines of one wavelength met the dark lines of the other, extremely narrow transparent lines were created in the otherwise opaque material. The lines served as a mask through which the first wavelength illuminated the material underneath. The new technique, called absorbance modulation, makes it possible to create lines about one-tenth as wide as the wavelength used to create them. MIT researcher Rajesh Menon says the discovery could have a major impact on chipmaking, and could lead to new work in a variety of fields that rely on nano-scale patterning, including nanophotonics, nanofluidics, nanoelectronics, and nano-biological systems.

Harnessing Cloud Computing for Data-Intensive Research on Oceans, Galaxies University of Washington News and Information (04/14/09) Hickey, Hannah

Universities, private companies, and government agencies are working together to bring scientific research into the world of cloud computing, where massive clusters of computers connected through the Internet handle large computing challenges. The University of Washington (UW) has received three recent awards from the National Science Foundation (NSF) for its cloud computing efforts. Two of the grants will fund projects studying ocean climate simulations and analyzing astronomical images; these projects will provide tools that allow researchers to use cloud computing to interact easily with the massive datasets that are becoming increasingly common in scientific research. The third grant will provide curriculum and training to teach cloud computing. The programs are funded through the NSF's Cluster Exploratory program, which provides access, for educational purposes, to a cloud datacenter launched by Google, IBM, and six academic institutions, including UW. Climate modelers are starting to use computer simulations in new, exploratory ways, according to UW eScience Institute researcher Bill Howe. Instead of running a simulation to test an individual hypothesis, climate scientists are running long-term simulations and then sorting through the massive amounts of data those simulations generate to find trends. Howe has developed a tool, GridFields, to visualize the polygonal mesh of climate simulation output, and is working to adapt GridFields to a cloud computing environment.
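The exploratory style Howe describes amounts to scanning long output series for trends rather than testing a single hypothesis. A minimal sketch, with an invented dataset (this is not GridFields or the actual climate pipeline):

```python
# Minimal sketch of trend-mining over simulation output: compute a
# least-squares slope for a long time series. The "warming" series
# below is synthetic, invented purely for illustration.

def trend(series):
    """Least-squares slope of a series sampled at unit time steps."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# A synthetic series: 0.01 degrees per step on top of a constant offset.
warming = [15.0 + 0.01 * t for t in range(1000)]
print(round(trend(warming), 4))  # recovers the 0.01-per-step trend
```

In a cloud setting, the same slope computation would be mapped over thousands of grid cells in parallel, which is exactly the kind of embarrassingly parallel scan that cluster environments handle well.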

U.Va.'s Institute on Aging Teams University Researchers With Health Care Technology Company to Develop Novel Sensor Devices for the Elderly University of Virginia (04/13/09) Richards, Zak

University of Virginia researchers are developing wireless body sensor networks that could be used to monitor a person's gait, potentially helping prevent falls among older adults. The researchers are developing wearable sensors for residents of long-term care facilities and are running "living laboratories" to test and improve the technology. Falls are the leading cause of injury death among people older than 65, according to the Centers for Disease Control and Prevention, and they also are the major cause of non-fatal injury and hospitalizations for trauma. The wireless body sensors will help identify problems in a person's gait that could cause a fall. By wirelessly connecting a network of sensors, the researchers will be able to receive real-time data on nursing home residents' gaits. Virginia professor John Lach has developed sensors that can quantitatively measure the walking patterns that will likely lead to a fall. This information will be used to create a system that will enable geriatricians to accurately assess gait problems and provide interventions before a fall occurs. Lach says his research could lead to a new standard of care that is more cost effective and can be applied to a variety of real-world settings. The same technology could potentially be adapted and used by the military to monitor and analyze soldiers' movements in combat.
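One common way to quantify gait from a wearable accelerometer is stride-time variability. The sketch below is a hedged illustration, not U.Va.'s actual algorithm: the threshold-crossing heel-strike detector and the coefficient-of-variation cutoff are assumptions chosen for clarity.

```python
# Hedged sketch (thresholds and the variability metric are assumptions):
# estimate stride times from upward threshold crossings of a vertical-
# acceleration trace, then flag high stride-to-stride variability, a
# commonly used gait-instability indicator.

def stride_times(accel, threshold=1.5, dt=0.01):
    """Return time gaps between upward threshold crossings (heel strikes)."""
    strikes = [
        i * dt
        for i in range(1, len(accel))
        if accel[i - 1] < threshold <= accel[i]
    ]
    return [b - a for a, b in zip(strikes, strikes[1:])]

def unstable_gait(accel, max_cv=0.05):
    """Flag gait if the coefficient of variation of stride time exceeds max_cv."""
    times = stride_times(accel)
    if len(times) < 2:
        return False
    mean = sum(times) / len(times)
    var = sum((t - mean) ** 2 for t in times) / len(times)
    return (var ** 0.5) / mean > max_cv

# Synthetic traces: regular heel strikes every 1.0 s vs. irregular spacing.
regular = [2.0 if i % 100 == 0 and i > 0 else 0.0 for i in range(500)]
irregular = [0.0] * 500
for i in (100, 190, 320, 400):
    irregular[i] = 2.0
print(unstable_gait(regular), unstable_gait(irregular))  # stable vs. unstable
```

A real system would stream these computations over the wireless sensor network in real time, so a clinician sees the variability score rather than the raw acceleration trace.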

The performance of closed-circuit television (CCTV) operators could be improved by analyzing their gaze, according to researchers in Turkey. Ulas Vural and Yusuf Akgul of the Gebze Institute of Technology have developed a gaze-tracking camera system to watch the eyeballs of CCTV operators as they work. The gaze-tracking system would train a Webcam-style camera on the irises of people who watch CCTV images in the control room. CCTV operators could miss criminal or antisocial activity because they have so many screens to monitor simultaneously. After the system uses an algorithm to analyze where CCTV operators are looking, it uses software to create a video of sequences missed during the shift. "This increases the reliability of the surveillance system by giving a second chance to the operator," the researchers write in the journal Pattern Recognition Letters. The gaze-tracking camera system runs on a standard PC and processes the images in real time, making summary frames ready to browse, similar to a fast-motion flip book.
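The core bookkeeping of such a system can be sketched simply: compare where the operator was looking against where activity occurred, and keep the mismatches as candidates for the end-of-shift summary. The data model below (per-step gaze index and per-step activity sets) is an assumption, not the representation used in the Pattern Recognition Letters paper.

```python
# Illustrative sketch (the data model is assumed): given the screen the
# operator watched at each time step and the screens where activity was
# detected, collect the (step, screen) events the operator missed -- the
# raw material for a summary video of overlooked sequences.

def missed_events(gaze, activity):
    """gaze[t] = screen index watched at step t;
    activity[t] = set of screens with detected activity at step t."""
    return [
        (t, screen)
        for t, screens in enumerate(activity)
        for screen in sorted(screens)
        if screen != gaze[t]
    ]

gaze = [0, 0, 1, 2]
activity = [set(), {0, 3}, {1}, {0}]
print(missed_events(gaze, activity))  # [(1, 3), (3, 0)]
```

The hard parts in practice are the two inputs: estimating gaze from the iris-tracking camera and detecting activity in each feed; once both exist, the summary itself is a cheap set difference per time step.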

Sixth Sense is a wearable computer designed by Massachusetts Institute of Technology postdoctoral student Pranav Mistry to free Web access from the constraints of the mouse-keyboard-screen template via a combination of a Web camera, battery, miniature projector, and Internet-enabled cell phone. The camera focuses on objects whose specific details can be identified through an Internet search, and the retrieved information is displayed on a surface by the projector. Mistry's system uses gestural controls in place of a mouse, and he has programmed the system to display grocery store inventories, book reviews from Amazon.com, and video clips about current events. "[Sixth Sense's] current representation is a pretty fun parlor trick that has the roots of being a transformative capability down the road," says consultant Jonas Lamis, who predicts that the system's projector will eventually be replaced by contact lenses that overlay data directly onto the wearer's field of view. Lamis also foresees the emergence of advertisements calibrated to consumers' precise location and interests, seamless access to virtual conversations, and perhaps the display of information about people we encounter through facial recognition capabilities. He cautions that such breakthroughs will depend on improvements to related technologies as well as the advent of a sound business model. There also are concerns that innovations such as Sixth Sense could create an environment characterized by relentlessly ubiquitous information, giving rise to distraction and anxiety.

High Schoolers Learn IT by Defending Networks, Fighting Robots, Designing Games Iowa State University News Service (04/13/09) Krapfl, Mike

At the IT-Olympics at Iowa State University, more than 60 teams of high school students will compete in a variety of events, including defending computer networks from attacks, programming LEGO robots to win sumo-style matches, and demonstrating educational computer games they have developed. Iowa State University professor Doug Jacobson has been running cyberdefense competitions for years using the Internet-Scale Event and Attack Generation Environment, a virtual Internet developed by Jacobson that can be used to study and teach computer science. Jacobson says the 37 teams competing in the security event will make the IT-Olympics the world's largest cyberdefense competition. He says the competition excels at teaching technical skills, demonstrates to students that many other people their age are interested in computers, and shows them that there is a strong job market for information technology. "The goal of the IT-Olympics is really to celebrate information technology," says Jacobson. "We're trying to make this fun, with a lot of education, too. We want to get students excited. And we want to get more students interested in information technology."

The self-healing nature of biological systems has influenced the integrated circuits research of California Institute of Technology professor Ali Hajimiri. He plans to develop circuits that will be capable of detecting, isolating, and fixing their own problems. The self-healing circuits would be able to change the properties of the system when a transistor fails and seamlessly add more transistors. The Defense Advanced Research Projects Agency has awarded Hajimiri a four-year, $6 million grant to develop self-healing circuits for millimeter and microwave frequencies, which would be used in imaging, sensing, communications, and radar. The Self-HEALing mixed-signal Integrated Circuits program also aims to help sustain Moore's Law scaling. "As transistors approach atomic dimensions and run at very high frequencies, even very fine-scale variations within seemingly identical transistors can make a large difference in performance," leading to unpredictable behavior, Hajimiri says. "The way we see it, in a few years self-healing circuits will allow faster, cheaper, and more robust circuits, making it possible to continue Moore's scaling law by making integrated circuits resemble living organisms in their ability to self-heal and adjust to changes in the environment."

The UW's Yoky Matsuoka Is Leading the Quest for Robotics That Take Orders From the Brain Seattle Times (04/05/09) Seven, Richard

University of Washington (UW) researcher and MacArthur Fellowship recipient Yoky Matsuoka specializes in the field of neurobotics, which focuses on the development of robotic devices driven by brain signals. Such devices are seen as a way to help disabled people live fuller lives by restoring their motor skills via sophisticated prostheses. UW professor Ed Lazowska calls Matsuoka a pioneer in the integration of multiple disciplines, including neuroscience, computer science, mechanical engineering, and biomedical engineering. One of her projects is the Anatomically Correct Testbed robotic hand, an artificial limb designed to closely mimic human anatomy by using a maze of wires and miniature motors as muscles and an elastic string webbing as tendons. Matsuoka is convinced that emulating human anatomy is the best way to localize and plot out the effects of neural commands, and she says the insights gained from her research can help enhance the function of currently available prosthetic hands and ultimately facilitate "seamless integration" with nervous signals. "Our prostheses are primitive, so feeding them a lot of control signals is fruitless because they are not fully functional," Lazowska says. "But our ability to 'tap in' [to neural signals] is very limited, too, so even if we had a fully functional prosthesis, we wouldn't be able to get it the signals necessary to control it. You need to attack all aspects of the problem."

Bridging the Gap Between Wireless Sensor Networks and the Scientists Who Use Them University of Michigan News Service (04/06/09) Moore, Nicole Casal

Researchers at the University of Michigan and Northwestern University have developed a programming language for wireless sensor networks that will enable non-computer experts to program them for research. "Most existing programming languages for wireless sensor networks are a nightmare for nonprogrammers," says Michigan professor Robert Dick. "We're working on ways to allow the scientists who actually use the devices to program them reliably without having to hire an embedded systems programming expert." To create their simplified programming language, the researchers examined the variables that a scientist using a sensor network may want to monitor, and the areas in which a scientist would need flexibility. They identified 19 "application-level properties," which they grouped into seven categories, or archetypes, each focused on a specific type of monitoring that different researchers may use. The Wireless sensor network Archetype-Specific Programming Language (WASP) has already been developed, and languages for the other archetypes are underway. WASP lets scientists tell the system what they want done, instead of how to do it. "Scientists enter the requirements and our system sorts out the implementation details for them automatically," Dick says.
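The "what, not how" idea can be illustrated with a toy declarative spec. To be clear, this is not WASP syntax, which the article does not show; the mini-language and the task representation below are invented to show how requirements can be compiled into the periodic tasks an embedded programmer would otherwise write by hand.

```python
# Hypothetical mini-spec (NOT actual WASP syntax): a scientist states
# what to sample and when to alert; a compiler-like step expands the
# statements into concrete tasks (periodic reads in seconds, alarms).
SPEC = """
sample temperature every 10 min
sample humidity every 30 min
report when temperature > 35
"""

def compile_spec(spec):
    """Translate declarative statements into (task, ...) tuples."""
    tasks = []
    for line in spec.strip().splitlines():
        words = line.split()
        if words[0] == "sample":
            # "sample <var> every <N> min" -> periodic read every N*60 s
            tasks.append(("periodic_read", words[1], int(words[3]) * 60))
        elif words[0] == "report":
            # "report when <var> <op> <value>" -> threshold alarm
            tasks.append(("alarm", words[2], words[3], float(words[4])))
    return tasks

print(compile_spec(SPEC))
```

The scientist never sees duty cycles, radio scheduling, or interrupt handlers; those implementation details live entirely in the compilation step, which is the division of labor Dick describes.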