IBM Reports Nanotube Chip Breakthrough New York Times (10/29/12) John Markoff

IBM scientists report success in patterning an array of carbon nanotubes on a silicon wafer and using them to build hybrid chips with more than 10,000 functioning transistors. The scientists say the breakthrough promises to sustain the shrinking of basic digital switches for more than 10 years. Stanford professor Subhasish Mitra says the advance also would likely boost the speed at which transistors can be switched on and off. "These devices outperformed any other switches made from any other material," notes IBM scientist Supratik Guha. "We had suspected this all along, and our device physicists ... showed that we would see a factor of five or more performance improvement over conventional silicon devices." IBM researchers think refining the use of carbon nanotubes will lead to sharp gains in both chip speed and transistor density. Guha says carbon nanotubes have more promising performance traits than other materials engineers are investigating to help maintain Moore's Law, such as graphene. He notes that perfecting the process calls for a much purer form of the carbon nanotube material in order to eliminate metallic characteristics.

Election watchdog groups are monitoring the use of paperless electronic voting machines, or direct recording electronic systems (DREs), throughout the U.S., as 16 states will employ systems that do not provide a paper trail. The remaining states will use a combination of paper ballots and paperless DREs, and Verified Voting president Pamela Smith says such systems should at the very least support paper trails, given that there is no easy way to verify that the systems are functioning properly. "You can't do a post-election vote tabulation audit in such cases because there is no independent record of the votes," Smith says. "You are checking the system against itself." University of Utah professor Thad Hall says DREs offer little in terms of auditability. "Sure they are auditable," he notes. "The problem is that people are not going to believe the audit record," because it is not independent of the system. However, Internet Voting Research and Education Fund CEO William Kelleher says worries about paperless DREs are exaggerated, although he acknowledges the systems may make rare counting errors if they need recalibration. "But humans make far more counting errors than the computers in the DREs," he argues.

European scientists have integrated robotics, video, and various sensor and display technologies to transport someone into a geographically distant meeting room under the auspices of the Beaming through augmented media for natural networked gatherings (Beaming) project. The European Union-funded effort utilizes immersive virtual reality technologies in which a robotic avatar functions as the meeting participant's eyes, ears, and mouth. The participant wears a head-mounted display and is connected to sensors, enabling them to receive the avatar's video and audio feeds in three dimensions. The two-way connection also enables the participant's movements and responses to be mimicked by the robot. A key challenge for the Beaming collaborators has been development of the system's framework data architecture, which defines how all the visual, audio, motion, and pressure data is packaged and relayed between the participant and their remote environment. It also sets up how the three-dimensional model of the remote location is produced, giving the participant a strong sense of presence. "The purpose of the framework is to make Beaming entirely independent of the hardware or software involved," says project participant Stephen Dunne. "You'll be able to use any robot or any sensor, for example."

Carnegie Mellon University researchers Alessandro Oltramari and Christian Lebiere have developed computerized surveillance software that could replace humans monitoring camera feeds for signs of suspicious behavior and predict what people will do. The researchers presented a paper at the recent Semantic Technology for Intelligence, Defense, and Security conference saying the system's goal is "to approximate human visual intelligence in making effective and consistent detections." The method relies on innovations in machine vision, and Oltramari and Lebiere have built on the work of other Carnegie Mellon scientists to produce a cognitive engine capable of understanding the rules by which objects/people and their actions are allowed to interact. The cognitive engine embeds activity forecasting research that attempts to comprehend what humans will do by calculating the most probable physical trajectories. The software "models the effect of the physical environment on the choice of human actions," the researchers say. Both efforts are elements of Carnegie Mellon's Mind's Eye architecture for developing smart cameras for machine-based visual intelligence. "This work should support human operators and automatize video surveillance, both in military and civil applications," Oltramari says.
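The activity-forecasting idea — ranking candidate paths by how the physical environment shapes them — can be illustrated with a toy grid model. This is an illustrative sketch only, not the Mind's Eye implementation; the grid, costs, and softmax scoring are invented for the example:

```python
# Toy activity forecasting: rank candidate paths on a grid where
# obstacles make a path less probable. Illustrative only.
import math

OBSTACLE_COST = 5.0   # stepping next to an obstacle is penalized
STEP_COST = 1.0       # every move has a base cost

def path_cost(path, obstacles):
    """Sum step costs; steps on or beside an obstacle cost extra."""
    cost = 0.0
    for cell in path[1:]:
        cost += STEP_COST
        if any(abs(cell[0] - o[0]) + abs(cell[1] - o[1]) <= 1 for o in obstacles):
            cost += OBSTACLE_COST
    return cost

def most_probable(paths, obstacles):
    """Softmax over negative cost: lower-cost paths are more probable."""
    weights = [math.exp(-path_cost(p, obstacles)) for p in paths]
    total = sum(weights)
    probs = [w / total for w in weights]
    best = max(range(len(paths)), key=lambda i: probs[i])
    return paths[best], probs[best]

obstacles = [(1, 1)]
direct = [(0, 0), (1, 0), (1, 1), (1, 2)]   # cuts through the obstacle
detour = [(0, 0), (0, 1), (0, 2), (1, 2)]   # goes around it
best, p = most_probable([direct, detour], obstacles)
```

Under this model the detour is predicted, because the environment makes the direct route costly — the same intuition as "modeling the effect of the physical environment on the choice of human actions."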

Countries leading in technological advancement have failed to open up their knowledge societies to women, according to a recent report from Women in Global Science & Technology (WIGSAT). The consulting group examined female scientific enrollment and employment and work-related policies in emerging economies such as Brazil, India, Indonesia, Korea, and South Africa, as well as the United States and the European Union. In these key economies, women make up less than one-third of the workforce in the computer science, engineering, and physics fields. "Numbers of women in the science, technology, and innovation fields are alarmingly low in the world's leading economies, and are actually on the decline in many, including the United States," the report says. Although women accounted for 28 percent of science academy members in South Africa in 2010, women represented less than 12 percent in most other countries. The study notes that countries lack adequate policies and social support for women, as well as sex-disaggregated data. Brazil ranked highest in supportive policies and employment of women. The countries are not "taking full advantage of women in the knowledge sector," says WIGSAT executive director Sophia Huyer.

Killing the Computer to Save It New York Times (10/30/12) John Markoff

SRI International computer scientist Peter G. Neumann subscribes to the philosophy that threats to computer security stem from the increasing complexity of hardware and software, which has made it virtually impossible to identify system defects and vulnerabilities and to ensure that systems are secure and trustworthy. He argues that computers and software must be redesigned from a "clean slate." Neumann, chair of ACM's Computers and Public Policy Committee and editor of the RISKS forum, leads a research team committed to such a redesign, and he says the only practical solution is to analyze the past 50 years' research, select the best ideas, and then build a new model from the bottom up. The Clean Slate effort funds research into designing computer systems that are less susceptible to intruders and more recoverable, with one area of focus being software that continuously shape-shifts to foil aspiring hackers. One design strategy Neumann's team is pursuing is tagged architecture, in which each piece of data in the system must carry encryption code credentials to guarantee the data's trustworthiness. The related capability architecture requires every software object in the system to carry special data describing its access rights on the computer, which is checked by a special element of the processor.
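The capability idea — every object carries its own access rights, and a checking element validates each operation against them — can be sketched in a few lines. This is a software analogy with invented names, not SRI's design; the real architectures perform such checks in hardware on every access:

```python
# Sketch of a capability check: each object is tagged with the rights
# it grants, and every operation is validated against that tag before
# it proceeds. Illustrative only; names and rules are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Capability:
    rights: frozenset  # e.g. frozenset({"read"}) or {"read", "write"}

@dataclass
class TaggedObject:
    value: object
    cap: Capability

class AccessViolation(Exception):
    pass

def checked_access(obj: TaggedObject, op: str):
    """The processor-check analogue: refuse any op not in the tag."""
    if op not in obj.cap.rights:
        raise AccessViolation(f"operation '{op}' denied by capability")
    return obj.value if op == "read" else None

# A read-only object: reads succeed, writes raise AccessViolation.
secret = TaggedObject("sensitive record", Capability(frozenset({"read"})))
value = checked_access(secret, "read")
```

The point of doing this in the processor rather than in application code is that no software path can bypass the check, which is why the article calls it a special element of the processor.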

Now the Mobile Phone Goes Emotional University of Helsinki (10/25/12)

A synchronous haptic communication system could provide mobile device users with another way of expressing greetings, presence, and emotions. According to a study from researchers at the Helsinki Institute of Information Technology (HIIT) and the University of Helsinki, haptic interpersonal communication could be incorporated into a standard mobile device. Participants in the study said pressure and tactile techniques could be used to emphasize speech, express affection and presence, and playfully surprise each other. The participants noted that they tended to pause briefly after sending a pressure message, or "pressage," to "make space for it in the conversation." Phone calls lasted four minutes and 43 seconds on average, and participants sent an average of 15.56 pressages during each call. HIIT and Nokia developed the prototype phone, ForcePhone, used in the study. ForcePhone is an augmented, commercially available mobile device with pressure input and vibrotactile output.

Fujifilm and IBM researchers have developed ultra-dense tape drive prototypes that can store 35 terabytes of data. The technology will be employed to store the massive amount of data expected to be generated by the Square Kilometer Array, the world's largest radio telescope. By the time the telescope comes online in 2024, the researchers expect to be able to store 100 terabytes on a cartridge of a similar size to their prototype, says IBM researcher Evangelos Eleftheriou. Data centers based on disk drive arrays use more than 200 times more power than would a tape library of similar size, according to a 2010 Clipper Group study. Although tapes are currently slower to access than hard disks, Eleftheriou says the Linear Tape File System under development expedites this process to make it comparable to disk drives. He says hard drive density improvements are confronted with physical constraints, which means they can only add more power-consuming platters. "It's time to take advantage of the low power and low cost of tape," Eleftheriou argues.

Microsoft researchers recently presented a paper that argues for allowing cloud tenants to buy resources based on a "job-centric" model, which would add another layer of abstraction to the cloud by having an interface that lets tenants specify performance and cost goals. The researchers say a job-centric interface would make life easier for cloud tenants by removing the task of translating their goals into the corresponding resource requirements, and add flexibility for cloud providers by enabling them to decide how many resources are needed for a specific job. The appeal of a job-centric interface for providers is that different combinations of resources can be used to complete the same job for a customer. To illustrate this idea, the researchers created Bazaar, a cloud framework for MapReduce jobs. Bazaar allows tenants to specify MapReduce job constraints and then determines the best resource combination from the provider for completing the job within those constraints. The researchers note a Bazaar-style model also could affect cloud-pricing models. "A job-centric cloud, coupled with job-based pricing, can thus enable a symbiotic tenant provider relationship where tenants benefit due to fixed costs upfront, and better-than-desired performance," the researchers say.
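The provider-side flexibility described above — any resource combination that meets the tenant's constraint is acceptable, so the provider picks the cheapest — can be sketched with a toy selector. The data size, throughput figures, and prices below are invented for illustration; Bazaar's actual job estimator is far more sophisticated:

```python
# Toy job-centric resource selection: given a tenant deadline, choose
# the cheapest (VM count, bandwidth) combination estimated to finish a
# MapReduce-style job in time. All numbers are assumed, not Bazaar's.

DATA_GB = 600.0  # hypothetical job input size

def est_hours(vms, bw_gbps):
    compute = DATA_GB / (vms * 50.0)       # assume 50 GB/hour per VM
    shuffle = DATA_GB / (bw_gbps * 450.0)  # assume 450 GB/hour per Gbps
    return compute + shuffle

def est_cost(vms, bw_gbps, hours):
    return hours * (vms * 0.10 + bw_gbps * 0.05)  # assumed $/hour rates

def cheapest_combo(deadline_hours, vm_options, bw_options):
    """Provider-side choice: among all combinations that meet the
    tenant's deadline, return (cost, vms, bw) for the cheapest."""
    feasible = []
    for vms in vm_options:
        for bw in bw_options:
            t = est_hours(vms, bw)
            if t <= deadline_hours:
                feasible.append((est_cost(vms, bw, t), vms, bw))
    return min(feasible) if feasible else None

choice = cheapest_combo(3.0, vm_options=[4, 8, 16], bw_options=[1, 2, 4])
```

The tenant only ever states the deadline; which combination actually runs the job is invisible to them, which is exactly the extra layer of abstraction the paper argues for.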

Virginia Tech researchers have developed a computer-based information system that can find useful information on vehicle defects from consumer-generated content on the Internet and social media. The content from public discussion forums, social networks, product reviews, visitor comments, wikis, or user-written news articles is characterized by variable quality, notes Virginia Tech professor Weiguo Fan. The researchers analyzed online discussion forums for owners of Honda, Toyota, and Chevrolet vehicles, and then developed and tested the decision support system. They believe the system provides an efficient way for auto manufacturers to discover and classify vehicle defects. "A lot of useful but hidden data on vehicle quality is embedded in social media that is largely untapped by auto manufacturers," says Virginia Tech professor Alan Abrahams. He also notes that "vehicle quality management professionals would greatly benefit in terms of productivity by employing a vehicle defect discovery system like ours to sift defects from unrelated posts."
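Sifting defect reports from unrelated posts can be illustrated with a minimal keyword scorer. The term list and threshold here are invented; the Virginia Tech system is a trained decision support system, not a fixed word list:

```python
# Minimal sketch of separating likely vehicle-defect posts from
# unrelated forum chatter by scoring defect-related terms.
# Illustrative only; real systems learn these signals from data.
import re

DEFECT_TERMS = {"stall", "stalls", "recall", "brake", "brakes",
                "failure", "defect", "leak", "warning", "grinding"}

def defect_score(post: str) -> int:
    """Count how many words of the post appear in the defect lexicon."""
    words = re.findall(r"[a-z']+", post.lower())
    return sum(1 for w in words if w in DEFECT_TERMS)

def likely_defect(post: str, threshold: int = 2) -> bool:
    return defect_score(post) >= threshold

posts = [
    "My brakes make a grinding noise and the dash shows a warning light.",
    "Anyone going to the owners' meetup next weekend?",
]
flags = [likely_defect(p) for p in posts]
```

A production system would replace the hand-picked lexicon with a classifier trained on labeled posts, but the pipeline shape — normalize text, score, threshold — is the same.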

The U.S. Office of Naval Research (ONR) is working with researchers from across the country to develop the next generation of robots that can operate underwater. For example, researchers from Harvard University, the University of Georgia, and Drexel University are studying how fish behave to develop underwater robots. "There are great things we can learn from fish," says Drexel researcher James Tangorra. "The way they propel themselves, the way in which they sense water." Robots modeled after water species could be more efficient and harder to detect than conventional robots, and could move through dangerous waters without putting people in harm's way, according to the researchers. ONR also recently opened a 50,000-square-foot robotics laboratory, which has produced a prototype Razor robot that uses flippers for stealth. University of Virginia researchers are studying manta rays, while University of New Orleans researchers are studying eel robots that utilize hydrodynamics and could serve as surveillance tools. Meanwhile, Massachusetts Institute of Technology researchers have a robot pike, a robot sea turtle, and two generations of Charlie the Robotuna, and Michigan State University researchers are developing a robotic school of fish.

Michigan State University (MSU) professor James Fairweather will serve as co-principal investigator of the Association of American Universities' (AAU's) national science, technology, engineering, and mathematics (STEM) initiative. Fairweather is the director of MSU's Center for Higher and Adult Education and a leading scholar in the study of faculty work and reforming undergraduate STEM education. AAU is working to improve undergraduate teaching and learning in the STEM fields. The association of 59 U.S. and two Canadian research institutions is developing a framework for effective STEM teaching and learning, and will set up demonstration sites to implement and test the framework over the next three years. "For AAU institutions--the top research universities in the United States and Canada--the dilemma centers on how to promote research and scholarship while at the same time better preparing undergraduate students in STEM," Fairweather says. "I look forward to working with AAU and its member institutions to find ways to improve undergraduate teaching and learning in STEM with the goal of spreading successful innovations to a broad array of colleges and universities."

Efforts to boost graduates from science, technology, engineering, and math (STEM) programs may be impeded by the continued reliance on lecturing. Fewer than 40 percent of students entering college intending to be a STEM major complete a degree in a STEM field, according to the President's Council of Advisers on Science and Technology. However, an institutional survey in 1997 revealed poor teaching to be the main concern among both STEM graduates and those who had left those majors, and STEM teaching at that time consisted of lecturing. A more recent UCLA poll found that STEM faculty tend to grade on a curve at twice the rate of non-STEM faculty. University of British Columbia professor Carl E. Wieman says grading on a curve signals STEM educators' unfamiliarity with educationally effective practices. He also says faculty are seldom trained in creating valid learning measures, and they lack feedback on the quality of their examinations. Wieman emphasizes that faculty should design practice tasks that suit students' levels of comprehension but are still rigorous enough to require intense intellectual effort. Work assigned inside and outside class should link patterns of expert thinking in the field to students' already acquired knowledge, and motivate them appropriately.