Conquering the Chaos in Modern, Multiprocessor Computers University of Washington News and Information (03/10/10) Hickey, Hannah

University of Washington (UW) researchers have devised a method for coaxing predictable behavior out of modern, multiprocessor computers by automatically parceling sets of commands and assigning them to specific processors. The program runs faster than it would on a single processor because the command sets are calculated concurrently. "We've developed a basic technique that could be used in a range of systems, from cell phones to data centers," says UW professor Luis Ceze. "Ultimately, I want to make it really easy for people to design high-performing, low-energy and secure systems." One application of the system is aiding the proper testing of programs by making errors reproducible. A software-based version, which could be used on existing machines, was presented this week by UW graduate student Tom Bergan at ACM's International Conference on Architectural Support for Programming Languages and Operating Systems.
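The core idea of making a parallel run reproducible can be illustrated with a toy sketch (this is an assumption-laden illustration, not the UW system itself): each thread's work is divided into fixed-size quanta that are committed in a fixed round-robin order, so every run produces the same interleaving and the same errors.

```python
# Hypothetical sketch: deterministic parallel execution by committing each
# thread's operations in fixed-size quanta, in a fixed thread order. The
# function names and quantum size are illustrative inventions.

def deterministic_run(threads, quantum=2):
    """threads: list of lists of operations (callables taking shared state)."""
    state = {"log": []}
    cursors = [0] * len(threads)
    while any(c < len(t) for c, t in zip(cursors, threads)):
        for tid, ops in enumerate(threads):      # fixed thread order
            for _ in range(quantum):             # fixed quantum size
                if cursors[tid] < len(ops):
                    ops[cursors[tid]](state)
                    cursors[tid] += 1
    return state["log"]

def op(tid, n):
    # A stand-in "command" that records which thread ran which step.
    return lambda s: s["log"].append((tid, n))

threads = [[op(0, i) for i in range(3)], [op(1, i) for i in range(3)]]
trace = deterministic_run(threads)
# The trace is identical on every run, so a bug that appears once
# appears every time -- which is what makes errors reproducible.
```

A real system enforces this ordering on actual hardware threads; the point of the sketch is only that a fixed commit schedule removes timing-dependent nondeterminism.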

A European project has developed software to maximize packing efficiency through algorithms built on a "constraint programming approach." The scheme's first step applies all of a problem's constraints to cut down the set of possible solutions. The second step is the actual search for the best solution, which is accelerated and made more efficient by continuing to shrink the search space as the search proceeds. The researchers also prune "forbidden regions," approaches to the problem that would ultimately lead to a dead end or an unworkable solution. They developed an entire language for expressing a problem's constraints, which range from stability constraints for a pallet to business constraints such as maximum weights. The system, developed by the Net-WMS consortium, links the spatial algorithms, the rule program, and a virtual reality module into a single software solution for modeling, simulating, and optimizing the packing process.
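The two-step scheme can be sketched in miniature (a simplified illustration, not the Net-WMS software; the item names and limits are invented): propagation first removes "forbidden" items that can never satisfy the constraints, then a branch-and-prune search keeps shrinking the remaining space as it runs.

```python
# Illustrative constraint-programming-style packer: maximize packed volume
# subject to maximum weight and volume constraints on a pallet.

def pack(items, max_weight, max_volume):
    """items: list of (name, weight, volume) tuples."""
    # Step 1: constraint propagation -- drop forbidden items outright.
    feasible = [it for it in items
                if it[1] <= max_weight and it[2] <= max_volume]
    best = {"volume": 0, "names": []}

    # Step 2: depth-first search with active pruning.
    def search(i, weight, volume, chosen):
        if volume > best["volume"]:
            best["volume"], best["names"] = volume, list(chosen)
        if i == len(feasible):
            return
        # Prune: even taking every remaining item cannot beat the best.
        if volume + sum(it[2] for it in feasible[i:]) <= best["volume"]:
            return
        name, w, v = feasible[i]
        if weight + w <= max_weight and volume + v <= max_volume:
            search(i + 1, weight + w, volume + v, chosen + [name])
        search(i + 1, weight, volume, chosen)

    search(0, 0, 0, [])
    return best

items = [("crate", 40, 60), ("drum", 90, 50), ("box", 25, 30),
         ("anvil", 200, 10)]  # the anvil violates the weight constraint alone
result = pack(items, max_weight=100, max_volume=100)
```

Here the anvil is eliminated in step 1 as a forbidden choice, and the bound check in step 2 is the "active shrinkage" that cuts off unpromising branches early.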

Black college instructors play a significant role in encouraging black science students to persist as science majors, according to a study by Cornell University doctoral student Joshua A. Price. He examined data on more than 157,000 students who enrolled as first-time freshmen in one of the 13 four-year universities in Ohio between 1998 and 2002 and who said that they planned to major in science, technology, engineering, or mathematics (STEM). Price then analyzed the likelihood of black students who had a black instructor and female students who had a female instructor sticking with their STEM major compared to those who did not. He found that black students who had at least one black science instructor as freshmen were statistically more likely to remain STEM majors than those who did not. Meanwhile, the presence or absence of at least one female instructor had no significant statistical impact on the persistence of female STEM majors. The study also found that black STEM students were more likely than white students to wind up in STEM courses or sections led by black instructors.

Research Streamlines Data Processing to Solve Problems More Efficiently North Carolina State University (03/10/10) Shipman, Matt

Enabling systems to determine which pieces of information would help solve a specific problem will allow them to work faster, according to researchers at North Carolina State University. Professor Joel Trussell and former Ph.D. student Huiwen Zeng have developed a new analytical method that enables systems using large amounts of data to work more efficiently. "The work we've done here allows for a more efficient collection of data by targeting exactly what information is most important to the decision-making process," Trussell says. "Basically, we've created a new algorithm that can be used to determine how much data is needed to make a decision with a minimal rate of error." The researchers say the algorithm has potential applications in analyzing hyperspectral data from military cameras, interpreting imaging tests in medical facilities, and monitoring video and camera feeds for the U.S. Department of Homeland Security.
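The article does not describe the NCSU algorithm in detail, but the general idea of collecting only as much data as a decision requires has a classic form: Wald's sequential probability ratio test, sketched below purely as an illustration of the principle (the parameters are invented, and this is not Trussell and Zeng's method).

```python
# Sequential probability ratio test: decide between two hypotheses from a
# stream of 0/1 samples, stopping as soon as the evidence suffices to meet
# the target error rates alpha and beta.
import math

def sprt(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Decide Bernoulli(p0) vs. Bernoulli(p1); return (decision, samples used)."""
    upper = math.log((1 - beta) / alpha)   # accept p1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept p0 at or below this
    llr, used = 0.0, 0                     # running log-likelihood ratio
    for x in samples:
        used += 1
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "p1", used
        if llr <= lower:
            return "p0", used
    return "undecided", used
```

The test stops early when the data are one-sided, which is exactly the "collect only what the decision needs" behavior the researchers describe, applied here to a deliberately simple binary setting.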

The HaptiMap Project Aims to Make Maps Accessible Through Touch, Hearing and Vision Basque Research (03/09/10)

A European consortium plans to develop toolkits and concepts for improving the multimodal perceptualizations of mobile devices. Researchers on the HaptiMap project believe that gestures, tactile/haptic interaction, and sound would make maps and location-based services more accessible for users of mobile devices, especially when they are walking or cycling, are in bright sunlight, or for users who may have poor eyesight. The project is expected to lead to new guidelines on accessibility issues, suggestions for extending current design practices, and the creation of new tools to help developers add adaptable multimodal components to applications. The HaptiMap project hopes to fund a commercial partner to develop a location-based service that allows users to interact with haptic, audio, and/or visual information. Sweden's Lund University is the coordinator of the project, whose 13 members also include the Finnish Geodetic Institute, the University of Glasgow, Queen's University, and TECNALIA.

In an interview, Microsoft researcher Charles Thacker, who received ACM's 2009 A.M. Turing Award, says the computer industry's journey toward a new programming model to tap multicore processing is only just beginning. "You have to have parallel computers before you can figure out ways to program them, but you have to have parallel programs before you can build systems that can run them well," Thacker says. His recent areas of concentration include the Berkeley Emulation Engine, version 3, an FPGA-based system for devising multicore architectures. He also developed Bee Hive, a multicore system used to investigate parallel architectures such as transactional memory. Thacker says the acceptance of the tablet PC, which he helped pioneer, has been limited because of less than stellar battery life and handwriting recognition capability, although the former problem has been rectified. In a nominating letter, fellow A.M. Turing Award recipient and Microsoft researcher Butler Lampson called Thacker "an engineer's engineer," noting that "his skills span the full range."

Vienna University of Technology physicists Volkmar Putz and Karl Svozil have devised a scheme for processing information at speeds exceeding that of light. They see no reason why a nonlocal quantum phenomenon such as entanglement cannot be exploited to process information at superluminal speeds, and they note that the phenomenon can lead to materials with a refraction index of less than 1. For instance, light passing through a vacuum can be coaxed to spontaneously organize into an entangled electron-positron pair, which then recombines to form a photon again. This is an instantaneous process that lets the photon "jump" across space, and a material with a refraction index of less than 1 would promote this type of process, according to Putz and Svozil. They propose that a vacuum filled with either electrons or positrons would suffice. The next step would be to immerse a computer in this medium, facilitating superluminal computation. The resulting hypercomputer would surpass the power of Turing machines and support non-Turing computations.

Researchers at the University of Waterloo's Institute for Quantum Computing and Singapore's Centre for Quantum Technologies have agreed to collaborate to build the world's first quantum computer. Researchers from both institutes already have worked together on experiments and co-written research papers, and the agreement formalizes a partnership that has been in place for several years. "We both have a critical mass right now," says National University of Singapore provost Tan Eng Chye. "If we put them together, the chances we will be successful are much higher." The collaboration will complement work already being done at both institutes, says Artur Ekert, director of the Centre for Quantum Technologies. The agreement provides faculty at both institutes with access to each other's facilities and the right to use equipment that their institute doesn't have. "There is a friendly competition, but it is not adversarial," Ekert says. "There is a feeling now that we are working in a global village."

Group Seeks to Open Source Data-Center Design IDG News Service (03/05/10) Niccolai, James

The recently announced Open Source Data Center Initiative is looking to apply open source principles to the design and construction of data centers. The initiative, which will act as a repository and testbed for mechanical and engineering advances in data center design, hopes to attract innovations from small engineering firms, graduate students performing research with federal grant money, and other researchers. Group adviser Michael Manos hopes the initiative will motivate established engineering firms to rethink how data centers are built. "When you think of all the great things we've been talking about at data center conferences, about moving to greener designs and driving efficiency with new technologies--a lot of that innovation is being held back because competition for those ideas is not out there," he says. The group also will get involved in education, and will publish real-world data on the cost of implementing projects, such as fresh-air cooling systems, so customers have more transparency when making decisions.

Karlsruhe Institute of Technology professor Tanja Schultz demonstrated a prototype device for communicating without speaking at the recent CeBIT conference. The technology uses electromyography to detect the electrical signals muscles produce when someone speaks, a technique also commonly used to diagnose certain diseases. For the prototype, nine electrodes are attached to a user's face. "These capture the electrical potentials that result from you moving your articulatory muscles," Schultz says. The technology passes the electrical pulses to a device that records and amplifies them, and transmits the signal via Bluetooth to a laptop, where software translates the signals into text that can be spoken by a synthesizer. Schultz says the technology could be used by people who have lost their voice, or could support an instant translation system. Also, by integrating the lip-reading technology into mobile phones, commuters would be able to engage in silent phone conversations, Schultz says.
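The signal path described above can be caricatured in a few lines (a toy illustration only, not the Karlsruhe software; the gain, templates, and word list are invented): the faint electrode readings are amplified, then matched against stored signal patterns to recover a word that a synthesizer could speak.

```python
# Toy silent-speech pipeline: amplify microvolt-scale muscle signals, then
# classify them by nearest-template matching (least-squares distance).

def amplify(signal, gain=100.0):
    return [gain * s for s in signal]

def classify(signal, templates):
    """Return the template word whose stored pattern is closest to the signal."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda word: dist(signal, templates[word]))

# Invented per-word signal templates (a real system learns these from data).
templates = {"hello": [1.0, 0.8, 0.2], "yes": [0.1, 0.9, 0.9]}

raw = [0.009, 0.0085, 0.0015]              # simulated electrode reading
text = classify(amplify(raw), templates)   # -> "hello"
```

A production system would extract spectral features and use a trained statistical model rather than raw template matching, but the amplify-transmit-decode structure is the same as the prototype's.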

Five Years, Half a Million Members, and 300,000 Years of Service to Society University of Texas at Austin (03/03/10) Dubrow, Aaron

Since its inception more than five years ago, the World Community Grid volunteer distributed computer network has attracted 500,000 registered users and approximately 1.5 million connected devices, which have collectively performed more than 320,000 years of research in service to society. The Grid taps the idle computing cycles of volunteers' machines so that researchers can tackle a variety of Grand Challenge problems. The Grid divides the workload of massive calculations into small pieces that can be performed on a typical PC and then sent back to the Grid for reassembly. One Grid project supported by the Texas Advanced Computing Center is attempting to find drug-like molecules that can effectively halt the replication of dengue, West Nile, and hepatitis C viruses. In the past 12 months, between 30,000 and 100,000 PCs have focused on the problem.
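The divide-and-reassemble pattern the Grid uses can be sketched as follows (a minimal illustration of the pattern, not the World Community Grid's actual protocol; the task is a stand-in):

```python
# Split a large computation into small work units, have each "volunteer"
# machine process one, and merge the returned results on the server side.

def split_work(data, unit_size):
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def volunteer_compute(unit):
    # Stand-in for a real task, e.g. scoring one batch of candidate
    # drug-like molecules against a viral protein.
    return sum(x * x for x in unit)

def reassemble(results):
    return sum(results)

data = list(range(1, 101))
units = split_work(data, unit_size=10)           # 10 independent work units
results = [volunteer_compute(u) for u in units]  # done by many PCs in parallel
total = reassemble(results)                      # same answer as one big job
```

Because each unit is independent, the units can be computed in any order on any machine, which is what lets hundreds of thousands of intermittently available PCs contribute.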

A team of researchers at the University of Wisconsin-Madison has devised a new technique for printing much smaller computer chips that borrows from the original lithographic method and could lead to advances in ultra-high-density computer storage. Molecular transfer printing entails transferring ink from a master to a replica, wherein a master chip fabricated via conventional photolithography is coated with block copolymers. Two mixed copolymers self-assemble to form crystal patterns that reproduce those of the etched silicon chip, and then the chip is heated for several hours to lock the pattern in place as a copolymer film. UW-Madison research team leader Paul Nealey says the block copolymer technique permits printing at a "significantly higher resolution" than the master copy. He says the technique could be used in computer storage, where dense arrays of identically repeating punch-card-like patterns are required.

NCSA to Provide Ember as Shared-Memory Resource for Nation's Researchers National Center for Supercomputing Applications (03/02/10)

The National Center for Supercomputing Applications (NCSA) soon will deploy a highly parallel shared memory supercomputer that will double the performance of its five-year-old predecessor, the Cobalt system. Ember, which will have a peak performance of 16 teraflops, will feature 1,536 processor cores and 8 TB of memory. It also will offer 170 TB of storage with 13.5 GB/s I/O bandwidth. "There has been a clear increase in the demand for shared memory resources," says TeraGrid Forum chairman John Towns, whose persistent infrastructure team at NCSA will deploy and support Ember. "Allocation requests for shared-memory systems have consistently exceeded the available resources--by as much as sevenfold in a recent review of requests--and have been followed by a series of results highlighted in TeraGrid publications." The system will support applications that require large-scale shared memory architecture. Researchers will be able to access Ember initially through the U.S. National Science Foundation's TeraGrid and then in March 2011 through the eXtreme Digital program.