NIST Technology Program Announces 20 Research Project Awards

The National Institute of Standards and Technology (NIST) today announced funding for 20 new research projects under its Technology Innovation Program (TIP), including projects ranging from unmanned, hovering aircraft for inspecting bridges to a high-speed sorting system for recycling aerospace metals to nanomaterials for advanced batteries. The cost-sharing awards represent up to $146 million in new research over the next two to five years with up to $71 million in funding provided by TIP.

The NIST Technology Innovation Program supports innovative, high-risk research in new technologies that address critical national needs. The merit-based, competitive program provides cost-shared funding for research projects by single small- or medium-sized businesses or by joint ventures that also may include institutions of higher education, nonprofit research organizations and national laboratories.

The 20 projects announced today were selected from a TIP competition announced on March 26, 2009, seeking projects addressing two broad areas of national interest: the practical application of advanced materials (including nanomaterials, advanced alloys and composites) in manufacturing, and the monitoring or retrofit of major public infrastructure systems, including water systems, dams and levees, and bridges, roads and highways.

Among Apes, Teeth Are Made for the Toughest Times

The teeth of some apes are formed primarily to handle the most stressful times when food is scarce, according to new research* performed at the National Institute of Standards and Technology (NIST). The findings imply that if humanity is serious about protecting its close evolutionary cousins, the food apes eat during these tough periods—and where they find it—must be included in conservation efforts.

The interdisciplinary team, which brought together anthropologists from George Washington University (GWU) and fracture mechanics experts from NIST, has provided the first evidence that natural selection in three ape species has favored individuals whose teeth can most easily handle the “fallback foods” they choose when their preferred fare is less available. All of these apes—gorillas, orangutans and chimpanzees—favor a diet of fruit whenever possible. But when fruit disappears from their usual foraging grounds, each species responds in a different way—and has developed teeth formed to reflect the differences.

“It makes sense if you think about it,” says GWU’s Paul Constantino. “When resources are scarce, that’s when natural selection is highly active in weeding out the less fit, so animals without the necessary equipment to get through those tough times won’t pass on their genes to the next generation.”

In this case, the necessary equipment is the right set of molars. The team examined ape tooth enamel and found that several aspects of molar shape and structure can be explained in terms of adapting to eat fallback foods. For instance, gorillas’ second choice is leaves and tree bark, which are much tougher than fruit, while orangutans fall back to nuts and seeds, which are comparatively hard.

For these reasons, the researchers theorized that gorillas would have evolved broader back teeth than a fruit diet would require in order to chew leaves, but orangutans would have thicker enamel to handle the increased stress of crunching seeds.

NIST scientists developed models of how teeth fracture while chewing different foods. By fracturing teeth in the laboratory, they verified fundamental fracture mechanics models incorporating tooth shape and structure. These efforts revealed the effects of food stiffness and how various foods likely would damage ape teeth. “The research at NIST supports our theories and several related ones,” Constantino says. “It’s likely that fallback foods have influenced jaw and skull shape as well.”

Constantino adds that the findings suggest mankind must protect not only forest areas where commonly eaten fruits grow, but also the places where apes’ fallback resources appear. While identifying precisely what these resources are is a job for ecologists, he says, the new research shows just how important and influential these foods are in primate ecology.

“Among orangutans, for example, timber companies are harvesting the sources of their fallbacks,” Constantino says. “These apes have evolved the right tools to survive on fallback foods, but they need to be able to find these foods in the first place.”

Everlasting Quantum Wave: NIST Physicists Predict New Form of Soliton in Ultracold Gases

Solitary waves that run a long distance without losing their shape or dying out are a special class of waves called solitons. These everlasting waves are exotic enough, but theoreticians at the Joint Quantum Institute (JQI), a collaboration of the National Institute of Standards and Technology (NIST) and the University of Maryland, and their colleagues in India and at George Mason University, now believe that there may be a new kind of soliton that’s even more special. Expected to be found in certain types of ultracold gases, the new soliton would not be just a low-temperature atomic curiosity; it also may provide profound insights into other physical systems, including the early universe.

A newly predicted “immortal” soliton (left) as compared to a conventional “dark” soliton (right). The horizontal axis depicts the width of the soliton wavefronts (bounded by yellow in the left panel and purple in the right panel, with different colors representing different wave heights). The vertical axis corresponds to the speed of the soliton as a fraction of the velocity of sound. The immortal soliton on the left maintains its shape right up to the sound barrier.

Credit: I. Satija et al., JQI

Solitons can occur everywhere. In the 1830s, Scottish scientist John Scott Russell first identified them while riding along a narrow canal, where he saw a water wave maintaining its shape over long distances, instead of dying away. This “singular and beautiful” phenomenon, as Russell termed it, has since been observed, created and exploited in many systems, including light waves in optical-fiber telecommunications, the vibrational waves that sweep through atomic crystals, and even “atom waves” in Bose-Einstein condensates (BECs), an ultracold state of matter. Atoms in BECs can join together to form single large waves that travel through the gas. The atom waves in BECs can even split up, interfere with one another, and cancel each other out. In BECs with weakly interacting atoms, this has resulted in observations of “dark solitons,” long-lasting waves that represent absences of atoms propagating through the gas, and “bright” solitons (those carrying actual matter).

By taking a new theoretical approach, the JQI work* predicts a third, even more exotic “immortal” soliton—never before seen in any other physical system. This new soliton can occur in BECs made of “hard-core bosons”—atoms that repel each other strongly and thus interact intensely—organized in an egg-crate-like arrangement known as an “optical lattice.” In 1990, one of the coauthors of the present work, Radha Balakrishnan of the Institute of Mathematical Sciences in India, wrote down the mathematical description of these new solitons, but considered her work merely to approximate the behavior of a BEC made of strongly interacting gas atoms. With the subsequent observations of BECs, the JQI researchers recently realized both that Balakrishnan’s equations provide an almost exact description of a BEC with strongly interacting atoms, and that this previously unknown type of soliton actually can exist. While all previously known solitons die down as their wave velocity approaches the speed of sound, this new soliton would survive, maintaining its wave height (amplitude) even at sonic speeds.
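The contrast with ordinary dark solitons can be written compactly. The formula below is the standard textbook result for a dark soliton in a weakly interacting condensate (not taken from the JQI paper): the depth of the density dip shrinks as the soliton speed v approaches the sound speed c, vanishing entirely at v = c.

```latex
% Standard dark-soliton result for a weakly interacting BEC
% (n_0 is the background density, v the soliton speed, c the sound speed):
n_{\mathrm{dip}} = n_{0}\left(1 - \frac{v^{2}}{c^{2}}\right)
```

The prediction for the new “immortal” soliton is precisely that its amplitude does not follow this fall-off, remaining finite as v approaches c.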

If the “immortal” soliton could be created to order, it could provide a new avenue for investigating the behavior of strongly interacting quantum systems, whose members include high-temperature superconductors and magnets. As atoms cooling into a BEC represent a phase transition (like water turning to ice), the new soliton could also serve as an important tool for better understanding phase transitions, even those that took place in the early universe as it expanded and cooled.

Demonstration Network Planned for Public Safety 700 MHz Broadband

The National Institute of Standards and Technology (NIST) and the National Telecommunications and Information Administration (NTIA) have announced plans to create a demonstration broadband communications network for the nation’s emergency services agencies using a portion of the radiofrequency spectrum freed up by the recent transition of U.S. broadcast television from analog to digital technologies. The new system will provide a common demonstration site for manufacturers, carriers, and public safety agencies to test and evaluate advanced broadband communications equipment and software tailored specifically to the needs of emergency first responders.

Public safety agencies are looking to make use of the 700 megahertz (MHz) broadband spectrum cleared by the switch to digital TV. A unified broadband system would allow public safety agencies to communicate with nationwide roaming and enhanced interoperability. However, there are currently no government or independent laboratory facilities in the United States to test and demonstrate the public safety specific behaviors of this yet-to-be-deployed 700 MHz network and the applications that could run on top of it.

To address this critical gap, NIST and NTIA, through their Public Safety Communications Research (PSCR) program, will begin building a Public Safety Broadband Demonstration Network to provide manufacturers with a site for early deployment of their systems, an opportunity to evaluate them in a multi-vendor environment, and integration opportunities for commercial service providers. A national broadband network could offer public safety groups around the country access to advanced communications technologies including video, mapping and GPS applications and more. Emergency responders, vendors, carriers, academia and other pertinent stakeholders also will be able to access the demonstration network.

“This is an excellent opportunity for NIST and the PSCR to leverage our skills and assets to ensure the successful adoption and deployment of a new, nationwide communications system for public safety,” says Dereck Orr, PSCR program manager. “The demonstration of these new technologies, implementations and services is a critical step in successfully deploying the next generation of mission-critical systems.”

This demonstration network is currently in the preliminary planning stages and is expected to go live in mid-2010. Interested industry and public safety representatives can contact Orr at (303) 497-5400, dereck.orr@nist.gov, or Jeff Bratcher at (303) 497-4610, jbratcher@its.bldrdoc.gov, for information on how to get involved.

The PSCR program is a partnership of the NIST Office of Law Enforcement Standards and the NTIA’s Institute for Telecommunication Sciences (NTIA ITS). PSCR provides objective technical support—research, development, testing and evaluation—in order to foster nationwide public safety communications interoperability. More information is available on the PSCR Web site at www.pscr.gov.

Verifying the accuracy of network analyzers—instruments that are used to measure key performance characteristics of electronic networks—was once an awkward process involving multiple steps and pieces of equipment. Now, thanks to an electronic verification standard and accompanying software developed by electrical engineers at the National Institute of Standards and Technology (NIST), much of that process can be automated and the performance of the calibration system checked with NIST via the Internet. Results are both more complete and available in a matter of minutes, not hours or days as once was the case.

Verifying the performance of vector network analyzers—devices that check how components used in cell phones, radios and satellites transmit signals—has gotten a lot easier and faster with this plug-in device and accompanying software developed by NIST. A process that once took hours or days can now be done in minutes.

So-called “vector network analyzers” (VNAs) have become workhorse tools for checking how well complicated electronic components—systems used in cell phones, wireless Internet links, radar components, radios and satellites, for example—transmit signals. Until now, calibrating VNAs involved plugging a number of mechanical artifacts with known performance characteristics into the testing machines and running tests on each individual artifact. Then another artifact with different characteristics would be swapped in, measured, recorded, and so on. This took up to an hour to complete, the artifacts were expensive, and the measurements were not always reliable.

NIST electronics engineer Dylan Williams and his team have eliminated much of that complexity with a new verification procedure based on a device with a wide variety of measurement characteristics that plugs into the computer being used to do the calibration.* By using the software tool provided by NIST, an equivalent calibration can be done quickly via automation—the user doesn’t have to play an active role in the process.

Williams said this new procedure also has the benefit of being traceable directly to NIST labs, comparing measurements to an independent calibration process and authenticating the tests both electronically and with a printable certificate from NIST that includes the serial number of the device being calibrated.

“Every time a vector network analyzer, a common electrical measurement instrument, took a measurement, it would measure eight different parameters at once and you were never sure if it was measuring them all correctly,” Williams says. “It has been a nagging problem for some time with no real way to check it. Now, you can verify the performance of your analyzer and cover the whole space of what the instrument can measure.”
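Williams’s point about covering “the whole space of what the instrument can measure” can be illustrated with a toy check. The sketch below is purely illustrative—the function name, the S-parameter values, and the tolerance are all invented, not NIST’s actual procedure—but it shows the idea of comparing a VNA’s measurement of a verification device against reference values. A 2-port S-parameter set is four complex numbers, i.e., eight real quantities, matching the “eight different parameters at once” in the quote.

```python
# Hypothetical sketch (names, values and tolerance invented): comparing a
# VNA's measurement of a verification device against reference values.

def verify(measured, reference, tolerance):
    """Return True if every measured S-parameter lies within `tolerance`
    (in magnitude) of its reference value."""
    return all(abs(measured[k] - reference[k]) <= tolerance
               for k in reference)

# One 2-port S-parameter set: four complex values = eight real numbers.
reference = {"S11": 0.05 + 0.01j, "S21": 0.98 + 0.02j,
             "S12": 0.98 + 0.02j, "S22": 0.04 - 0.01j}
measured  = {"S11": 0.051 + 0.012j, "S21": 0.979 + 0.021j,
             "S12": 0.981 + 0.019j, "S22": 0.041 - 0.009j}

result = verify(measured, reference, tolerance=0.01)
print(result)  # True: every parameter is within tolerance
```

In the real procedure, NIST supplies the reference values and uncertainties for the plug-in device, and the comparison runs automatically across many frequencies.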

With thousands of vector network analyzers in use by industry every day, Williams expects constant demand for the verification devices. For further information on obtaining the software and one of the hardware devices needed for the calibration procedure, contact Ronald Alan Ginley at (303) 497-3634, ronald.ginley@nist.gov.

NIST Seeks Comments on Cryptography Standards Publication

The National Institute of Standards and Technology (NIST) has released for public comment a new revision of one of its key computer security documents, a set of information processing standards governing the use of cryptographic modules by civilian federal agencies and government contractors.

The NIST document, the Revised Draft of Federal Information Processing Standards (FIPS) 140-3, updates the federal government’s guiding document for testing and validation of cryptographic modules, which are computers’ primary line of defense for confidential data. Each module receives a security level rating that depends on the amount of protection it provides. The revised draft of FIPS 140-3 will be available for public comment until March 11, 2010.

When completed, the update will be the third version of the original cryptography standards document, which was created in 1995 as FIPS 140-1 and first updated in 2001. NIST’s William Burr says that another update is needed because computing systems, the way they perform cryptography, and the attacks against them have all evolved. “It used to be these modules were a dedicated separate device, protecting a single link between two points. But in the majority of cases nowadays you’re running a security program instead, on a general purpose computer—encrypting traffic over the Internet, connected by many links to different points,” says Burr, a computer security specialist. “We’re also now widely using cryptography on things like ID cards that are exposed to different kinds of attacks. We have to take these changes into account.”

The Revised Draft incorporates improvements made to a previous draft, which was released for public comment in July 2007. This new second-round draft differs from both the 2007 document and the 2001 updated version (FIPS 140-2). Some of the Revised Draft’s key changes include:

While the 2007 draft proposed five levels of security, the Revised Draft reverts to the four levels currently specified in FIPS 140-2.

The Revised Draft also reintroduces the notion of a cryptographic module made with “firmware” (software only a manufacturer can alter) and defines the security requirements for it.

It removes the requirement for a manufacturer to provide a formal model of the cryptographic module and the details of its operation in order for it to attain the highest security level rating.

Requirements now exist at higher security levels for mitigating non-invasive attacks, which can find the keys to access a secure system not by analyzing encrypted data, but by measuring other operating characteristics, such as precise power consumption.

For more information, including details on how to comment on the Revised Draft, visit http://csrc.nist.gov/news_events/index.html#dec11.

NIST Updates Automated Computer Security Validation Guidelines

The National Institute of Standards and Technology (NIST) has issued a draft publication for public comment that describes changes to the Security Content Automation Protocol (SCAP). SCAP is a suite of specifications that use the eXtensible Markup Language (XML) to standardize how software products exchange information about software flaws and security configurations.

SCAP incorporates software flaw and security configuration standard reference data from the National Vulnerability Database, which is managed by NIST and sponsored by the Department of Homeland Security. SCAP supports automated vulnerability checking, technical control compliance activities and security measurement. The federal government is adopting SCAP and encourages its use to automate security activities including compliance with the Federal Desktop Core Configuration (FDCC), a group of security settings mandated for federal computers that run Windows XP and Vista. Agencies can use SCAP to automate technical compliance with other information technology requirements, such as the Federal Information Security Management Act (FISMA) and the Payment Card Industry (PCI) framework.

Special Publication 800-126 Revision 1, The Technical Specification for the Security Content Automation Protocol (SCAP): SCAP Version 1.1, facilitates development of interoperable SCAP tools and content. The publication has significant changes from the version 1.0 specification defined in the original Special Publication 800-126 release.

The most notable change is the addition to SCAP of the Open Checklist Interactive Language (OCIL), which is a framework for expressing security checks that cannot be fully automated—those that require some human interaction or feedback. OCIL provides a standardized way of performing these manual checks through questionnaires, with language constructs for questions, user instructions and possible responses to questions.

SP 800-126 Revision 1, The Technical Specification for the Security Content Automation Protocol (SCAP): SCAP Version 1.1 can be found at http://csrc.nist.gov/publications/drafts/800-126-r1/draft-sp800-126r1.pdf. The public comment period runs through Jan. 23, 2010. Comments should be addressed to 800-126comments@nist.gov.

NIST Team Demystifies Utility of Power Factor Correction Devices

If you've seen an Internet ad for capacitor-type power factor correction devices, you might be led to believe that using one can save you money on your residential electricity bill. However, a team including specialists at the National Institute of Standards and Technology (NIST) has recently explained* why the devices actually provide no savings by discussing the underlying physics.

(1) Power factor correction devices have no effect on a typical household electric bill because of the relationship shown here—activating the device reduces the current drawn from a power line but simultaneously increases the power factor. Electric bills are based on the product of the two, which remains the same.

(2) Power factor correction in operation. Current from the power line is reduced by introducing a charge, QC, on a capacitor (C), creating an oscillating charge or current, IL, between the capacitor and the inductive load, such as a refrigerator.

The devices—sometimes referred to as Amp Reduction Units or KVARs**—are touted as good investments because they reduce the amount of current drawn from power lines while simultaneously providing the necessary amount of current to appliances inside the house. Though engineers elsewhere have already discredited the devices for use in typical residences, NIST physicist Martin Misakian and two of his colleagues decided to write a brief primer describing the devices' inner workings for readers who are not power engineers, but who still have some technical background.

“One of the important functions of our primer is to remove the mystery of how current from the power line can decrease while at the same time current going to an appliance remains the same,” says Misakian. The nine-page Technical Note explains this result in terms that might interest readers with knowledge of college-level physical sciences. It shows that although the devices can indeed reduce current flow from the power line, it is not just the current flowing from the power line that determines your electric bill, but the product of the power factor and the current. Though current decreases with a power factor correction device, the power factor increases correspondingly, meaning the product of the two remains the same—with or without the device. Because a residential electric bill is proportional to this product, the cost remains unchanged.
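The arithmetic behind this can be made concrete with invented but representative numbers (the voltage, current and power factor below are illustrative, not from the Technical Note): correcting the power factor from 0.75 to 1.0 cuts the line current by a quarter, but the billed quantity—proportional to the product of current and power factor—is unchanged.

```python
# Illustrative numbers (not from the NIST Technical Note): why a power
# factor correction device leaves the residential bill unchanged.
V = 120.0           # line voltage, volts (RMS)

# Without the device: an inductive load such as a refrigerator motor.
I_without = 10.0    # current drawn from the line, amps
pf_without = 0.75   # power factor of the load

# With the device: the capacitor supplies the reactive current locally,
# so less current is drawn from the line while the power factor rises.
pf_with = 1.0
I_with = I_without * pf_without / pf_with   # 7.5 A, a 25% reduction

# Real (billed) power is proportional to the product V * I * PF.
P_without = V * I_without * pf_without      # 900 W
P_with = V * I_with * pf_with               # 900 W, identical

print(I_with, P_without, P_with)
```

The current from the line drops, the power factor rises by the same factor, and the product—hence the bill—stays put, exactly as the primer explains.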

Power factor correction devices have some use, though. The authors point out that while they will not reduce the average homeowner's bill, they may benefit the environment. When electricity travels from a local transformer to a residence, some power is lost due to electrical resistance. But because a utility would need to supply less current to a residence that employs a power factor correction device, these losses would decrease—thus potentially reducing the amount of greenhouse gases a fossil fuel-burning utility would emit. But while the primer does provide a rough calculation of a utility's savings by considering the operation of a residential air conditioner, Misakian says readers must investigate the details of these options for themselves.

“If homeowners wanted to help reduce the amount of carbon dioxide produced, they could install a device,” Misakian says, “but they would also have to consider the greenhouse gases generated during the fabrication of the device itself.”

Handheld Touch Screen Device May Lead to Mobile Fingerprint ID

The Federal Bureau of Investigation (FBI) Hostage Rescue Team had a problem: they needed a small, portable tool to identify fingerprints and faces, but couldn't get anyone interested in building a solution for such a limited market. So they came to the National Institute of Standards and Technology (NIST). The FBI told NIST they wanted something more portable than the 20-pound rugged laptop plus fingerprint scanner their hostage rescue teams lug around to aid in their anti-terrorism efforts, and this led to NIST developing a new application for a handheld touch-screen device.

(Left) Prototype “Quick Capture Platform” mobile fingerprint device running on a smart phone. (Right) Watching the FBI work in the field helped the NIST team design the “Quick Capture Platform.”

The original task given to NIST by the FBI was simply to design and compile the requirements for the software the FBI needed to run on their platform of choice: a handheld device with a touch screen about the size of an index card. Paring down the visual interface to a mini-screen requires detailed understanding of what functionalities are most important. NIST researchers Mary Theofanos, Brian Stanton, Yee-Yin Choong and Ross Micheals brainstormed with the FBI team about what they required and, more importantly, watched them doing their work since most people can demonstrate what they need far better than they can articulate it.

The research paid off. Despite having worked closely with the NIST team, even the FBI Hostage Rescue Team was surprised at how well the ultimate design matched their needs: a small tool that could take pictures of fingerprints or faces and send the data wirelessly to a central hub for analysis, all with a minimum of touch strokes.

But Theofanos, Stanton and Choong wanted to take the program further. Smart phones with touch screen devices were becoming available—could they scale their design down even more to fit a 2-inch x 3-inch screen? The team created a demo program for just such an available screen—and it scaled beautifully.

The NIST team already had been collaborating with other security agencies on something called Mobile ID, a method to help officers identify people quickly and easily on the scene, instead of taking people back to headquarters to be fingerprinted. The NIST researchers think this demo program might just be the solution. The next step is to integrate an actual fingerprint sensor into the demo program.

Five Organizations Honored with 2009 Baldrige National Quality Award

President Barack Obama and Commerce Secretary Gary Locke announced on Dec. 7, 2009, that five organizations have been named the recipients of the 2009 Malcolm Baldrige National Quality Award, the nation’s highest Presidential honor for innovation and performance excellence.

The 2009 Baldrige Award recipients were selected from a field of 70 applicants. All of the applicants were evaluated rigorously by an independent board of examiners in seven areas: leadership; strategic planning; customer focus; measurement, analysis and knowledge management; workforce focus; process management; and results. The evaluation process for each of the recipients included about 1,000 hours of review and an on-site visit by a team of examiners to clarify questions and verify information in the applications.

The 2009 Baldrige Award recipients are expected to be presented with their awards in a ceremony in Washington, D.C., next year.

Named after Malcolm Baldrige, the 26th Secretary of Commerce, the Baldrige Award was established by Congress in 1987 to enhance the competitiveness and performance of U.S. businesses. The award promotes excellence in organizational performance, recognizes the achievements and results of U.S. organizations, and publicizes successful performance strategies. The award is not given for specific products or services. Since 1988, 80 organizations have received Baldrige Awards.

The Baldrige National Quality Program is managed by the Commerce Department’s National Institute of Standards and Technology (NIST) in conjunction with the private sector. For profiles of the five 2009 Baldrige Award recipients, as well as information on the Award and the Baldrige Program, go to www.nist.gov/baldrige.

JQI Researchers Create 'Synthetic Magnetic Fields' for Neutral Atoms

Achieving an important new capability in ultracold atomic gases, researchers at the Joint Quantum Institute, a collaboration of the National Institute of Standards and Technology (NIST) and the University of Maryland, have created “synthetic” magnetic fields for ultracold gas atoms, in effect “tricking” neutral atoms into acting as if they are electrically charged particles subjected to a real magnetic field. The demonstration, described in the latest issue of the journal Nature, not only paves the way for exploring the complex natural phenomena involving charged particles in magnetic fields, but may also contribute to an exotic new form of quantum computing.

A pair of laser beams (red arrows) impinges upon an ultracold gas cloud of rubidium atoms (green oval) to create synthetic magnetic fields (labeled Beff). (Inset) The beams, combined with an external magnetic field (not shown), cause the atoms to "feel" a rotational force; the swirling atoms create vortices in the gas.

As researchers have become increasingly proficient at creating and manipulating gaseous collections of atoms near absolute zero, these ultracold gases have become ideal laboratories for studying the complex behavior of material systems. Unlike ordinary crystalline materials, they are free of confounding properties, such as impurity atoms, that exist in normal solids and liquids. However, studying the effects of magnetic fields is problematic because the gases are made of neutral atoms and so do not respond to magnetic fields in the same way as charged particles do. So how would you simulate, for example, such important exotic phenomena as the quantum Hall effect, in which electrons can “divide” into quasiparticles carrying only a fraction of the electron’s electric charge?

A harbinger of the synthetic magnetic fields is the formation of vortices (spots). These spots, the number of which increases with increasing synthetic field, mark the points about which atoms swirled with a whirlpool-like motion. The measurement units in each panel indicate the size of the external magnetic field gradient applied to the gas of atoms, with larger external fields producing more vortices.

The answer Ian Spielman and his colleagues came up with is a clever physical trick to make the neutral atoms behave in a way that is mathematically identical to how charged particles move in a magnetic field. A pair of laser beams illuminates an ultracold gas of rubidium atoms already in a collective state known as a Bose-Einstein condensate. The laser light ties the atoms' internal energy to their external (kinetic) energy, modifying the relationship between their energy and momentum. Simultaneously, the researchers expose the atoms to a real magnetic field that varies along a single direction, so that the alteration also varies along that direction. In a strange inversion, the laser-illuminated neutral atoms react to the varying magnetic field in a way that is mathematically equivalent to the way a charged particle responds to a uniform magnetic field. The neutral atoms experience a force in a direction perpendicular to both their direction of motion and the direction of the magnetic field gradient in the trap. By fooling the atoms in this fashion, the researchers created vortices in which the atoms swirl in whirlpool-like motions in the gas clouds. The vortices are the “smoking gun,” Spielman says, for the presence of synthetic magnetic fields.
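The “mathematically identical” claim can be stated in one line. In the sketch below, q* is an effective charge fixed by the laser coupling—a bookkeeping label of ours, not notation from the paper—and B_eff is the synthetic field; the force on the dressed neutral atoms then takes the familiar Lorentz form for a charged particle:

```latex
% Effective Lorentz force on the laser-dressed neutral atoms:
% q^* is an effective "charge" set by the laser coupling (our label),
% v the atomic velocity, B_eff the synthetic magnetic field.
\mathbf{F} = q^{*}\,\mathbf{v} \times \mathbf{B}_{\mathrm{eff}}
```

Because the cross product is perpendicular to both v and B_eff, this reproduces the sideways force described above, and circulating vortices follow just as they would for real charges in a real field.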

Previously, other researchers had physically spun gases of ultracold atoms to simulate the effects of magnetic fields, but rotating gases are unstable and tend to lose atoms at the highest rotation rates. In their next step, the JQI researchers plan to partition a nearly spherical system of 20,000 rubidium atoms into a stack of about 100 two-dimensional “pancakes” and increase their currently observed 12 vortices to about 200 per pancake. At a one-vortex-per-atom ratio, they could observe the quantum Hall effect and control it in unprecedented ways. In turn, they hope to coax atoms to behave like a class of quasiparticles known as “non-abelian anyons,” a required component of “topological quantum computing,” in which anyons dancing in the gas would perform logical operations based on the laws of quantum mechanics.

Quicklinks

Web Site Highlights 50 Years of NIST Contributions to Laser Science

The year 2010 marks the 50th anniversary of the first demonstration of the laser, one of the most important inventions of the 20th century. Lasers now touch almost every aspect of modern life, from health care to entertainment, from manufacturing to communications. A yearlong celebration called Laserfest (http://www.laserfest.org) has been launched by the American Physical Society, Optical Society of America, SPIE and IEEE Photonics Society. They have invited other organizations to showcase their own contributions to the development and applications of this groundbreaking technology.

A new web site highlights important contributions to laser science by the National Institute of Standards and Technology (NIST). Researchers at NIST have helped turn lasers into tools that have redefined the measurement system, enabled the development of new technologies, enhanced understanding of science, and touched the lives of all Americans. Three NIST scientists have won Nobel Prizes for their work with lasers.

Deadline Extended for Entering NIST Microrobotics Challenge

An article in the Oct. 20, 2009, issue of NIST Tech Beat, “Is Your Microrobot Up for the (NIST) Challenge?”, invited university and collegiate student teams currently engaged in microrobotic, microelectronic or MicroElectroMechanical Systems (MEMS) research to participate in the 2010 NIST Mobile Microrobotics Challenge. The competition will be held as part of the IEEE International Conference on Robotics and Automation in May 2010 in Anchorage, Alaska. The deadline for entering the competition has been extended to Jan. 15, 2010. For details on how to apply and a list of the official rules, see the competition Web page at www.nist.gov/eeel/semiconductor/mmc.

Text Retrieval Expert Named Distinguished Scientist by Computing Group

Ellen Voorhees, leader of the text retrieval group in the Information Technology Laboratory at the National Institute of Standards and Technology (NIST), was recently recognized as a “Distinguished Scientist” by the Association for Computing Machinery (ACM) for establishing “key components of information retrieval evaluation methodology through individual research and directing international evaluation projects.”

Voorhees has been a leading figure in organizing NIST’s annual Text Retrieval Conference (TREC), which fosters research in information retrieval by creating the infrastructure required to do large-scale testing of retrieval technology. Information retrieval or search methodologies help find content that is not specially structured for computers. The technology is a key component of diverse applications including e-commerce, legal discovery, intelligence gathering, and scientific research, as well as Web search. More information about ACM distinguished scientists can be found at http://www.acm.org/news/featured/distinguished-09/. To learn more about the Text Retrieval Conference, see http://trec.nist.gov.