Welcome to the July 21, 2006 edition of ACM TechNews,
providing timely information for IT professionals three times a
week.

Sponsored by

Learn more about Texis, the text-oriented database providing
high-performance search engine features combined with SQL operations and a
development toolkit that powers many diverse applications, including
Webinator and the Thunderstone Search Appliance.

Eugene Spafford, chairman of ACM's Committee on Public Policy, told a
joint House hearing Wednesday that ACM has concerns about the federal
qualification process for computerized voting technology. U.S. standards
for voting equipment are voluntary, but application of the federal specs
has been inconsistent, according to a recent report from the Government
Accountability Office. Meanwhile, critics of electronic voting machines say
they can be hacked into to compromise elections. "New federal standards
and a certification process hold promise for addressing some of these
problems, but more must be done to ensure the integrity of our elections in
the face of software and hardware errors as well as the possibility of
undetectable tampering," said Spafford. Clear security standards would be
helpful because they would reduce the number of designs, according to a
list of steps to shore up accuracy and security released by Spafford. A
more transparent testing process, a mechanism for periodic security
updates, and voter-verified paper trails are the other steps. For more
information about ACM's e-voting action, visit
http://www.acm.org/usacm

Since the Sept. 11, 2001, attacks, U.S. intelligence agencies have spent
millions of dollars on software to form connections between previously
unknown people and terror suspects, track the flow of money through
international financial institutions, and monitor global communications.
The actual cost of Pentagon and CIA data-mining programs is classified. At
least five such programs were developed under the Pentagon's now-defunct
Total Information Awareness (TIA) program. Congress scrapped the program
three years ago out of privacy concerns, but the Bush administration claims
that citizens' privacy is protected under the current surveillance
programs. Privacy advocates worry the administration's claim that
counterterrorism is a legitimate use for warrantless surveillance is the
first step down a slippery slope. "There's a tendency with all of these
systems to lead with terrorism and then find other applications," said Marc
Rotenberg, executive director of the Electronic Privacy Information Center.
Among the data-mining technologies in use by intelligence agencies are
hardware that can search through databases up to 4 million GB, a program
aimed at identifying terrorist networks and the most important members
within those networks, and software containing personal information about
Americans compiled by other agencies and commercial groups. At least eight
TIA projects, including the five data-mining initiatives, have continued
since Congress pulled the plug on the program in 2003. The law dismantling
TIA allowed research to continue in some areas, including the development
of two computer simulations to test a variety of counterterrorism
situations. Supporters claim the TIA's data-mining programs could be
continued under the exemptions, while critics warn of a lack of
accountability.
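The item above mentions a program aimed at identifying the most important members within a network. One standard measure that fits that description is degree centrality — counting each entity's distinct contacts — sketched below on an invented toy graph. The names, edges, and the choice of metric are all assumptions for illustration; nothing here reflects the classified programs themselves.

```python
from collections import defaultdict

# Toy contact graph: each edge is one observed link between two entities.
# All names and links here are invented for illustration.
edges = [
    ("a", "b"), ("a", "c"), ("a", "d"),
    ("b", "c"), ("d", "e"),
]

# Degree centrality: entities with the most distinct contacts are
# flagged as potentially central to the network.
degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

ranked = sorted(degree.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # ('a', 3): the most-connected node
```

Real systems use far richer measures (betweenness, eigenvector centrality), but the ranking idea is the same.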

Researchers at Cornell University have developed a technique that promises
better and much faster simulations of blond hair. While computers can
develop accurate 3D structures that resemble hair, using computer graphics
to simulate its natural sheen and glow has always been a challenge. The
process of rendering demands sophisticated calculations that factor in the
spacing between the hairs, and current technologies can create adequate
simulations of dark and brown hair, but fall short of the mark when it
comes to blond hair. "The model that's been around since the 80s works for
black hair, and a model we introduced in 2003 in collaboration with workers
at Stanford gets brown hair right and makes blond hair better," said Steve
Marschner, assistant professor of computer science at Cornell. "Using that
model with our new work provides the first practical method to use
physically realistic rendering for blond hair and still get the right
color." Perfect rendering can only be achieved by a process known as
"path-tracing," where the computer begins with each pixel and works
backwards toward the original light source through a series of complex
calculations. While the rendering produces a perfect result, the process
takes hours, and most artists rely on approximations. The new algorithm
traces rays from the source of the light to the hair, using some
approximations to produce a scatter map of the photons throughout the hair.
In testing, the algorithm produced a simulation of blond hair that was
nearly identical to that produced by path tracing, though the new method
took only 2.5 hours, while path tracing for the simulation took 60 hours.
Marschner is now looking to develop better techniques for simulating the
motion of hair. Cornell graduate student Jonathan Moon will present the
new research at the 2006 SIGGRAPH conference in Boston on July 30.
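The contrast between the two rendering strategies above can be sketched in miniature: instead of tracing every pixel backwards to the light, photons are traced forward from the light into a one-dimensional stand-in for the hair volume, and their scattering events are accumulated into a map that rendering can later look up cheaply. Every constant below is invented for illustration and bears no relation to the Cornell model.

```python
import random

random.seed(42)

# Forward-tracing sketch: shoot photons from the light into a 1-D "hair
# volume" spanning depths [0, 1), record where each one scatters (the
# photon map), and let rendering read the map instead of path-tracing.
N_PHOTONS = 10_000
N_BINS = 10
scatter_map = [0] * N_BINS

for _ in range(N_PHOTONS):
    depth = 0.0
    while True:
        depth += random.expovariate(5.0)   # free flight between hairs
        if depth >= 1.0:                   # photon exits the volume
            break
        scatter_map[int(depth * N_BINS)] += 1
        if random.random() < 0.3:          # absorbed at this scatter
            break

# Most scattering happens near the surface, which is the effect that
# gives light-colored hair its diffuse glow.
print(scatter_map[0] > scatter_map[-1])  # True
```

The trade-off mirrors the article: one precomputation pass over the light's photons, then fast lookups, versus hours of per-pixel path tracing.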

While enrollments in science and engineering graduate programs rose
slightly in 2004, the number of first-time enrollments among foreign
students dropped for the third consecutive year, and the numbers were down
across the board in computer science, according to the NSF. The 0.3
percent increase in science and engineering enrollments in 2004 followed
four years during which enrollments rose almost 15 percent. Enrollment
among first-time foreign students fell more than 7 percent from 2003 to
2004 to 27,486, contributing to a nearly 20 percent decline from 2001 to
2004. After a three-year period marked by a 29 percent increase,
enrollments of first-time, full-time U.S. graduate students dropped 1
percent. Graduate computer science enrollments fell 6.3 percent to 50,331
in 2004. Graduate enrollments of first-time, full-time U.S. citizens and
permanent residents in computer science fell 6.2 percent; among foreign
students, enrollments dropped 3 percent. Despite the declines, there were
still 51 percent more first-time, full-time graduate students who were U.S.
citizens or permanent residents in 2004 than there were in 2000. The
number of science and engineering postdoctoral students dropped almost 2
percent in 2004, driven by a 3 percent drop in enrollments among foreign
students, the first such drop since tracking of foreign enrollments began
in 1977.

In a recent interview, Sebastian Thrun, the leader of the Stanford
University team that took first prize in last year's DARPA Grand Challenge,
discussed his thoughts on artificial intelligence and robotics. The most
significant obstacle to widespread consumer adoption of robots remains
cost, Thrun said, noting that while iRobot's Roomba vacuum cleaner is a
"fantastic step," consumer robots are still nowhere near as practical as
robots used for industrial, scientific, or military applications. In order
for home assistance robots to become a reality, Thrun argues, robots need
to do a better job identifying household objects and understanding human
intentions. Robots' ability to manipulate objects lags behind their
navigational skill, Thrun says, noting that while the Roomba can ably
navigate around a room, its only functionality is collecting dust. Thrun
envisions household robots that users could log into to monitor their
homes, checking to make sure that the oven is off and the doors are locked.
Another potential use for robots would be to improve on the state of elder
care, which Thrun describes as "disastrous." Thrun also believes that
self-driving cars will become a consumer reality. They are already
feasible from a technological and cost perspective, he says, pointing to
features such as active cruise control and parking-assistance technologies,
noting that they will only become more intelligent as new features are
added. The enthusiasm behind robots comes from a mix of necessity and
novelty, Thrun believes. While the potential benefits of robots are clear,
people have historically had a fascination with robots and held them apart
from other machines. It remains to be seen if the most popular robots will
adopt a humanoid form, as some people feel strongly that they should
resemble humans, while others find human-like machines unsettling.
Microsoft's Robot Studio is a good initial step toward opening up the field
to the development community, though Thrun believes that it will have to be
refined to reach out to a broader group of users.

Democratic senators and national security experts opposed a Senate
surveillance bill proposed by Sen. Arlen Specter (R-Pa.) that would permit
the Bush administration to submit the National Security Agency's (NSA)
warrantless eavesdropping program to a clandestine intelligence court so
that its legal ramifications can be assessed, arguing that the legislation
would extend the government's powers to monitor Americans without
oversight by the courts. The NSA program allows the agency to eavesdrop
on emails and phone calls between the United States and overseas locations
without court sanction if one of the parties is believed to have terrorism
links. Specter's bill would allow all pending lawsuits related to the NSA
program to be transferred to a secret Foreign Intelligence Surveillance Act
(FISA) appeals court that could dismiss the cases "for any reason," and
permit the White House to seek the legal okay for the NSA program from
another secret FISA court. In addition, the bill would extend the length
of time the government could monitor alleged terrorism suspects before
getting warrants, and would categorically assert the president's
"constitutional authority" to undertake spying programs by himself.
Critics complain that the legislation would eviscerate the FISA law and
allow the government excessive latitude in secret surveillance, as well as
let the FISA court approve surveillance in its entirety instead of
evaluating warrants for particular cases. Meanwhile, House GOP leaders
Reps. Peter Hoekstra (R-Mich.) and F. James Sensenbrenner Jr. (R-Wis.) are
endorsing a competing bill. All GOP proposals that address the NSA issue
are opposed by Rep. Jane Harman (D-Calif.) on the House intelligence
committee, who said the bills are "solutions in search of a problem."

Technology experts addressing the Senate Commerce Subcommittee on
Technology, Innovation, and Competitiveness on the importance of
high-performance computing received verbal assurances from the lawmakers
that supercomputing research would remain a government priority. "To stay
competitive as a nation, we must maintain U.S. leadership in
high-performance computing and computational sciences," said Sen. Maria
Cantwell (D-Wash.). While commercial ventures must shoulder some of the
burden of development, the government is still the primary user of
supercomputers, and the field depends heavily on government funding. "As
the largest user of supercomputing, the federal government understands how
necessary supercomputers are to fulfilling the requirements of government
missions--from national defense and homeland security to scientific
leadership," said Cray's Christopher Jehn. Subcommittee Chairman Sen. John
Ensign (R-Nev.) agreed, noting the role that high-performance computing
will play in managing oil resources, investigating alternative energy
sources, and researching cures for diseases such as Alzheimer's disease.
Cantwell was a co-sponsor of the High End Computing Revitalization Act of
2004, which dealt expressly with Energy Department programs. Cantwell is
now calling for similar legislation to support high-performance computing
in the other agencies of the federal government.

Working under a three-year, $2.4 million grant from the Department of
Homeland Security, researchers at the University of Illinois will develop
applications capable of processing vast quantities of data in a variety of
formats. Illinois is sharing a $10.4 million grant with researchers from
the University of Southern California, University of Pittsburgh, and
Rutgers University. The grant will help Illinois establish the Multimodal
Information Access and Synthesis (MIAS) Center. "The MIAS will advance the
understanding and technologies required to deal with large amounts of
information available today in multiple text forms," said Dan Roth,
professor of computer science at Illinois and the director of the center.
"The center builds on department of computer science strengths in such
areas as machine learning, natural language processing, information
retrieval, image processing, databases, and data mining." In the coming
years, scientific research will produce huge amounts of multimodal
information that will require systems capable of interpreting and analyzing
data in multiple formats, developing and verifying hypotheses, and
incorporating observed data into domain models. Though their work is
commissioned by DHS, the researchers expect the MIAS center to produce
technologies that will also have significant impact on the business
community. The center will also include a summer school for undergraduate
and graduate students. "Altogether, the overarching goal is to use science
and technology to reduce threats to our nation's security by providing new
knowledge and cutting edge technology and by helping produce a growing
number of professionals through our educational programs," Roth said. "It
will also have significant impact on the growing industry of information
access, search engines, and mining knowledge from data."

Researchers at Microsoft are working on a technology that can use any
smooth flat surface as a computer display and user interface, complete with
software to monitor hand movements in lieu of a mouse and keyboard. The
technology, called PlayAnywhere, can interface with a piece of paper, a
cell phone on the desk, and other items. PlayAnywhere is a more intuitive
method of interacting with computers, said Guri Sohi, chair of the computer
science department at the University of Wisconsin. "I think that's much
more powerful," said Sohi, one of the many computer science professors
attending the Microsoft Research Faculty Summit. With no specific
commercial applications in the works, the project is meant to be a more
general test of sensing and display technologies. Another innovation
showcased at the summit was a technology that could create a richly
detailed panoramic image that could fill out a billboard. The image on
display depicted the Seattle skyline, and consisted of 800 distinct images
taken from the top of a building one morning in February. The image is 700
times larger than a normal photo taken by a standard consumer digital
camera, according to Microsoft's Matt Uyttendaele. Microsoft has also
developed a digital version of the family calendar that stores the last 100
changes made, so parents can undelete appointments or notes if needed.

Calls to increase the cap on the H-1B temporary visa program have resulted
in the crafting of bills such as the Securing Knowledge, Innovation, and
Leadership (SKIL) Act of 2006 (S. 2691) in the Senate that would raise the
ceiling from 65,000 to 115,000 annually. The Comprehensive Immigration
Reform Act of 2006 (S. 2611), recently passed by the chamber, would go a
step further in automatically boosting the cap 20 percent in any year when
the 115,000 ceiling was met, and eliminating limits for the number of
foreign students who graduate from U.S. colleges and universities with
advanced degrees. However, Gartner analyst John Bace says the firm's
clients are not demanding more visas, and he believes there will ultimately
be little change in the program. Former president of the Society for
Information Management Nancy Markle maintains that the program needs to be
expanded because of the shortage of skilled IT workers in the country.
"We're just not training [enough students] in technology, science,
engineering and mathematics," which is forcing employers to look elsewhere
for IT talent, says Markle. Some observers view an expansion of the
program as a negative for U.S. IT workers. Nonetheless, job opportunities
in IT will be available regardless of whether lawmakers decide to increase
the cap, leave it alone, or scale it back.
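The escalator in S. 2611 described above compounds quickly: a 20 percent boost in any year the ceiling is met means three consecutive years of full demand would nearly double the cap. A quick sketch of that arithmetic (the three-year scenario is hypothetical):

```python
def next_year_cap(current_cap: int, cap_was_met: bool) -> int:
    """S. 2611's escalator as described: a 20 percent boost in any year
    the ceiling is reached; otherwise the cap stays put."""
    return int(current_cap * 1.2) if cap_was_met else current_cap

cap = 115_000
for year in range(3):            # three straight years of full demand
    cap = next_year_cap(cap, True)
print(cap)  # 198720
```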

Despite widespread evidence that the technology job market has rebounded
and indeed is poised for substantial growth over the coming years, students
are ignoring the message. All of the 20 jobs projected to grow the most in
the next decade are related to IT or health care, and the highest salaries
will be in IT, according to the U.S. Bureau of Labor Statistics. Australia
is looking ahead to similar growth in its tech sector, if not quite as
dramatic. IT jobs online in Australia are up 44 percent from a year ago,
according to the Olivier Group. "The medium-term outlook is pretty good,"
said Olivier Group Director Robert Olivier. Despite these positive
indicators, student enrollment at Australia's top IT facilities has fallen
an average of 40 percent since 2000, though some schools report a slight
resurgence in the past year. If enrollments remain essentially flat,
Australia could face a critical shortage of skilled workers in the next few
years. Though the market that shortage will create will be favorable to
job seekers, it is not a viable situation for the IT industry as a whole.
Many IT deans believe the lack of student interest in IT is the result of
an image problem and the fear still lingering from the dot-com crash that
IT is a volatile field. Outsourcing has been a big issue, as well, though
the more intellectual development jobs are not migrating overseas,
according to Dubravka Cecez-Kecmanovic, head of the school of information
systems, technology, and management at the University of New South Wales.
"This is a huge misconception among not only students, but parents. In
fact, it's a widespread public belief that's absolutely wrong." The
sciences will be a major area where IT will come to play a dominant role,
though students are largely unaware of that trend. The perception that
there is no social aspect of computing also detracts from the appeal of the
field, particularly among young women. In an effort to curb this trend,
universities have been partnering with schools to target students at a
young age, talking to parents, and modifying degree programs to appeal to
both students and employers.

A growing number of companies are adopting a model of product development
known as open innovation, where good ideas can come from developers within
the company or from licensing technology from other companies, according to
participants at a recent event hosted by Microsoft. Many look for the open
innovation model to replace the longstanding concept of the sieve, where
many ideas are produced but most are filtered out along the way. This is
especially important for large companies as their smaller competitors are
ramping up research spending. Indeed, companies with a workforce larger
than 25,000 accounted for $7 out of every $10 spent on research and
development in 1981. Twenty years later, they spent less than $4 out of
every $10, while companies with fewer than 1,000 employees accounted for a
quarter of all research spending. The shift toward open innovation affects
every industry, though its impact is especially strong in software
development. The emergence of companies centered around acquiring unused
technology will likely continue, while some businesses could even sell
their own technologies and lease back the right to use them. This concept
of a secondary patent market has been the subject of considerable
speculation, but has yet to materialize due to the problematic nature of
evaluating the worth of a patent. Some companies are tempted to hold on to
unused patents because of the leverage they can provide in legal disputes.
Still, companies that scoop up patents for the express purpose of licensing
the technology to others can come under intense criticism.

Radford University in Virginia gave high school girls from around the
state an opportunity to learn more about the information technology
industry during a three-day camp in late June. The Summer Bridge Program,
hosted by Radford's College of Information Science and Technology, drew 29
young girls, including Elizabeth Meade, a rising senior at Lebanon High
School who says the thought of taking an IT class full of guys was a
turnoff to her girlfriends. Radford officials involved in the math,
science, and IT program hope it can help get more young girls interested in
pursuing a career in IT. They say females accounted for 40 percent of IT
students in 1986, but 20 years later females make up less than 10 percent
of IT students. Hwajung Lee, an assistant professor in the IT department
and director of the program, says young girls are not attracted to the
field because it is largely viewed as a domain for boys, there are not
enough female students to act as a support system, and because they believe
they will have to sit alone in front of a computer for years to come. The
free camp introduced the girls to a network track, a database track, and a
Web site track, and included lectures, projects, presentations, and an ice
cream social and game night among its activities. Radford offered the camp
for the first time this year, and hopes to bring it back next year.

Researchers at the Oregon Health & Science University have shown that the
popular computer card game FreeCell can be modified with cognitive
performance assessment algorithms to help detect cognitive changes in the
elderly. Mild cognitive impairment is a leading indicator of a person's
likelihood to develop dementia, which is most frequently caused by
Alzheimer's disease. "We discovered that we can take an existing computer
game that people already have found enjoyable and extract cognitive
assessment measures from it," said Holly Jimison, OHSU assistant professor
of medical informatics and clinical epidemiology. Playing FreeCell
successfully requires planning, says Jimison, which is a key ability that
neuropsychologists try to evaluate in clinical settings. Early trials have
highlighted differences between seniors with even mild cognitive impairment
and those with normal capacity. A "solver" within the program that
calculates the minimal number of moves required to complete the game
evaluates each player's efficiency. Jimison describes the solver as a
"dynamic algorithm that is solving the game at every moment in time, and it
knows the minimal number of steps you would need to complete it." The
FreeCell research paved the way for follow-up studies, such as the research
funded by the National Institute of Standards and Technology's Advanced
Technology Program that led to a system that can adjust the difficulty of
the game based on a player's previous results. As the elderly population
increases, Misha Pavel, OHSU professor of biomedical engineering and
computer science and electrical engineering, believes that unobtrusive
home-monitoring technologies will become a mainstay in health care.
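The efficiency measure the OHSU solver enables can be expressed as a simple ratio of the solver's minimal move count to the player's actual count. The function and the session numbers below are an illustrative sketch, not the researchers' actual scoring code:

```python
def efficiency(optimal_moves: int, player_moves: int) -> float:
    """Ratio of the solver's minimal move count to the player's actual
    count: 1.0 is a perfect game, lower values mean wasted moves."""
    if player_moves < optimal_moves:
        raise ValueError("player cannot beat the solver's minimum")
    return optimal_moves / player_moves

# Hypothetical session: the solver says 52 moves suffice; the player used 65.
print(round(efficiency(52, 65), 2))  # 0.8
```

Tracked over many games, a gradual decline in this ratio is the kind of signal that could flag early cognitive change.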

Google, Yahoo, and other major Internet companies are working to make
their sites more compatible with screen-reading software to improve access
for blind users. The complex programming behind feature-heavy sites can be
difficult for many screen readers to translate. Screen readers generally
read a description of the site aloud and sometimes display descriptions in
Braille. To better meet the needs of its blind users, Google is rolling
out Accessible Search, a search application that bases its rankings on the
simplicity of the pages' layout, giving higher rankings to sites that have
numerous subject headings and other features that make them easier for
screen readers to understand. AOL is updating its Web mail to make it more
accessible to screen readers, and, in the same vein, Yahoo included a
greater number of subject headings when it redesigned its home page.
Meanwhile, Microsoft is developing tools that will make it easier for
screen readers to navigate feature-rich Web sites. Of the approximately 10
million blind or visually impaired Americans, just around 200,000 who are
completely without sight have access to the Internet, according to the
American Foundation for the Blind. "The biggest frustrations are these
sites with some 500 different links and lots of graphics," said Dena
Shumila, who is blind and runs a consultancy in Minneapolis. When Web
designers do not adequately label their links and buttons, the screen
reader translates them into generic commands like "nav bar link one" and
"nav bar link two." "Then you don't have a clue what is going on," Shumila
says. Online shopping is a constant challenge, as graphics and videos are
indecipherable to a screen reader unless they carry alternative text. The
redesign of many Internet companies' sites coincides with a drive to revise
federal standards for Web accessibility; currently, there is no law
requiring Web sites to be accessible to the disabled.
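Google has not published the signals behind Accessible Search, but the heading-and-labeling criteria the article describes can be sketched as a toy scoring pass over a page's markup. The scoring weights here are invented assumptions, not Google's:

```python
from html.parser import HTMLParser

class AccessibilityAudit(HTMLParser):
    """Toy heuristic in the spirit described: reward subject headings,
    penalize images that lack alternative text."""
    def __init__(self):
        super().__init__()
        self.headings = 0
        self.unlabeled_images = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.unlabeled_images += 1

    def score(self):
        return self.headings - self.unlabeled_images

audit = AccessibilityAudit()
audit.feed('<h1>News</h1><h2>Local</h2>'
           '<img src="a.png"><img src="b.png" alt="chart">')
print(audit.score())  # 1: two headings minus one unlabeled image
```

A page dense with labeled headings and alt text scores high — exactly the pages a screen reader can narrate usefully.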

The University of Manchester is using a virtual reality world to provide a
more objective environment for testing telepathic abilities. Dr. Toby
Howard, a researcher in the University's School of Computer Science who
helped implement the virtual computer world, says questions surround
previous studies because the results could be manipulated by participants
to give the impression that telepathy was real. "By creating a virtual
environment we are creating a completely objective environment which makes
it impossible for participants to leave signals or even unconscious clues
as to which object they have chosen," says Howard. Nearly 100 people are
expected to participate in the experiment that will feel like a life-size
computer game, in which participants wear a head-mounted 3D display and
electronic glove to move through the virtual world. During the tests, two
people who know one another will be ushered to different parts of the same
building to prevent any communication while they are in the virtual world
and presented simultaneously with a lineup of computer-generated objects.
Each participant will try to choose which object the other person is
thinking about transmitting to him or her. "Our aim is not to prove or
disprove its existence but to create an experimental method which stands up
to scientific scrutiny," says project researcher David Wilde from the
School of Psychological Sciences.
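Any result from the experiment above has to be judged against the chance baseline: if no information passes between the pair, how often do a "sender" and "receiver" match anyway? A quick Monte Carlo sketch — the lineup size of four objects is an assumption for illustration, as the article does not specify it:

```python
import random

random.seed(0)

# With N_OBJECTS choices and no communication, two independent picks
# should agree about 1/N_OBJECTS of the time.
N_OBJECTS = 4
TRIALS = 100_000

hits = sum(
    random.randrange(N_OBJECTS) == random.randrange(N_OBJECTS)
    for _ in range(TRIALS)
)
rate = hits / TRIALS
print(round(rate, 2))  # ~0.25: a real effect would have to beat this
```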

A bipartisan Senate panel headed by Sens. Byron Dorgan (D-N.D.) and John
Cornyn (R-Texas) last week convened for the first time to address the
future of radio frequency identification (RFID) technology and related
privacy concerns. Commonly used now to track goods, the technology could
one day be a mainstay of commerce, utilized much like barcodes are today.
But RFID technology can store much more than just product information and
can be scanned from a distance. Privacy advocates have expressed concern
that the technology could leave consumers exposed to tracking and are
calling for RFID sensors to be disabled at the point of sale. States have
begun to implement their own measures to address such worries. Wisconsin
in May passed a measure that makes it a crime to require a person to be
implanted with an RFID chip for security clearance purposes. Thirteen
states or more are considering similar controls on the use of the
technology. Dorgan and Cornyn would like to see guidelines for use set at
the federal level without threatening the U.S. lead in RFID technology
R&D.

Concerns that multicore computer chips may hit a performance wall are
prompting computer scientists to pursue alternatives in the form of
specialized chips that can ramp up supercomputing speed without
significantly raising heat output, power requirements, parallelism, or
costs. Specialized chips include field-programmable gate arrays, graphics
processors, video game-designed chips, and application-specific integrated
circuits. To facilitate an evolution in computing performance via hardware
acceleration, technologists must meet formidable challenges, such as
programming. IBM's Dave Turek notes, for example, that programming
accelerators carry a "prohibitively high" cost for IBM customers; they
require a lot of development time, a new code base, and in-house skills
that are currently in short supply. The specialized chip approach
dovetails better for applications with predictable patterns than for Web
searches. The specialized strategy allows applications to be accelerated
by multiples of their original speed, offering enhanced performance while
consuming less power and eliminating the need for additional node-to-node
networking. Among the companies, institutions, and research centers
focused on supercomputing via hardware acceleration are the Tokyo Institute
of Technology, AMD, Cray, Sun Microsystems, and the University of
California at Berkeley. The increasing convergence of high-performance
computing and business data processing could hasten the hardware
acceleration approach's transition from a research project to a practical
technology.
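The claim that accelerators speed applications "by multiples of their original speed" is bounded by Amdahl's law, a standard result the article does not cite: the overall gain depends on how much of the workload the accelerator actually covers. A quick sketch with invented numbers:

```python
def amdahl_speedup(accelerated_fraction: float, accel_factor: float) -> float:
    """Overall speedup when only part of a workload runs on the
    accelerator (Amdahl's law); the rest runs at original speed."""
    serial = 1.0 - accelerated_fraction
    return 1.0 / (serial + accelerated_fraction / accel_factor)

# Even a 100x accelerator applied to 80% of the work yields only ~4.8x
# overall -- one reason applications with predictable, dominant kernels
# fit the specialized-chip approach best.
print(round(amdahl_speedup(0.8, 100.0), 1))  # 4.8
```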

In addition to improving the user's Web experience by updating only small
portions of a Web application instead of refreshing the whole page, AJAX
can also moderate the strain on bandwidth and servers. Using AJAX requires
an advanced proficiency in JavaScript, which, though often maligned, is a
powerful object-oriented scripting language that more closely resembles
academic languages such as Self and Scheme than it does Java. There is a
significant shortage of tool support, which makes debugging a laborious
process. Though Microsoft and Sun are working to include AJAX in their
mainstream applications, many AJAX developers continue to create their own
libraries and use open-source frameworks such as the Dojo Toolkit. As
applications move farther toward the client side, they tend to respond
faster, though often at the cost of reduced reliability, greater
complexity, and a more opaque user interface. Many AJAX
applications are built on the assumption that they will be executed in
ideal environments, where bandwidth is plentiful, security concerns are
nonexistent, and errors occur only infrequently. In the real world,
however, this is never the case, and AJAX applications can be slowed or
halted by network outages. AJAX can also disrupt the "one URL equals one
resource" notion that has long characterized the Web, altering the
semantics of the "back" button on a browser and making bookmarking
difficult. Since conventional Web applications deliver vast amounts of
redundant information, particularly if the coding is in HTML and loaded
with tags, AJAX can conserve bandwidth by only updating data as needed.
Still, some developers feel that XML is a cumbersome transport format, and
opt instead for a format called JavaScript Object Notation. Others are
concerned that the continual polling of some AJAX applications could place
an undue strain on a server farm, particularly if the demands of many users
are synchronized.
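The bandwidth argument above — and the preference some developers show for JavaScript Object Notation over XML as a transport format — comes down to payload size. A small sketch encoding the same hypothetical update both ways:

```python
import json
import xml.etree.ElementTree as ET

# The same small update encoded both ways: JSON is typically terser
# than tag-heavy XML, one reason many AJAX developers prefer it.
record = {"id": 42, "status": "shipped", "eta": "2006-07-25"}

json_payload = json.dumps(record)

root = ET.Element("record")
for key, value in record.items():
    ET.SubElement(root, key).text = str(value)
xml_payload = ET.tostring(root, encoding="unicode")

print(len(json_payload) < len(xml_payload))  # True
```

The gap widens with deeper nesting, since every XML element pays for both an opening and a closing tag.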