The authors of last year's report, "Asking the Right Questions About
Electronic Voting," have found that some jurisdictions may not have
adequate safeguards in place to ensure the security and reliability of
electronic voting systems in time for the November elections. Election
Data Services reports that more than a third of the 8,000 voting
jurisdictions in the country will use e-voting systems for the first time
this year. "This is a moment of truth for electronic voting," said Richard
Thornburgh, co-chairman of the National Research Council panel that
authored the study and a former Republican governor of Pennsylvania.
"You've got a lot of people who are working for the first time with the new
technology. It should impart a greater note of caution than what you might
normally attend to a regular election." Though the Help America Vote Act
(HAVA) of 2002 accelerated deployments of e-voting technology, just 10
percent to 15 percent of jurisdictions had replaced traditional voting
machines with electronic systems by 2004. The initial concerns about
electronic voting centered around fears that the results could be
manipulated, but most problems that have arisen in this year's primaries
have involved machines breaking down or being improperly used by election
officials. The National Research Council notes that because some states
may not be able to meet the HAVA deadlines for upgrading their systems, it
is unclear if they will use old or new machines in this year's elections.
Other questions involve the potential confusion some voters could have
about the machines and whether jurisdictions will have enough time to train
poll workers. One of the council's recommendations is that election
officials conduct random tests of the machines on Election Day. "You're
always looking for the latest threat," said Dana DeBeauvoir, the county
clerk of Travis County, Texas, who is credited with developing one of the
most thorough Election Day plans. For information regarding ACM's
e-voting actions, visit http://www.acm.org/usacm.

Speaking at the opening session of Microsoft's annual faculty summit,
University of Maryland President Dan Mote addressed the problem of
declining student interest in computer science and IT. "Students do not
see opportunity in our field," said Mote, who co-authored a federal report
last fall highlighting the problem. Declining student
enrollments could translate into critical worker shortages for technology
companies in the future. In an attempt to develop strategies to remain
competitive with India and China and to figure out why U.S. students
produce such low test scores in math and science, Sens. Lamar Alexander
(R-Tenn.) and Jeff Bingaman (D-N.M.) turned to the National Academies.
The Academies assembled a group of
academic and industry leaders, including Mote, to address the issue. The
committee recommended using scholarships to recruit 10,000 math and science
teachers each year, and increasing federal funding for basic research.
Their report also recommended giving visa extensions to international
students working toward PhDs in science, technology, engineering, and math.
Meanwhile, fewer than one-third of U.S. fourth- and eighth-graders
demonstrated proficiency on math tests, the committee found. Also, the
committee reported that most grade-school math and science teachers are not
credentialed in the field. Most grade-school computing instruction focuses
more on literacy than on fluency, said Lucy Sanders, CEO of the National
Center for Women & IT. Too often schools equate computing with programming
in their coursework, Sanders says, noting that computer science is most
productively viewed as a tool to solve problems, rather than a laundry list
of esoteric programming languages.

Prominent academic database expert Raghu Ramakrishnan will lead Yahoo's
social search research effort as the company attempts to develop social
search technology. Ramakrishnan, a specialist in databases, data mining, and
privacy-preserving technologies, was a computer science professor at the
University of Wisconsin, Madison, for nearly 20 years, and he co-founded
the university's Data Mining Institute. Social search involves the mining
of the collective knowledge of users to improve Web-search tools. "At
Yahoo you have this unique opportunity to integrate conventional search
with Flickr, Del.icio.us, Yahoo Answers, Yahoo Groups, and Yahoo Mail,"
Ramakrishnan says of the company's services that focus on human
contributions. "How do you make all of this [search activity] as natural
as possible to users?" Ramakrishnan, 45, will serve as a vice president
and research fellow at Yahoo. He is the chair of ACM's Special Interest
Group on Management of Data (SIGMOD) and an ACM Fellow.

The EDA industry must do a better job of addressing the issue of
multi-core system programmability, said Gary Smith, managing vice president
of design and engineering research at Gartner Dataquest. "Programmability
has now replaced power as the number one impediment to the continuation of
Moore's law," Smith said during the annual Sunday pre-DAC EDAC-Gartner
forecast panel. Smith noted that the cost of designing an IC has fallen to
between $10 million and $20 million since 1997, and verification costs have
been steady over the past seven years. However, the cost of embedded
software is raising design costs. "We can do as good a job as we can and
we have been doing a great job keeping the design costs down for the
silicon design, and we're not doing that bad with the PCB design either, but
the cost of software is killing us right now, and we've got to do something
about that," said Smith. With 38 percent of designers using EDA tools
developed in-house, compared with 27 percent last year, opportunities for
growth remain, and there needs to be some focus on analog, RF, and systems
design tools, according to Smith. The industry must find a way to design
software concurrently to use in a multi-core environment, said Smith,
adding that the architectural workbench will be the most popular killer
app.

Among the most exciting ideas presented at this year's New Paradigms for
Using Computers conference held at IBM's Almaden Research Center was a
wallet-sized device that could serve as a cell phone, car key, iPod, credit
card, and TV remote control. Another concept presented at the conference
was a technology that would enable a user to navigate the Web simply by
staring at the screen and pushing a button on the keyboard. The all-in-one
device, called Lil'me, would have a color screen and be able to obtain new
music without linking to a computer. It would be a voice and video phone,
and could be programmed to open compatible electronic locks. With GPS
capabilities, Lil'me could help users navigate or transmit its location if
it was lost, and it could store and transmit credit card information to
cash registers and gas pumps. To counter identity theft, Lil'me would
verify its user's identity through fingerprints, retina, or voice scans.
IBM's John Varghese, who presented the device, believes that Lil'me could
be ready for the commercial market within two years, noting that most of
the technology already exists. The Gaze-enhanced User Interface Design, or
GUIDe, is the product of Stanford University graduate student Manu Kumar.
To calibrate the system, the user sits in front of the screen and follows
the movements of a yellow dot. At that point, the user can lock his eyes
on a portion of the screen, press a button on the keyboard, and the small
section will appear on the screen magnified. After looking into the
blown-up area and depressing the button, the Web browser follows the link
the user was focusing on. "My hypothesis is that it will be easier and
faster than using a mouse," Kumar said. Complex tasks such as drawing
lines would still require a mouse, but routine applications such as surfing
the Web and switching applications could use the eye-tracking technology.

In partnership with the NSF and the University of Michigan, the
Semiconductor Research Corporation (SRC) is launching a three-year program
to develop defect-tolerant chips that can detect and fix flaws using
software running on the chip itself. The ability for chips to heal themselves without
human assistance is expected to increase the lifespan of products powered
by semiconductors. "On the chip, there can be system-level checking going
on and monitors so that when parts of it fail, the computation can be
switched to other parts of the chip and maintain the functionality while
not having to throw that chip out or having the system fail," said SRC's
William Joyner. "In general this will be transparent to the user. The
chips would be able to recover without great overhead in space and
performance through the use of online software and components within the
chip." Chips currently discover and diagnose problems using redundancy,
which takes up considerable space on the chip and undermines performance
and power efficiency. The researchers want to create chips that are more efficient at
scanning for problems and can move the operations of a problem area to a
different part of the chip. The new chips would then be able to
autonomously fix the problem area before putting it back to work. The
researchers are not attempting to create flawless chips, says Todd Austin,
associate professor of electrical engineering at the University of
Michigan. Rather, their work is driven by the reality that the continued
scaling of chip features inevitably leads to errors that new architectures
must be able to work through. The researchers will explore a variety of
new failure models, evaluating the reliability of everything from the
software applications that the system runs to individual circuits and
wires.

BBN Technologies has reported the development of a mesh network that
requires far less power than cellular, Wi-Fi, and other conventional
wireless networks without compromising speed. DARPA funded the research to
create ad hoc networks for combat situations, though some features of the
network could also be applied to emergency networks or cell phone systems
in remote locations, and could even be used to extend the battery life of
wireless devices. Mesh networks are most commonly deployed in research
settings, where scientists pepper a volcano or swath of rainforest with
sensors to collect data about the natural world. Mesh networks can be
created quickly, and, since they do not require costly infrastructure, they
are useful for establishing communications in areas without electricity.
Slow data-transfer rates and high power-consumption rates have thus far
hampered the utility of mesh networks. With double-digit Kbps
data-transfer rates, existing mesh networks are not nearly fast enough for
applications such as military surveillance. The mesh network developed by
the BBN team can send megabits of data per second, achieving speeds
comparable to Wi-Fi networks and sufficient for streaming video. The BBN
researchers modified the hardware in each network node to develop more
energy-efficient radios. They also reexamined the algorithms that
synchronize communication among the nodes and developed an energy-saving
protocol that eliminates the need for nodes to be constantly listening for
each other. The third change they made to existing mesh networks was to
apply a different type of algorithm that monitors for network traffic and
directs the nodes to adjust their activity accordingly, an adaptability
that is critical for power conservation. DARPA will test the network in
the field beginning next year.
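The energy-saving protocol described above works by letting nodes sleep instead of listening continuously. A back-of-envelope duty-cycle calculation makes the payoff concrete; the power figures below are illustrative assumptions, not BBN's measured values:

```python
# Sketch of why duty-cycled listening saves energy in a mesh node.
# All power figures are illustrative assumptions, not BBN's numbers.

RX_IDLE_MW = 60.0      # radio listening continuously (assumed)
SLEEP_MW   = 0.1       # radio asleep between scheduled wake-ups (assumed)

def avg_power_mw(duty_cycle: float) -> float:
    """Average draw when the radio listens only duty_cycle of the time."""
    return duty_cycle * RX_IDLE_MW + (1.0 - duty_cycle) * SLEEP_MW

always_on = avg_power_mw(1.0)     # conventional node: always listening
scheduled = avg_power_mw(0.01)    # synchronized node: awake 1% of the time

print(f"always-on: {always_on:.1f} mW, duty-cycled: {scheduled:.3f} mW")
print(f"battery-life multiplier: ~{always_on / scheduled:.0f}x")
```

Even with generous assumptions, idle listening dominates a node's energy budget, which is why synchronizing wake-ups matters more than making transmissions cheaper.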

Romanian-born computer artist Alex Dragulescu has been developing a new
form of creative expression based on computational modeling and information
visualization. In one project, he developed algorithms to analyze the text
and data points in spam to develop images of plant-like structures, or
"spam plants," that grow continuously based on the arrival of new email.
In his latest project, Dragulescu is developing software, based on
computational linguistics algorithms, that culls information from
thousands of blogs across the Internet to produce a meaningful graphic
novel. "By analyzing text using computational linguistics methods, you can
detect anger and sadness. Turning those into gestures in three dimensions,
that would be interesting," said Dragulescu. His work is especially
relevant at a time when scientists and researchers are struggling to
extract meaning from the massive quantities of data being amassed in fields
such as earth science, drug discovery, and genetic research. That effort
collides with art, Dragulescu says, when novel visualization techniques can
help researchers identify patterns that otherwise would have gone
unnoticed. To create the spam plants, Dragulescu parsed the data in the
email such as the subject lines, headers, and footers to uncover hidden
relationships, which he then represented visually. The email's size might
determine how bushy the plant is, for instance, and specific keywords, such
as "Nigerian," could create more branches. Dragulescu recently completed a
project that can create images from Mozart's music using software to
analyze the characteristics of the notes.
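The feature-to-form mapping behind the spam plants can be sketched in a few lines: parsed email attributes drive growth parameters, with message size controlling bushiness and trigger words spawning branches. The field names and constants below are invented for illustration; Dragulescu's actual algorithms are more elaborate:

```python
# Toy sketch of mapping email features to plant-growth parameters.
# Field names and constants are invented; only the idea follows the article.

def plant_params(email: dict) -> dict:
    size_kb = email.get("size_bytes", 0) / 1024
    keywords = sum(1 for w in ("nigerian", "viagra", "winner")
                   if w in email.get("subject", "").lower())
    return {
        "bushiness": min(1.0, size_kb / 50),   # bigger messages, bushier plants
        "branches": 3 + 2 * keywords,          # trigger words sprout branches
    }

msg = {"subject": "Nigerian prince WINNER", "size_bytes": 20480}
print(plant_params(msg))   # {'bushiness': 0.4, 'branches': 7}
```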

The "Women in Design Automation" workshop at the Design Automation
Conference addressed a broad spectrum of issues ranging from getting the
kids ready for their day in the morning to a blunt consideration of
traditional management practices. The workshop's topic was "Working the
80/20 Rule for Success"--80 percent of the results come from just 20
percent of the effort applied, but NVidia's Reynette Au said the goal
should always be 100 percent. Au argued that women need to be
single-minded and identify what motivates them and what they do best.
Convincing others, including supervisors and colleagues, of the importance
of their work is also important, Au said. "Rarely people around you pay
attention to what you do. What you do is often unappreciated and
undervalued." Rather than concentrating on strategies for climbing the
corporate ladder, engineers should focus on doing the best job they can,
"then results will follow," said Kathy Papermaster, director of the
Sony/Toshiba/IBM Design Center. Participants also stressed the importance
of networking, both online and in person.

The U.S. Commerce Department plans to hold a hearing on Wednesday to
discuss whether the nation should cede control of the Internet domain name
system (DNS). The Commerce Department faces a Sept. 30 deadline to either
cede the DNS or assert continued control over it. Internet stakeholders
are divided on the matter. NetChoice
Coalition director Steve DelBianco believes the United States should
control ICANN and the DNS for another two years until ICANN as an
organization is strong enough to provide impartial global leadership. Some
European Internet stakeholders are still seething about U.S. control of the
Internet in the wake of ICANN's refusal to launch the .xxx domain, a
decision some blame on behind-the-scenes U.S. influence.

The sixth Hackers on Planet Earth (HOPE) conference in New York drew about
2,000 attendees of all ages and affiliations. The conference, run by
members of the hacker group 2600, embraced the spirit of the hacker
community, said Greg Newby, co-organizer of the conference and a professor
of computer science at the University of Alaska. "This involves political
awakening, as well as open sharing of information," he said. Presentations
offered tips for stymieing wiretappers, history lessons on computer crime,
and technical discussions of security patches. There were hands-on
tutorials where participants, all attending anonymously, could try
their hand at picking locks, creating light graffiti with so-called "light
throwies," and learning about the technology behind ham radios. Among the
speakers at the conference was free-software pioneer Richard Stallman.
Another speaker described a cell phone jammer she created that can block
out cellular signals emanating from a tower. A trio of security
consultants shared their analysis of data intercepted from improperly
configured networks. They found that companies were using protocols that
lacked the proper authentication and sending tax information, trade
secrets, and other sensitive information over their wireless networks.
"Attackers could exploit these vulnerabilities to turn themselves into a
node on the corporate network," said security expert Raven Alder. At that
point, the hacker could initiate a denial-of-service attack or launch a
spam campaign. The emergence of Wi-Fi has only amplified these security
problems, Alder said. Computer scientist Matt Blaze demonstrated how he
and three of his graduate students had used two phones and wiretap
equipment he had purchased on eBay to send out false phone numbers to
defeat eavesdroppers and neutralize wiretap recorders by playing a
low-volume tone through the connection.

An obscure research institute under the Department of Homeland Security is
working to formulate an estimate of the real cost of cyberattacks. Despite
frequent studies by consultancies and an annual FBI report, the actual
extent of the damage brought on by denial-of-service attacks and other
malicious network activity is obfuscated by the fact that the companies
that own the infrastructure typically do not want to reveal the
information. "So much of what we have been hearing about cyberattacks was
just hearsay," said Scott Borg, director of the U.S. Cyber Consequences
Unit (US-CCU). "We found a lot of things people were worried about were
extremely unlikely." US-CCU was created in 2004 with a shoestring budget
to spend four months conducting surveys of the electrical power and health
care industries, with more critical infrastructure areas to be incorporated
later. Now extended into its second one-year contract, the project has
grown much larger in scope than initially planned. So far, US-CCU
is reporting that a devastating attack on the Internet or the power grid is
not likely to occur in the immediate future. To ensure that such a
scenario never materializes, US-CCU has developed a checklist of 478 items
that organizations can address to shore up their portion of the nation's
infrastructure, though it has elicited little response from leaders in the
DHS Cyber Security Division. Through his on-site visits, Borg began to see
that many organizations had adopted best security practices, but that they
contained gaping vulnerabilities nonetheless because they only dealt with
security at the periphery. Cybersecurity is hamstrung by the lack of
adequate tools for measuring the consequences of attacks and the reluctance
of companies to share data, Borg says. He argues for a holistic view of
the costs of cyberattacks that factors in the broad shifts in business
operations that can arise from security breaches. Borg says his next goal
is to develop security tools tailored to individual industries.

Though Sun's June research exhibition was a chance to show off its
bleeding-edge gadgets and projects such as the experimental scientific
computing language Fortress, the company's efforts to replace flawed
floating point math with interval-based programming stole the show. Sun
introduced interval-based programming to its Fortran compilers in 1995, and
later extended interval support in the form of an external library to its
C++ compilers. Sun researcher G. William Walster describes intervals as
ranges of numbers that are guaranteed to contain the true result, and that
can quickly determine whether a candidate solution to an equation is
correct. Though they can handle extremely large numbers, most CPUs carry
only a limited number of digits, so at some point results must be
rounded. Replacing floating point units
(FPU) is still a ways off, Walster says, though he stresses the importance
of continued research to improve the accuracy of scientific and technical
calculations. Walster says researchers "more and more want to use the
speed of digital computers to replace physical experiments that are
difficult or expensive or impossible to conduct, and replace them with
cheap, fast substitutes. The difficulty is, if you know nothing about the
accuracy of your computing, there's a huge risk involved." In Sun's
Fortran and Fortress compilers, intervals are treated like any group of
real numbers or integers. Fortress, which Walster says "is designed to do
for Fortran what Java did for C++," is a natively parallel language, making
it ideally suited for clustered environments. Among the other projects on
display at Sun's exhibition were its Sun SPOT (Small Programmable Object
Technology) sensors that can detect motion, light, and wireless traffic, as
well as its research in technologies to improve digital media on the
Internet.
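The core idea of interval arithmetic, computing with ranges guaranteed to contain the true result rather than with single rounded values, can be sketched in a few lines of Python. This is an illustrative toy, not Sun's Fortran or C++ interval library:

```python
# Minimal interval-arithmetic sketch (illustrative; not Sun's library).
# An interval [lo, hi] brackets the true real-valued result, so a
# candidate answer can be checked for containment.

class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product's bounds come from the extreme endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def contains(self, x):
        return self.lo <= x <= self.hi

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# 0.1 is not exactly representable in binary floating point; an enclosing
# interval still reliably brackets the true value of the computation.
x = Interval(0.09, 0.11)
y = Interval(1.9, 2.1)
result = x * y
print(result.contains(0.2))   # True: the exact product 0.1 * 2.0 lies inside
```

A production implementation would also round the lower bound down and the upper bound up at each step (directed rounding), so the guarantee survives floating-point error; that detail is omitted here.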

Researchers at the National Institute of Standards and Technology have
developed an electromagnetic "trap" for ions that could bring the mass
production of quantum computers closer to reality. The trap is different
from previous traps for electrically charged atoms in that all electrodes
are arranged in a single, horizontal layer. The single layer would make it
easier to scale components and processes in manufacturing. Described in
June in the journal Physical Review Letters, the single-layer trap can hold
a dozen magnesium ions without the onset of a heating problem from changes in
electrode voltage, according to David Wineland, who heads the NIST team.
The researchers are now focused on building more complex structures for the
single-layer traps. Quantum computing researchers hope to use ions as
quantum bits, or qubits, which can represent 0 and 1 simultaneously, to
achieve the enormous calculation speeds of a quantum computer. In theory, problems
that take today's computers hours to solve could be cracked in seconds with
a quantum computer. They could be used to break data encryption codes and
to search large databases quickly, for example.

ICANN executives acknowledged during last month's meeting in Marrakech,
Morocco, that the Internet oversight body has not done enough to help
Africa develop its IT and communications infrastructure, according to the
chairman of the Sudan Internet Society (SIS), a group that promotes the
Internet in Sudan. SIS Chairman Mohamed El Bashir Ahmed also cited
ICANN's failure to involve African representatives in its policy decisions
as a reason for the underdevelopment of the Internet in Africa. "In view
of the diversity of Internet users in Africa, issues such as
multilingualism and multiculturalism have to be resolved," Ahmed said,
adding that these issues can be resolved by increasing African involvement
in policy decisions. ICANN has plans to open an office in Africa to help
coordinate the group's African initiatives. A decision has not yet been
made on where the African office will be located.

Before brain implants can allow paralysis victims to control robotic
limbs, neuroscientists must understand the mechanisms that enable the brain
to know the relative spatial positions of different parts of the body, an
ability known as proprioception. Several projects have taken great strides
in the field of neuroprosthetics: A team of scientists at Brown University
and Cyberkinetics Neurotechnology Systems has enabled a quadriplegic man to
move an onscreen cursor by thought through an electrode array implanted in
his motor cortex, while a Stanford University group has used a
brain-computer interface integrated with the premotor cortex of a
non-paralyzed primate to explore similar neuroprosthetic applications. But
a method must be worked out for neuroprosthetic devices to deliver feedback
to the brain if the devices are to replicate more sophisticated functions,
according to Daofen Chen of the U.S. National Institute of Neurological
Disorders and Stroke. Deeper understanding of sensory input is key to
making interactive brain-machine interfaces a reality. One project
conducted by University of Pittsburgh researchers involves stimulating the
sensory nerves from the limbs of an anesthetized cat via electrodes just
before they enter the spinal column and concurrently recording from sensory
cortex neurons, and then repeating the recording while manipulating the
cat's limbs manually. The pattern of cortical neural activity will be
compared in both experiments to see whether the researchers can imitate the
patterns received in reaction to passive movements with artificial
stimulation. A neurophysiologist based at Northwestern University is
electrically stimulating the region of the primate cortex responsible for
processing proprioception while simultaneously recording neuronal activity
in the motor cortex in the hope that such work will eventually facilitate
the design of stimulation patterns capable of mimicking the brain's own
proprioception signal processing.

A lack of diversity among computer operating systems makes those systems
highly vulnerable to hackers, and a cadre of computer scientists is
attempting to address this problem by cultivating "software diversity"
through the electronic equivalent of genetic mutations. "Every computer
should have its own unique properties," remarks University of New Mexico at
Albuquerque computer scientist Stephanie Forrest. She realized that
buffer-overflow attacks, though simple to mount, require the hacker to know
exactly what part of the computer's memory to assault; Forrest reasoned
that scrambling the way a program employs a computer's memory can thwart
such attacks. The computer scientist pioneered the technique of
memory-space randomization to test her theory, and her method has been
adopted by Linux provider Red Hat and Microsoft, which is prepping a new
Windows operating system, Vista, that uses Forrest's technique. Forrest
has also experimented with another method involving the replacement of the
"translator" program responsible for interpreting instruction sets with a
specially modified program that encrypts the sets with a randomized
encoding key. Studies have shown that randomization can be foiled by
"brute force" attacks, but absolute immunity may not be a prerequisite for
survival. Security expert Dan Geer thinks a small number of diverse
computers connected to the Internet would be enough to fragment the
operating system monoculture and make systems immune to many digital
attacks. Gabriel Barrantes, a researcher who collaborated with Forrest on
the instruction set encryption randomization experiment, believes blending
distinct randomization techniques yields the best kind of protection.
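The instruction-set randomization idea Forrest and Barrantes experimented with can be shown with a toy model: program bytes are XOR-encoded with a per-process random key, and the "translator" decodes them just before execution, so injected code that was never encoded decodes to gibberish. The XOR scheme below is a deliberate simplification for illustration, not the researchers' actual encoding:

```python
import os

# Toy instruction-set randomization (illustrative simplification).
# Legitimate code is encoded with a secret per-process key at load time;
# the translator XOR-decodes bytes just before running them, so attacker
# code that was never encoded comes out scrambled.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                      # per-process randomized key

legit = b"MOV AX, 1; RET"                 # stand-in for real machine code
loaded = xor_bytes(legit, key)            # encoded when the program loads
assert xor_bytes(loaded, key) == legit    # translator recovers it exactly

injected = b"JMP shellcode"               # attacker's bytes, never encoded
garbled = xor_bytes(injected, key)        # "decoding" scrambles them instead
assert garbled != injected                # runs as noise, not as an attack
```

This also shows why the article notes that brute force can defeat randomization: an attacker who can retry many times may eventually recover or guess the key, so diversity raises the cost of attack rather than eliminating it.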