ACM's Special Interest Group on Algorithms and Computing Theory (SIGACT)
announced that Alexander A. Razborov and Steven Rudich, two computer
scientists who developed a rare finding that addresses the P vs. NP
Problem, will receive the 2007 Gödel Prize for outstanding papers in
theoretical computer science at the ACM Symposium on Theory of Computing,
which takes place June 11-13, 2007, in San Diego. P vs. NP is a problem
fundamental to computer and network security and to many optimization
techniques. For years, the questions about the limits of proof and
computation raised by P vs. NP have confounded computer scientists.
These questions affect complex mathematical problems common in creating
security solutions for ATM cards, computer passwords, and electronic
commerce. In a paper titled "Natural Proofs," originally presented at the
ACM Symposium on Theory of Computing in 1994, Razborov and Rudich addressed
what is widely considered to be the most important question in computing
theory, which is also one of the seven $1 million Millennium Prize
Problems posed by the Clay Mathematics Institute in Cambridge, Mass. The
question asks: if the solution to a problem can be checked easily, is the
problem also easy to solve?
Razborov and Rudich proved that no "natural proof" can establish that
certain computational problems used in cryptography are hard to solve;
although those problems are widely believed to be unbreakable, no natural
proof can certify their security. Such cryptographic methods are critical
to electronic
commerce. Razborov is the leading researcher at the Russian Academy of
Sciences' Steklov Mathematical Institute in Moscow, and Rudich is an
associate professor of computer science at Carnegie Mellon University.
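The checking-versus-solving asymmetry at the heart of P vs. NP can be illustrated with subset sum, a classic NP problem (this example is illustrative and not drawn from the prize-winning paper): verifying a proposed answer takes linear time, while the only known general ways of finding one examine exponentially many candidates.

```python
from itertools import combinations

def verify(numbers, target, subset_indices):
    # Checking a proposed solution is easy: sum the chosen
    # elements and compare -- linear in the size of the input.
    return sum(numbers[i] for i in subset_indices) == target

def solve(numbers, target):
    # Finding a solution by brute force tries up to 2^n subsets.
    # No polynomial-time algorithm is known for this in general;
    # whether one must exist is the kind of question P vs. NP asks.
    n = len(numbers)
    for r in range(n + 1):
        for indices in combinations(range(n), r):
            if verify(numbers, target, indices):
                return indices
    return None

nums = [3, 34, 4, 12, 5, 2]
answer = solve(nums, 9)   # finds indices (2, 4), since 4 + 5 == 9
```

Cryptography leans on the "solve" side staying hard: an attacker can easily verify a guessed key but, one hopes, cannot find it efficiently.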

Cybersecurity experts contend that nations need to prepare for the
possibility of inter-governmental cyber-war. This declaration came on the
heels of a "distributed denial of service" (DDOS) attack on Estonian
government Web sites that Estonia alleged came from Russian authorities.
This incident demonstrated that a government "definitely" could mount an
attack that would cut off a country's essential services by disabling its
computer systems, experts say. The attack would be carried out by a
"botnet," an enormous army of commandeered "zombie" computers that
overwhelm a selected Web site at the command of a master computer. Experts
note that this strategy makes it difficult to distinguish the attack's
originators from the botnet's unaware victims. Still, there are ways of
stymieing the attacks, such as a router that analyzes Web site traffic
patterns and can steer suspicious requests into a "cyber black hole." Ihab
Sharaim, chief security officer at MarkMonitor, says, "The U.S. Department
of Defense is definitely preparing for something like this." Water, gas,
and other essential services are vulnerable to such attacks, as they are
dependent on IT, says Tier-3 CEO Peter Wollacott. "Whether it's a group of
university students setting up a botnet or someone more ideologically
motivated, all those possibilities are there," he says.
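The traffic-pattern analysis described above can be sketched, very roughly, as a per-source rate threshold: any source issuing far more requests within a time window than ordinary visitors do gets flagged for diversion. The function and threshold below are hypothetical simplifications; real mitigation routers use much richer signals, and a botnet's strength is precisely that it spreads load across thousands of low-rate sources that such a simple filter would miss.

```python
from collections import Counter

def flag_suspicious(requests, now, window, max_requests):
    # requests: iterable of (source_ip, timestamp) pairs.
    # Count requests per source that fall inside the recent window...
    counts = Counter(ip for ip, ts in requests if now - ts <= window)
    # ...and flag any source exceeding the per-window threshold.
    return {ip for ip, n in counts.items() if n > max_requests}

# One source firing 500 requests in a minute stands out against
# ordinary visitors making a handful each (addresses are examples).
log = [("203.0.113.7", t) for t in range(500)]
log += [("198.51.100.2", 10), ("198.51.100.3", 30)]
suspects = flag_suspicious(log, now=60, window=60, max_requests=100)
```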

The House Judiciary Committee's Subcommittee on Courts, the Internet, and
Intellectual Property approved sending the Patent Reform Act to the full
committee, despite concerns over portions of the bill. Two portions of the
bill, one that would allow a nearly unlimited amount of time for patents to
be challenged after being granted and another that would limit the damages
a patent holder can collect when infringing technology is part of a larger
product, were scrutinized by several subcommittee members. The second
controversial section would base damages on the number of patents within a
product, instead of the total value of the infringing product, as is the
current practice. Many large technology vendors support patent reform,
arguing it is too easy for patent holders to collect disproportionately
large damages when an IT product contains a small patent infringement.
However, such patent reform has been strongly opposed by small inventors,
pharmaceutical companies, and some small technology vendors who say the
bill will allow large competitors to infringe patents without fear of
significant penalties. The legislation would also change the U.S. patent
process from a first-to-invent system, in which priority can be difficult
to prove, to a first-to-file system. Several Republicans on the subcommittee suggested
the bill weakens the value of U.S. patents. "None of us wants a system
that rewards legal gamesmanship," said Rep. Howard Coble (R-N.C.). "But in
our zeal to weed out bad lawsuits, let's not proceed on the assumption that
every patent holder who wants to license an invention or enforce his or her
property is ill-intentioned." Supporters of the bill argue that it is
necessary to limit multimillion-dollar patent lawsuit awards, which they
say stifle innovation.

Women account for only three out of every 10 computer scientists, system
analysts, computer support specialists, and operations research analysts,
according to the 2005 Current Population Survey from the Bureau of Labor
Statistics, which estimates that the percentage of women in such fields has
dropped 7 percent at a time when the technology workforce is at an all-time
high. Meanwhile, a University of California, Los Angeles study concluded
that female undergraduate enrollment in computer science is at its lowest
point since the 1970s. Several reasons exist for why women do not hold a
more prominent role in computer technology, writes Jim Lanzalotto of
technology personnel placement firm Yoh. He cites women who start careers
in technology but choose a different career path later, or those who leave
the workforce for various reasons such as maternity leave, but find a steep
learning curve and few opportunities for advancement upon returning. Windy
Warner, an executive coach specializing in working with IT executives and
professionals, says women also have a difficult time landing tenured
positions, and directorships or CIO positions. "Women create part of the
problem themselves because we tend not to be as confident in our abilities
as men are," Warner says. "We don't let management know what we have
contributed, we don't ask for promotions and raises as easily as men do,
and we don't assert ourselves into leadership positions on teams as much as
we could." Lanzalotto says that a work/family balance is the primary
desire of women in IT, as with women in many industries, and managers who
create flexible schedules can ensure they retain experienced, knowledgeable
workers of both sexes. Training can help workers who temporarily leave the
workforce catch up faster, and flexible hours can help employees meet
conflicting demands without decreasing the number of hours they work. For
information on ACM's Committee on Women in Computing, see
http://women.acm.org

Researchers at Hewlett-Packard's lab in Bristol, U.K., are developing
software that will let people use their portable devices as platforms for
GPS-enabled games and tours. HP Labs recently launched a site that
provides location-based games and city walking tours. The site also offers
the opportunity to modify some of the existing games and tours, or even
create a new application from scratch. The HP project uses a concept known
as augmented reality, or combining physical data with virtual information.
As location and guidance technology improves and PDAs become more powerful,
numerous augmented reality programs are being developed. Nokia is working
on a project that will help people navigate new areas. The user simply has
to take a picture of a landmark, and the program uses GPS coordinates to
create a hyperlink with the image. Phil Stenton, research manager for the
HP project, says this type of technology will be useful for such things as
entertaining out-of-town guests during work hours. University of Bristol
computer science professor Cliff Randal says HP is making an important
contribution to this type of research. However, not everyone is completely
impressed with HP's initial offering. Georgia Institute of Technology
computing professor Blair McIntyre, who has developed similar software for
local tours, says HP is not creating a truly immersive augmented-reality
experience. Stenton admits that the program has some limitations, but
notes that future versions could include software for working with
Bluetooth wireless devices, infrared sensor data, in-phone accelerometers,
and possibly even heart-rate monitors. Stenton believes that future
generations of this technology will allow people to create programs such as
exercise routines that can be shared with friends.

Universities, other research institutions, and industry will have three
times as much time to use U.S. Department of Energy supercomputers for
projects next year as this year. The DOE's Office of Science is likely
to award as many as 250 million processor hours in supercomputing and data
storage resources for 2008, up from 95 million processor hours of computing
time used for 45 projects in 2007. DOE is offering outside researchers an
opportunity to take advantage of its powerful supercomputing resources
under its Innovative and Novel Computational Impact on Theory and
Experiment (INCITE) program. The Leadership Class Cray supercomputers at
Oak Ridge National Laboratory, the IBM Blue Gene supercomputer at Argonne
National Laboratory, the Cray XT4 supercomputer at Lawrence Berkeley
National Laboratory, and the Hewlett-Packard massively parallel system at
the Pacific Northwest National Laboratory will be available to researchers.
"The demand for access to INCITE supercomputing resources has far exceeded
what is available even though total allocations have soared from just 3
million hours in 2004 to 250 million hours next year," says Dr. Raymond L.
Orbach, DOE Under Secretary for Science. "The breadth of proposals--from
industry, academia, and national labs--illustrates both the demand for such
resources and the contributions computational science is making to our
economic and scientific competitiveness."

University of Florida computer science professor Paul Fishwick is trying
to dispel the stereotype that computer science is not a fun, hands-on type
of major by having his students build physical, visual artifacts to explain
computational concepts. The building process is used to visualize code,
which is an essential part of video game design and could soon be a
significant portion of all computer software design. Video game designers
need to visualize what their code will ultimately look like so they can
create a blend of code and art. Fishwick believes that game design,
computer software, and even mathematics often have similar problems due to
an inability to visualize concepts, and he has created "aesthetic
computing" to help students overcome these problems. Students in
Fishwick's class started with an understanding of computing concepts, but
had to develop a technique to visually represent the concepts so that
viewers, with varying backgrounds of computer knowledge, could understand
them. Such aesthetic computing practices continue to expand beyond the
world of video games, as more educators start to use such methods to
illustrate abstract mathematical and computational concepts, and as
software designers expand beyond traditional programming and create new
ways of exploring the virtual world. An example of this expansion is
Second Life, an online social networking program that allows users to
interact with virtual representations of each other.

The Scripps Institution of Oceanography at the University of California,
San Diego is working with UCSD's division of the California Institute for
Telecommunications and Information Technology and the San Diego
Supercomputer Center to develop a digital infrastructure that
will allow ocean observatories to collect, process, and transmit data
nonstop. UC San Diego was selected by Joint Oceanographic Institutions
(JOI) to design and build an information technology and networking system
for the Ocean Observatories Initiative. The primary cyberinfrastructure
award is for $29 million over six years, but total funding may reach more
than $42 million during the course of the planned 11-year project. The
cyberinfrastructure will send real-time data streams at rates as fast as
one gigabit per second from a variety of ocean sensors and instruments. The
data will be available to every researcher, teacher, or citizen in real
time through dedicated, high-speed Internet links. Scientists will also be
able to operate robots on the ocean floor from their campus laboratories,
and many functions will be automatic and require no human intervention.
Steve Bohlen, president of JOI, a coalition of 31 premier oceanographic
research institutions that serves the U.S. scientific community by managing
large-scale, global research programs in marine geology, geophysics and
oceanography, said, "As a whole, this undertaking will allow scientists,
students, and citizens to observe and compare ocean phenomena on a scale
that has not been possible until now."

Last June, the Computer Research Association and the National Institutes
of Health hosted a joint workshop that focused on synergies between the two
fields that could accelerate research in both. The workshop attendees,
primarily leaders in computing and biomedicine along with NIH program
directors, addressed the challenges of the proposed collaboration by
creating a list of recommendations and actions to guide the computing and
NIH communities. The
workshop participants were able to agree on six recommendations, now
available in a 14-page report. The first recommendation is that the
National Science Foundation and the Department of Energy's Office of
Science should support the collaborative effort by creating small grants
that require conceptual proof-of-principle but no preliminary results,
create or expand programs to fund computing and biomedical research, and
establish a cross-disciplinary, multi-agency work group to identify,
explore, and recommend individual agency opportunities and define and
coordinate joint agency programs. The second recommendation suggests that
federal agencies enhance support for "training at the interface," including
summer schools for students, post-doctorates, and professors, as well as
increased emphasis on undergraduate and graduate training programs. Third,
the NIH should create a cross-institute software program that creates and
maintains high-quality, well-engineered biomedical computing software.
Fourth, the NIH should fund several large, distributed transformational
centers to act as "expeditions" to the future. Fifth, the NIH should
invest in a range of computing research technologies that would benefit
current and future biomedical research and healthcare needs. The final
recommendation suggests that the NIH, NSF, DOE, and CRA create a joint
"Interface Task Force," possibly utilizing the Computing Community
Consortium, to involve the community, and recommend specific ways to
support advances at the computing-biomedicine interface.

The world's first commercial holographic storage system is set for release
this fall, and will be able to store the equivalent of 64 DVD movies on a
single disc about the size of a CD. InPhase Technologies has spent 13
years developing the materials, systems, and processes, and its first
products will be a 600 GB write-once disc and drive. Holographic storage
has been an objective since the 1960s, but it has taken more than 40 years
for technology to reach the point where it is viable. Despite the
increasing capacities of new generations of optical discs--CD-ROMs can
store about 700 MB, DVDs about 18 GB, and Blu-ray and HD-DVD can each store
about 25 GB--many archival systems still use high-capacity magnetic
tapes, even though the tapes are expensive and difficult to handle.
Holographic storage could provide an alternative. Although the first
holographic products are too expensive for the mass market, potential users
could include banks, libraries, government agencies, and large
corporations. There is no guarantee that holographic recording will
replace magnetic tapes either, as tape technology is still evolving. Last
year, IBM and Fuji Photo Film proved that it is possible to pack data onto
a magnetic tape at a density of 6.67 billion bits per square inch, 15 times
the data density of a standard tape. Magnetic hard drive capacity is also
increasing due to perpendicular recording. Holographic storage is also
unlikely to become the standard for video, as Microsoft Chairman Bill Gates
believes that Blu-ray and HD-DVD are the final generation of physical video
storage, with future generations of video being available for on demand
download. Holographic storage achieves such massive capacity because it
uses a 3D process that stores data throughout the entire recording layer,
not just on the surface of the medium.
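The capacity figures quoted above can be sanity-checked with quick arithmetic (the numbers below simply restate the article's figures):

```python
# A 600 GB disc holding "the equivalent of 64 DVD movies"
disc_gb = 600
movies = 64
gb_per_movie = disc_gb / movies          # ~9.4 GB per movie, roughly
                                         # a dual-layer DVD's worth

# IBM/Fuji tape demonstration: 6.67 billion bits per square inch,
# described as 15 times the density of a standard tape.
demo_density = 6.67e9                    # bits per square inch
standard_density = demo_density / 15     # ~4.4e8 bits per square inch
```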

Demands that ethical laws be passed to restrict the potential for robots
to harm humans are growing as these machines enter homes and offices and
advance to the point where they can make decisions for themselves. In
certain Asian countries where birthrates are low and immigration is
limited, robots are viewed as essential for taking care of children and the
elderly, and relieving labor shortages by replacing people in low-skill
jobs. It is anticipated that robots will play a greater and greater role
in health care, but roboticist Gerard Lacey of Trinity College Dublin's School of Computer
Science and Statistics maintains that "the social contact a carer gives can
never be replaced by a machine." Continued development of robotics
technology will probably make machines capable of self-learning, which
could make their behavior increasingly difficult to predict. South Korea
has drafted a Robot Ethics Charter that incorporates famed science fiction
author Isaac Asimov's three laws for governing robots' behavior so that
damage to humans can be avoided. Meanwhile, a conference of robo-ethicists
in Rome raised a number of issues, such as whether "sexbots" resembling
children should be legally permitted, and the possible use of robots that
make their own decisions by the military establishment. There are concerns
that the reduction of battlefield casualties thanks to robots will
encourage more aggressive warfare by military planners.

Microsoft Chairman Bill Gates told attendees at Microsoft's WinHEC
hardware engineering conference that the future of the PC industry would
revolve around 64-bit computing, creating more humanistic interfaces,
unified communications, and Web services. Gates said that PCs with
increasing processing power and rich Web applications would work together
using the Internet to deliver services on all types of mobile or stationary
devices. Multicore processors running 64-bit applications will make home
computers faster, expand system memory, and be available on all types of
computing devices, Gates said. Gates also said the PC platform will expand
beyond computing devices and become embedded in other devices such as
refrigerators and toys. Among Microsoft's projects for the future is an
ultra-mobile PC that will create a more mobile way of being constantly
connected to the Internet. Microsoft also plans to continue developing
Web-based services that synchronize with information on personal PCs.
Microsoft's chief research and strategy officer Craig Mundie said it only
makes sense to have Web-based servers sharing computational tasks with
devices to take advantage of the additional power in 64-bit processing.
Mundie also outlined some of the challenges developers will face as the
majority of the world's population starts to use cheaper mobile phones and
handheld devices to access the Internet instead of PCs, including
challenges in constructing loosely coupled applications that can
communicate with software on the Web, or in other devices through
peer-to-peer networks.

Ian Appelbaum and Biqin Huang of the University of Delaware in Newark and
Douwe Monsma of Cambridge NanoTech in Massachusetts have demonstrated that
spin can be injected into silicon and measured using a silicon-based
device, which experts say is a key advancement toward developing logic
devices based on spintronics. Spin is a quantum property of electrons very
similar to magnetism, and individual electrons have a spin that is either
"up" or "down." In a typical electric current, electrons have both kinds
of spin, but passing the current through a ferromagnet will filter one spin
or the other out. Electrons with spin oriented opposite to the axis of the
magnet will be slowed down and scattered while electrons with a matching
orientation are drawn through. This process creates an electric current
made up mostly of electrons with a uniform spin direction, which can be
measured by another magnet. The researchers passed highly energetic
electrons through thin ferromagnet films 5 nanometers thick deposited on
top of a 10-micron thin piece of silicon. The extremely thin layers
coupled with high-energy electrons allowed the electrons to move through
the silicon without losing their spin, making it possible to inject an
electric current with a particular spin through the silicon. The
researchers also proved it is possible to change the spin in the silicon by
subjecting the electrons to a magnetic field. University at Buffalo
physicist Igor Zutic said the next step is to show that the devices can
work at high temperatures.

IBM named Mark N. Wegman an IBM Fellow during a ceremony earlier in the
week in San Diego. Wegman, 57, who was honored for his inventions
involving data compression tools, software algorithms, and compiler
optimization, was one of six IBM scientists to receive the title. IBM
Fellow is the highest honor that an IBM technologist can receive, and the
title has only been bestowed upon 199 employees over the past 44 years.
Sixty-seven active employees hold the title. "IBM has been very successful
over the years because of a very strong technical program that has allowed
the company time after time to reinvent itself," says Paul M. Horn, senior
vice president and director of IBM Research. "The Fellows are the people
who have demonstrated a level of expertise and competence that allows them
to fundamentally engineer the IBM company." Wegman joined IBM in 1975, and
his current focus at the Hawthorne, N.Y., lab of the Thomas J. Watson
Research Center is to improve computer programming teams. Another
researcher at the Westchester County facility, mathematician Brenda L.
Dietrich, became the 10th female IBM Fellow.

Director of Carnegie Mellon University's Robotics Institute Matt Mason
says in an interview that the institute pulls in more than $50 million in
sponsored research annually, and that its funding doubles every six or
seven years. He explains that science fiction is "a great reference" for
roboticists because of the visionary perspective the truly great authors
have, and points out that the institute's alumni are "very aggressive" in
starting spinoffs such as the Center for Innovative Robotics and the
Quality of Life Technology Center, which concentrates on assistive robots
for such applications as elder care, rehab, and health monitoring. Mason
says the emergence of robot caregivers is an inevitability; robot
companionship also offers a host of possibilities, although he acknowledges
that Americans view such a concept with skepticism out of fear that it
could promote social disconnection. Especially exciting to him is the
development of machine learning techniques in robots; the institute is
also developing robots that can fold paper into origami shapes as a
project to refine manual manipulation. Mason calls the origami robots "a great
challenge task that can inspire and challenge researchers for the next 50
years." Mason cites a lesson that is constantly reiterated in the field of
robotics: Virtually any action or function is tremendously more difficult
to mimic mechanically than people think.

IBM is holding an IBM Academy of Technology conference on agile
programming this week at its Almaden Research Center in San Jose, Calif.,
to discuss what is and what is not working, says IBM's Scott Ambler. He
says agile programming methods have been used in some places at IBM for
five years, but "we haven't taken the opportunity to actually get together
and coalesce [around] what we're doing." Agile's flexibility can make
programming more cost-effective and can lead to better code in a shorter
amount of time. "One of the things people like about agile methods is that
they enable you to do things rapidly, and if you don't get them right this
month, they will give you the right thing next month," said University of
Southern California software engineering professor Barry Boehm, a keynote
speaker at the event. Despite its advantages, agile programming has had to
face a number of challenges, Ambler says, including the mistaken belief by
some adherents that no requirements planning or architecture should be
created before starting a project, and industry perceptions that a database
schema cannot be changed. Team size is also an obstacle to agile
programming. An ideal agile team has seven to 10
developers, but some programming projects can have as many as 4,000
developers. Mike Cohn, co-founder of Mountain Goat Software and a founder
of the Agile Alliance, says the transition to agile programming should be
treated like an individual project and include the formation of a
transition team.

ICANN Chairman Vint Cerf is encouraging interested applicants to apply for
one of the nine open positions at ICANN, but Cerf cannot help but wonder
whether the novelty of the Internet is subsiding. The application deadline
for the open positions--May 18--is fast approaching, and Cerf has tried to
generate interest in the positions with a video he posted to YouTube.
Former ICANN board member Karl Auerbach and Internet Governance Project
partner Milton Mueller are criticizing the "nominating committee" process
that ICANN uses to make board selections, with Mueller calling the process
"deliberately non-transparent." Auerbach asserts that the committee not
only nominates board members, but in fact chooses the candidates. "ICANN
has become yet another regulatory body that has been captured by those it
is to regulate," says Auerbach. "But while most regulatory bodies are
captured over decades, ICANN did it in record time, even by Internet
standards." Mueller believes ICANN should go back to holding public
elections to select the candidates, as it did in 2000, when Auerbach was
elected. The current process is purposely set up to eliminate
controversial figures from being nominated, says Mueller. But Cerf
counters that public elections have their own pitfalls, claiming that the
nominating committee is largely responsible for the large number of
non-American members on ICANN's board.

New York University instructor and consultant Adam Greenfield articulates
his concerns that ubiquitous computing--the proliferation of wireless
computers everywhere--could have dramatically negative ramifications for
privacy and civil liberties in his book, "Everyware: The Dawning Age of
Ubiquitous Computing." "The challenge now is to begin thinking about how
we can mold that emergence [of ubiquitous computing] to suit our older
prerogatives of personal agency, civil liberty, and simple sanity,"
Greenfield writes in his book. In an interview, the author notes how he
was trained to be skeptical of assertions that the comfort, security, and
convenience of new technologies will more than make up for the associated
losses in personal autonomy, privacy, or agency. Greenfield calls his book
"just one part of a broader movement toward user-centered design," and he is
hopeful that the text's inclusion in the syllabi of some college
engineering programs will encourage more critical perception of ubiquitous
computing. He teaches an "urban computing" course that examines how
ubiquitous computing will probably affect the city and metropolitan life,
and from such studies he has concluded that most personal ubiquitous
technologies are causing people to withdraw from public engagement, which
could carry serious consequences for big North American cities, and anyone
interested in civic life in particular. Greenfield observes that many
people interested in engineering are lacking in empathy, perhaps by
necessity, but that the increasingly social nature of technology calls for
more empathetic engineers.