ACM's special interest group on advanced multimedia applications will now
be headed by Klara Nahrstedt, a computer science professor at the
University of Illinois at Urbana-Champaign (UIUC). In her new position,
Nahrstedt plans to press for multimedia and networking technology
improvements for the humanities, arts, sciences, and medicine. "We are
living in exciting times when digital video and audio are becoming
available via different platforms, in multiple sizes and shapes," Nahrstedt
says. "I plan to energize the multimedia community to make the multimedia
technologies pervasive across many boundaries." Multimedia technologies have
yet to reach that level of ubiquity. Nahrstedt also serves as the head
of UIUC's Multimedia Operating System and Networking Group, a project that
involves the development of tele-immersive, 3D multi-camera room
environments that are able to facilitate distributed physical activities
such as physical therapy and entertainment.

Password protecting a wireless network may not provide enough security for
home networks and is definitely insufficient for larger organizations'
networks, according to a new study by the A. James Clark School of Engineering
at the University of Maryland. Wireless users who routinely look for access
to any network available create a significant security risk as these
wireless "parasites" can expose the network and all of the computers on it
to a variety of security breaches. The problem gets even worse when
someone authorized to use a wireless network adds an unauthorized wireless
signal to increase the main network's signal strength, as these access
points are particularly vulnerable and are often completely unprotected.
Frequently, employees will set up their own wireless network, linked to the
official network, to boost signal strength in their office, creating an
unmanaged wireless access point. "If these secondary connections are not
secure, they open up the entire network to trouble," says UM assistant
professor of mechanical engineering and leader of the study Michel Cukier.
"Unsecured connections are an open invitation to hackers seeking access to
vulnerable computers." Cukier suggests network administrators limit signal
coverage and disable Service Set IDentifier broadcasting so it cannot be
detected outside the office or home. Additionally, Cukier suggests using
either Wired Equivalent Privacy (WEP) or Wi-Fi Protected Access (WPA)
encryption and regularly changing the encryption key.
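Cukier's recommendations translate directly into access-point settings. A minimal sketch, assuming a Linux access point running hostapd (the interface name, SSID, and passphrase below are placeholders, not from the article):

```ini
# /etc/hostapd/hostapd.conf -- illustrative fragment only
interface=wlan0
ssid=HomeOffice

# Stop broadcasting the SSID so the network is not advertised
ignore_broadcast_ssid=1

# Use WPA2-PSK with CCMP (AES) encryption
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP

# Rotate this passphrase regularly, per Cukier's advice
wpa_passphrase=change-me-regularly
```

Of the two encryption options the article mentions, WPA is the stronger choice; WEP is widely considered breakable and is best avoided where WPA is available.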

Several software and hardware techniques have been developed to allow
iPhone users to recalibrate the device to work on any network instead of
exclusively on AT&T's. George Hotz, a 17-year-old from Glen Rock, N.J.,
spent about 500 hours unlocking two iPhones, which can now operate on any
network thanks to a little soldering and some software tools. "This was
about opening up the device for everyone," says Hotz. Hotz described his
technique in detail on his Web site in the hopes that someone may be able
to simplify the process. Meanwhile, a group called iPhoneSimFree has
developed a software update that allows users to install the software and
switch the phone's SIM card with one from another carrier to unlock the
phone. The group says it has been working on the software since June, and
plans to sell it to anyone interested in unlocking large numbers of
iPhones, though a price has not been announced. Another company called
Bladox, based in the Czech Republic, recently started selling a device
called Turbo SIM that allows users to attach another carrier's SIM card and
insert it into the iPhone, tricking the phone into thinking it is running on
the AT&T network. Last fall, the Librarian of Congress issued
an exemption to the Digital Millennium Copyright Act that allows
individuals to unlock their cell phones, but the ruling does not apply to
companies and individuals such as Hotz who distribute or sell unlocking
tools and techniques. AT&T and Apple could sue such distributors, arguing
that people sharing modifications to iPhones are interfering with the
business relationship between Apple, AT&T, and their customers.

NASA researchers have successfully tested a series of algorithms and
software programs, known as "Wavefront Sensing and Controls," that will
control 19 mirrors in the James Webb Space Telescope so all the mirrors act
as a single, highly sensitive telescope. After launching in 2013, the
mirrors aboard the telescope will bring light from the universe into focus.
The software will calculate the optimum position for the 18 primary
mirrors and the secondary mirror. "It's critical that all 18 mirror
segments be aligned in position so that they act as one smooth surface, and
the secondary mirror be placed exactly right," says NASA systems engineer
Bill Hayden. "This will allow scientists to clearly focus on very dim
objects that we can't see now." The telescope works by taking a digital
picture of a star. The image is then processed through mathematical
algorithms to calculate the mirror adjustments needed to focus the image.
NASA says that when properly aligned, the mirrors will allow the Webb
Telescope to capture dim light from objects at the edges of space and time
with extraordinarily sharp clarity.
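The take-a-picture, compute-adjustments loop described above can be sketched as a simple hill-climbing optimization. This is an illustration only, built on a toy sharpness metric; it is not the actual Wavefront Sensing and Controls software:

```python
import random

def sharpness(positions):
    # Toy merit function: the simulated image is sharpest (value 0)
    # when every mirror segment sits at its ideal position, here 0.0.
    return -sum(p * p for p in positions)

def align(positions, step=0.05, iterations=500):
    """Hill-climbing sketch: nudge one segment at a time and keep
    the move only if the simulated image gets sharper."""
    best = sharpness(positions)
    for _ in range(iterations):
        i = random.randrange(len(positions))
        delta = random.choice((-step, step))
        positions[i] += delta
        trial = sharpness(positions)
        if trial > best:
            best = trial           # sharper image: keep the nudge
        else:
            positions[i] -= delta  # blurrier: undo it
    return positions

# 18 primary segments plus the secondary mirror, randomly misaligned
mirrors = [random.uniform(-1.0, 1.0) for _ in range(19)]
initial = sharpness(mirrors)
aligned = align(list(mirrors))
```

The real algorithms infer precise segment corrections from a star image rather than making blind trial moves; the sketch only captures the measure-then-adjust feedback structure.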

University of Illinois professor Todd Coleman is working to understand and
mathematically model dynamic sensory information, or brain activity.
Coleman hopes that mathematical models of brain activity could be tested
and eventually replicated in engineered systems such as computers or other
devices designed to operate in a similar fashion. "Not only is it
interesting for pure science, but it has practical applications," Coleman
says. Coleman's research could also lead to brain-controlled products such
as video games or prosthetic devices for people with physical disabilities.
To examine how the brain functions in set situations, Coleman uses
electroencephalography (EEG) to capture electrical signals from volunteers
as they perform computer tasks. Coleman says finding the brain's mode of
operation is difficult, even on simple, known tasks, because brain activity
is dynamic and brain structure is changeable. Coleman says researchers can
investigate individual systems by using a reductionist approach, changing one
variable at a time and observing the result.
By collecting enough individual results, researchers can develop a larger
picture. Coleman's interest in computational neuroscience started in
graduate school at MIT when friends urged him to apply his communications
research to bioscience. In addition to his neuroscience research, Coleman
also works on developing techniques to improve communication methods by
making them simpler and more reliable.

By hacking into a nuclear power station, IBM researcher Scott Lunsford
demonstrated to the plant's initially skeptical owners exactly how
vulnerable their supervisory control and data acquisition (SCADA) software
was to attack. SCADA systems are employed nationwide to manage
infrastructure such as natural gas and oil pipelines, water filtration, and
trains. Moreover, these systems are increasingly linked to the Internet,
exposing a large swath of national infrastructure to any hacker
with a laptop. Tipping Point security researcher Ganesh Devarajan has
notified SCADA software manufacturers about the weaknesses he has found,
adding that though the bugs are simple, they are perilous. One such
vulnerability lets hackers insert their own commands, including the injection
of false data. Still, the overwhelming complexity of
critical infrastructure systems may be preventing criminals from
controlling SCADA systems. However, over the past two years, threats have
come in from hackers demanding ransom and claiming to have broken into
SCADA systems, says Alan Paller of the SANS Institute. The dearth of
security features in SCADA systems can be attributed to their age, as most
were created before infrastructure systems were linked to the Internet. In
addition, many SCADA software developers fail to provide security patches,
or make it hard to install such patches. Jim Christy of the Department of
Defense believes SCADA systems are in need of regulation by the government
so that changes are made to increase security to at least a minimum
standard.
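A common root cause of the command-insertion flaws Devarajan describes is a front end that forwards field messages to controllers without checking them. A hedged sketch of a whitelist-style check (the message format and command names are invented for illustration, not drawn from any real SCADA product):

```python
# Hypothetical SCADA front end: accept only known commands with
# plausible values, instead of forwarding raw field messages.
ALLOWED = {
    "SET_VALVE": lambda v: 0 <= v <= 100,  # percent open
    "READ_FLOW": lambda v: v == 0,         # no argument expected
}

def validate(message):
    """Return (command, value) if the message passes the whitelist,
    otherwise raise ValueError instead of passing it downstream."""
    try:
        command, raw_value = message.split(":", 1)
        value = int(raw_value)
    except ValueError:
        raise ValueError("malformed message: %r" % message)
    check = ALLOWED.get(command)
    if check is None or not check(value):
        raise ValueError("rejected message: %r" % message)
    return command, value

print(validate("SET_VALVE:42"))  # accepted and parsed
```

Without such a check, any injected string reaches the controller, which is exactly the false-data scenario described above.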

Yale Fan won the 2007 Davidson Fellow Laureate for his quantum computing
research. Fan, a 15-year-old sophomore at Catlin Gabel, combined binary
algorithms to boost the processing speeds of next-generation computers. He
received a $50,000 scholarship for his work, and only four other students
in the United States were awarded the fellowship in early August. Fan was
mentored by Marek Perkowski, an expert in logic and quantum computing who
is a professor of electrical and computer engineering at Portland State
University. Fan met Perkowski two years ago in an effort to find an
internship that would assist him with his eighth grade science fair
project. "I was drawn in by quantum computing, figuring I could learn some
physics in the process," says Fan. His skills prompted Perkowski to invite him
to graduate-level seminars, where Fan eventually presented his own work to the
students. Fan, who volunteered at PSU's
robotics lab this summer and has built a robotic arm, won a third-place
grand award in computer science at the Intel International Science and
Engineering Fair.

The University of Maryland, Baltimore County (UMBC) will create a
high-performance computational test laboratory based on the Cell Broadband
Engine (Cell/B.E.), as a result of a partnership with IBM. Supercomputing
research in aerospace and defense, financial services, medical imaging, and
weather and climate change prediction will be the focus of the Multicore
Computing Center (MC2). Cell processors can act as engines for image and
video-intensive computing tasks such as virtual reality, simulations, and
imaging; and also have applications for building very complex physics-based
computer models, and for bringing high-definition TV and high-speed video
to wireless devices. "Cell processors are groups of eight very fast,
independent but simple PCs with their own tiny memory, all on a single chip,
each with its own leader," says Milt Halem, a computer science professor at
UMBC who will serve as director of MC2. Researchers will have to find a
way to make all the chips work efficiently in parallel. "It's like a
distributed orchestra with 224 musicians and 28 conductors connected with
headphones trying to play Beethoven's Fifth Symphony together," explains
Halem. MC2 is scheduled to be operational this fall.
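The coordination Halem's orchestra metaphor describes, many simple workers each taking a share of the data while a conductor combines the results, can be sketched with a worker pool. This uses Python threads as a stand-in for a Cell chip's eight cores and illustrates only the pattern, not Cell programming itself:

```python
from multiprocessing.dummy import Pool  # thread pool standing in for the cores

def partial_sum(chunk):
    # Each "musician" (worker) handles its own slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=8):
    """Split the data into one chunk per worker, compute partial
    results in parallel, then combine them (the conductor's job)."""
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

result = parallel_sum_of_squares(list(range(1000)))
```

The hard part MC2's researchers face is the same one hidden in this sketch: partitioning real workloads so every worker stays busy and the combining step does not become a bottleneck.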

Argonne National Laboratory computer scientists and University of Chicago
economists worked together at the Institute on Computational Economics
conference to bridge the gap between the two fields and teach young
economists how to use advanced software and computational models. Economic
models are critical to policy analysis, but frequently economists do not
understand the mathematical theories used to create the models.
Additionally, economists are often unaware of improvements in computational
science that advance their industry. At the conference, more than 50
economics graduate students, postdoctoral researchers, and junior faculty
from the United States and Europe were shown how to use new computational
tools to find answers to economic policy questions. "We put great emphasis
on helping these young scholars apply cutting-edge software and techniques
in computational science to actual economics research problems," says Sven
Leyffer, Argonne computational mathematician and co-chair of the workshop.
The conference held tutorials on new analytical and numerical methods such
as dynamic programming, stochastic modeling, structural estimation, and
optimization problems with equilibrium constraints. Other sessions allowed
participants to view software presentations and gain some hands-on practice
applying new software to challenging economics problems. "ICE2007 provided an
exciting opportunity to raise the level of sophistication in economics by
creating an interface between economists and computer scientists so that
they can address the computational challenges posed by modern economic
models," Leyffer says.

A Penn State study of a search engine's transaction log found that
consumers click on sponsored links fewer than two times out of every 10
searches, indicating that consumers prefer organic, or non-sponsored,
links. Penn State assistant professor in the College of Information
Sciences and Technology and lead author Jim Jansen says the study is one of
the first-ever academic studies of sponsored-link click through using
actual search engine data. "While the click through was only about 16
percent, I interpret this as being a real boon for search engines," says
Jansen. "Even at 16 percent, sponsored search is already a
multi-billion-dollar market, and this study shows there is plenty of upside
growth potential." The study found that 35 percent of searchers do not click
on a link, either because they found what they were looking for on the
search-results page or because they believed there were no relevant links
on the page. When searchers did click, 84.2 percent of clicks were on
organic links and only 15.8 percent were on sponsored links. Prior to this
analysis, Jansen performed a study that suggests users are suspicious of
sponsored links. In that study, users were asked to select a link on a
page of results from a fictitious search engine. Jansen theorizes in his
current study that because of consumers' prejudice against sponsored links,
search engines may actually be doing users, and businesses that invest in
sponsored links, a disservice by separating sponsored and organic links.
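The reported split can be reproduced from raw click counts. A small sketch (the counts below are invented; only the resulting ratios mirror the study's figures):

```python
def clickthrough_breakdown(organic_clicks, sponsored_clicks):
    """Return the percentage of clicks landing on organic vs. sponsored links."""
    total = organic_clicks + sponsored_clicks
    return (100.0 * organic_clicks / total,
            100.0 * sponsored_clicks / total)

# Hypothetical counts chosen to match the study's reported 84.2/15.8 split
organic, sponsored = clickthrough_breakdown(8420, 1580)
print(round(organic, 1), round(sponsored, 1))  # 84.2 15.8
```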

Today's students are widely considered to be the most technologically
competent generation, which makes the fact that fewer and fewer students,
particularly girls, are interested in studying computers and technology all
the more baffling. Quocirca analyst Rob Bamforth argues that because today's
students have grown up with easy-to-use consumer technology products, they
actually are not tech savvy at all. "They're not aware
of technology, it's so regular and normal to use it they don't consider
it," Bamforth says. Computers used to be complicated devices that required
specific knowledge to use, but now training is no longer necessary and
young people no longer need to know how the technology works to use
computers. Bamforth also suggests that students lack role models and
inspiration. "I'm sure there are some business and industry figures who
could be made more accessible and become an inspiration for new
generations," Bamforth says. Jeff Brook, chair of the Recruitment and
Employment Confederation's IT and Comms sector group, agrees and adds that
the IT sector does not have the same energy it once did. "There's a
perception problem with parents, teachers, and school kids themselves about
the IT industry," says British Computer Society director Mike Rodd. "It's
seen as a poor career choice, contrary to employment rates." Jeannette
McMurdo, who organizes IT courses for women at Bradford College and works
for the UK Resource Center for women in the technology industry, says
women-only classes could help boost female participation. "Look at how many
girls do it at single-sex schools, which produce engineers in greater
numbers than mixed schools," says McMurdo.

Comparing the behavior of software programs is one way for companies to
determine whether their software has been incorporated into other programs.
Researchers at Saarland University in Germany have developed a tool, API
Birthmark, which allows users to run their own program and a foreign
program, analyze their behavior, and find similarities. A high degree of
similarity detected by API Birthmark would suggest that code theft likely
occurred, and that further investigation should be considered. The
approach is different from other detection methods that focus on the code
of the program, which can be easily obfuscated without destroying it,
making it difficult to prove in court that software theft occurred.
However, it would be difficult to change the behavior of a program without
breaking it, similar to a birthmark. David Schuler, Valentin Dallmeier,
and Christian Lindig have written a paper on the birthmarking technique,
which was accepted for the Automated Software Engineering (ASE 2007)
conference in Atlanta.
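The idea can be pictured as recording each program's runtime API call sequence and scoring the overlap of short call subsequences. The following is a simplified k-gram/Jaccard sketch, not the actual API Birthmark algorithm, and the traces are invented:

```python
def kgrams(trace, k=3):
    # Sliding window of k consecutive API calls observed at runtime.
    return {tuple(trace[i:i + k]) for i in range(len(trace) - k + 1)}

def birthmark_similarity(trace_a, trace_b, k=3):
    """Jaccard similarity of the k-gram sets of two API call traces.
    Values near 1.0 suggest the programs behave alike."""
    a, b = kgrams(trace_a, k), kgrams(trace_b, k)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical traces: the second program appears to reuse the first's logic.
original = ["open", "read", "parse", "transform", "write", "close"]
suspect = ["init", "open", "read", "parse", "transform", "write", "close"]
score = birthmark_similarity(original, suspect)
```

Because the score is computed from what the program does rather than how its code looks, obfuscating the source leaves it largely unchanged, which is the property that makes the approach court-friendly.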

When discussing implications of the Estonian cyberattack, Michael Witt,
deputy director of the U.S. Computer Emergency Readiness Team, shies away
from the term "cyberwarfare" and stresses the importance of preparation.
The Estonian attacks showed the world the importance of cybersecurity, says
Witt. Because the attacks involved financial targets, nations have
realized that cybersecurity is not just essential to protecting critical
infrastructures, but also homeland security and economies. Industry
experts also noted that the attack was somewhat alleviated because
Estonia's ISPs offered bandwidth greater than the size of the DoS attack.
Witt notes that the U.S. critical infrastructure has "a more robust type of
backbone" than Estonia's critical infrastructure. That fact, combined with
years of planning, means the U.S. would react differently to a similar
attack, says Witt. Witt acknowledges that the country is not completely
secured from such an attack, but adds that plans have been established to
handle attacks. Witt asserts that political attacks do not rank within the
top three threats for U.S. security networks. Rather, phishing and other
socially engineered attacks are a major risk. Network operators should be
aware of the activity assailing their networks and firewalls, know what is
essential on the network, and understand the consequences if it is removed.
Witt emphasizes training, noting
that technical personnel must have enforceable policies in place in order
to respond to attacks. Future U.S. CERT cybersecurity exercises include
Zenith in 2007, which will be done with the Defense Department, and
Cyberstorm II, which will take place in March 2008 with the Department of
Homeland Security. Cyberstorm II is an exercise at the national level,
and will involve critical infrastructure representatives from across the
country as well as from international governments.

Anita Borg Institute for Women and Technology President Telle Whitney says
the recent media focus on IT outsourcing has convinced many women, and
parents of college-aged students, that IT does not have a solid future,
which is partly to blame for women's lack of interest in IT careers
compared to men. Perceptions of what an IT career involves also are
dampening women's interest in the field. The Information Technology
Association of America reports that the number of women in IT declined 20
percent from 1996 to 2004, and the National Science Foundation says women
received just 28 percent of computer science bachelor's degrees in the
United States in 2003, compared to 38 percent in 1985. "If you ask both
genders to identify what an IT professional looks like, the answer is still
that it's a man with a pocket protector and glasses," says Whitney. "And
there is a belief that you spend all of your time in front of a computer
and don't work with people, but the reality is quite different." Experts
say a few changes can attract more female workers to the IT industry.
First, IT needs an image makeover. People need to know that IT careers
involve more than programming and engineering, and that IT careers can be
flexible and include working with customers and offer creative
contributions. The image makeover is particularly important for exposing
"tween" and teenage girls to opportunities in IT, Whitney says. CIOs can
support the makeover by encouraging staff to talk to the community about
their careers. Companies also need to emphasize the necessity for workers
with skills that women are generally stronger in, such as working in teams.
CIOs can also send their female employees to conferences like those hosted
by the Anita Borg Institute so they can meet mentors and learn more about
IT career paths.

A team from the University of Illinois Urbana-Champaign's Graduate School
of Library and Information Science (GSLIS) will lead a two-year project to
preserve virtual worlds such as those found in early video games,
electronic literature, and Second Life. The project, called "Preserving
Virtual Worlds," will also be worked on by partners at the Rochester
Institute of Technology, Stanford University, the University of Maryland,
and Linden Lab, the creator of Second Life. GSLIS faculty member and lead
investigator of the project Jerome McDonough says interactive media is at a
"high risk for loss as technologies rapidly become obsolete." He says the
goal is to develop "mechanisms and methods" to preserve digital games and
interactive fiction. "In particular, we will be looking at the metadata
and knowledge management problems involved in preservation of highly
interactive digital works," McDonough says. The Library of Congress is
funding the project with a two-year, $590,000 grant through the "Preserving
Creative America Initiative," the most recent initiative of the National
Digital Information Infrastructure and Preservation Program. The first
phase of the project, which is set to begin in January, will attempt to
identify the information needed to ensure any preservation strategy is
successful. In the second phase the team will try to develop XML standards
for encoding information so it can be included in digital repositories.
The final phase of the project will focus on testing the preservation
technologies the team developed in early phases.

Indiana University researchers have received a $1.96 million award from
the National Science Foundation to create a cyberinfrastructure to help
scientists better understand the current and future state of polar ice
sheets. The "Polar Grid" will span both poles using rugged laptops and
clusters deployed in the field in the polar regions, as well as a 17
teraflops cluster at IU and a 5 teraflops cluster at Elizabeth City State
University for detailed data analysis. The clusters will use Web 2.0 and
portal approaches to be highly accessible and easier to use. "The Polar
Grid project will transform U.S. capabilities in ice sheet research," says
Geoffrey C. Fox, director of Pervasive Technology Labs' Community Grids Lab
and IU professor of informatics. "With this technology, it will be
possible to collect, examine, and analyze data--and then use the results of
such analysis to optimize data collection strategies--all during the course
of a single expedition." In addition to advancing polar grid research, the
project advances Fox's existing efforts to provide greater access to
cyberinfrastructure to institutions that primarily serve minority students.
Elizabeth City State University is a historically black university in
North Carolina. The Polar Grid project will provide ECSU with a
high-performance computing cluster and access to IU's cluster through a
high-speed network connection. Linda Hayden, co-principal investigator
from ECSU, says the technology will support student learning by expanding
ECSU's existing polar science efforts and by providing better access to
high-performance computers.

Monolithic and monothreaded scalar processors can no longer deliver
steadily expanding computing performance as the age of many-core processing
moves forward, writes the High-End Crusader. Reasons for this include the
depletion of instruction-level parallelism. "With a thousand cores on a
die and a hundred threads per in-order multithreaded core, someone or
something had better master thread-level parallelism (TLP)," notes the
author. The High-End Crusader explains that parallel computing needs to be
reinvented with the participation of both the elite and mainstream parallel
computing communities, given the close connection between these approaches'
outcomes. "For a nanocore-die's memory-bandwidth walls, we need
engineering solutions to increase all of the following: 1) the nanocore-die
pin bandwidth, 2) the local (memory) and global (network) interconnect
bandwidths, and 3) the aggregate hardware DRAM bandwidth per gigabyte," the
author writes. "For a nanocore's memory-bandwidth walls, we need to
increase the hierarchical on-chip-network bandwidths." He cites the need
for sensible hierarchical caches that lower bandwidth requirements, are not
wasteful of bandwidth, and facilitate exploitation of on-chip "spatial"
dependence locality. "We need to reinvent heterogeneous processing
because, quite apart from useful-scalability imperatives, there are many
distinct types of heterogeneity, even many distinct types of processor
heterogeneity, and we will need to make intelligent choices about the type
(or types) of heterogeneity our applications need," the author
concludes.
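The scale of the thread-level parallelism problem is easy to quantify from the author's own numbers. A back-of-the-envelope sketch (the core and thread counts are the author's; the 1 TB/s pin-bandwidth figure is an invented assumption for illustration):

```python
def per_thread_bandwidth(pin_bandwidth_gb_s, cores, threads_per_core):
    """Share of off-die pin bandwidth available to each hardware thread
    if every thread streams data concurrently."""
    total_threads = cores * threads_per_core
    return pin_bandwidth_gb_s / total_threads

# Author's scenario: 1,000 cores at 100 threads each = 100,000 threads.
# Assume 1 TB/s (1,000 GB/s) of pin bandwidth for the whole die.
share = per_thread_bandwidth(1000.0, cores=1000, threads_per_core=100)
print(share)  # 0.01 GB/s, i.e., about 10 MB/s per thread
```

Even under this generous assumption each thread gets only a trickle of off-die bandwidth, which is why the author insists on raising pin, interconnect, and DRAM bandwidth together rather than any one in isolation.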