A Department of Education report shows that educational software is no
more effective at improving students' scores on standardized tests than
traditional classroom instruction, although much of the blame is being
placed on the schools' implementation rather than on the software itself. The
study, which looked at 15 math and reading programs used by 8,224 students
in 132 schools during the 2004-2005 school year, is the largest to date
comparing the standardized test scores of students whose schools used the
software with those whose schools did not. Educational software is
increasingly being used by school systems looking for ways to meet the
requirements of 2002's No Child Left Behind Act. However, DoE
representative Katherine McLane said, "We are concerned that the technology
that we have today isn't being utilized as effectively as it can be to
raise student achievement," leaving open the possibility that the
software's potential to help is being hindered by human error. "The fact
is that technology is only one part of it, and the implementation of the
technology is critical to success," said Software and Information Industry
Association director Mark Schneiderman. Los Angeles abandoned the use of
$50 million of educational software it purchased in 2001 after it was found
to have no effect on standardized test scores. Not surprisingly, the
district blamed the lack of impact on poor implementation caused by a lack
of preparedness by teachers. Some feel that software should not be blamed,
because doing so could mean the least affluent districts would do away with
it first, depriving those children of technology. Many schools involved in
the study are now critical of it, despite being enthusiastic when invited
to take part in 2004.

Researchers are experimenting with different ways of storing data on
incredibly small scales. A recent project at Keio University's Institute
for Advanced Biosciences showed the ability to encode data in bacteria and have it
passed on through generations in an organism's DNA. The idea behind the
Department of Energy project is to allow data to survive a nuclear
disaster. Reading the data proved more difficult than encoding it and
researchers believe it could be decades before an effective data-retrieval
technique is developed. Another technique, storage of a single bit on an
atom, was achieved in 2002 at the University of Wisconsin, when a small
amount of gold on the surface of silicon was made to spur the self-assembly
of tracks on the nanometer level. In 1959, Richard Feynman imagined such
atomic memory with five atoms separating each bit, and the 2002 results
showed that the minimum empty space around each bit is four atoms. Reading
data encoded to atoms is relatively easy, requiring only a simple line scan
with a scanning tunneling microscope, though the process is slow because
much less energy can be extracted from such small bits during readout.
Writing the data is even more difficult and time-consuming. Where the DNA storage
technique requires 32 atoms to store a single bit, this silicon atom memory
requires 20 atoms per bit. Finally, University of Arizona researchers are
working with microelectronic arms to read and write data in clusters of
molecules on nanotech organic film. The system uses cantilevers that write
data by injecting a current in the film that changes the electric
resistance at the point of contact. The team believes 1 million of these
cantilevers could be made to run in parallel, as they are simpler devices
than the transistors that run parallel by the millions in today's
processors.

Technology companies are starting to experience a shortage of skilled
labor in Central Europe, a region that has had a reputation for having a
deep pool of math and science graduates. Companies such as
Hewlett-Packard, SAP, and Dell have been attracted to the region with hopes
of finding IT talent that would be willing to work for a third of what IT
workers in Western Europe are paid. The competition for IT talent has
helped push up salaries by double digits annually, and some big companies
are now looking to relocate to the hinterlands to search for cheaper labor.
"There's too much competition in Sofia," Oleksandr Shcherbina, quality
chief for German communications gear maker Hilscher, says of the capital of
Bulgaria. The pain has been felt the most by smaller companies. "If you
build your economic model only around low-cost labor, you have a three- or
four-year window where you have an advantage," adds Sasha Bezuhanova, who
heads HP's operations in Bulgaria. At the same time, companies must contend
with many workers' desire to leave the region for opportunities abroad.

German security researchers have found a way to crack Wired Equivalent
Privacy (WEP) that is much faster than previously discovered methods. They
recommend that those relying on the protocol to protect sensitive
information find a stronger means of protection. Earlier efforts showed
that WEP could be cracked in a matter of minutes, although that method
could be foiled by systems that change their security key every five
minutes. The new research, carried out at Darmstadt University of
Technology, shows that a 104-bit WEP key can be obtained from intercepted
data in only three seconds using a 1.7 GHz Pentium M processor. The required
data can be accessed in less than a minute, and the attack itself requires
less computing power than previously thought, allowing it to be done in
real time as a person walks through an office, potentially using a mobile
device. Capturing 40,000 packets gives a 50 percent chance of a
successful attack, and 85,000 packets a 95 percent chance, according
to the researchers. WEP is still widely used for security in Germany, and
many networks there run without any encryption at all. However, this type
of attack can be detected by an intrusion detection system, or thwarted by
hiding the security key among numerous dummy keys.

Professor Nancy Lynch of MIT will receive the 2007 Knuth Prize from the
ACM Special Interest Group on Algorithms and Computation Theory (SIGACT)
for her work in the theory of distributed computing. Lynch, an ACM Fellow,
currently holds the NEC Professorship of Software Science and Engineering
at MIT, where she heads the Theory of Distributed Systems Research Group in
the school's Computer Science and Artificial Intelligence Laboratory
(CSAIL). Her career includes the creation of distributed algorithms,
precise models for analyzing distributed processes, and the discovery of
limitations on what distributed algorithms are capable of. In 1982, she
received the Principles of Distributed Computing Influential Paper Award
(now known as the Dijkstra Prize) for her part in developing the
Fischer-Lynch-Paterson (FLP) impossibility result, which addresses the
impossibility of distributed agreement in the presence of process
failures. Her work has influenced other areas of computing such as
database transaction processing, hybrid systems, security, and hardware
synchronization. Lynch is a member of the National Academy of Engineering
and was co-winner of the inaugural Van Wijngaarden Award in 2006. Lynch will
be presented with the Knuth Prize at the ACM Symposium on Theory of
Computing conference.

Activists' outrage that the U.S. government's passenger screening
technologies are highly susceptible to false positives is logically
incongruous with their insistence that government databases should only
collect individuals' names. Latanya Sweeney of the Data Privacy Laboratory
in Carnegie Mellon University's School of Computer Science cites
government watch lists' reliance on the Soundex algorithm as a major
deficiency. Soundex, designed to index and retrieve soundalike surnames
with variant spellings distributed throughout an alphabetical list, creates
a situation in which the passenger screening system would confuse the
terrorist Osama bin Laden with Sex Pistols member Johnny Lydon. In
contrast, ChoicePoint maintains a database on American citizens that
indexes them by at least four data points (name, address, birth date, and
social-security number). Among the products and services ChoicePoint
divisions offer are medical information, tenant screening, drug testing,
employment-background screening, credential verification, motor-vehicle
records, mortgage-asset research, and database software. ChoicePoint's
thorough records were also used to identify Sept. 11 victims via their DNA.
Privacy Times editor-publisher Evan Hendricks believes it would be a bad
idea for the government to outsource administration of the watch lists to a
data aggregator such as ChoicePoint, which has been condemned by privacy
proponents for its continuous efforts to build dossiers on individuals and
for selling records to election officials, police, and direct marketers.
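Soundex's coarse phonetic buckets explain the bin Laden/Lydon confusion: the algorithm keeps a name's first letter and encodes the remaining consonants as at most three digits, so "Laden" and "Lydon" collapse to the same code once the vowels are discarded. A minimal sketch of the classic American Soundex rules (an illustration only, not the government's actual screening code):

```python
def soundex(name: str) -> str:
    """Classic American Soundex: first letter plus three digits."""
    # Consonant groups share a digit; vowels, y, h, and w get no digit.
    codes = {}
    for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                           ("l", "4"), ("mn", "5"), ("r", "6")]:
        for ch in letters:
            codes[ch] = digit
    name = name.lower()
    result = name[0].upper()
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        digit = codes.get(ch, "")
        if digit and digit != prev:   # skip repeats of the same code
            result += digit
        if ch not in "hw":            # h and w do not separate duplicates
            prev = digit
    return (result + "000")[:4]       # pad or truncate to 4 characters

print(soundex("Laden"), soundex("Lydon"))  # both encode to L350
```

Both surnames encode to L350: the vowels vanish, "d" maps to 3, and "n" maps to 5, which is exactly the kind of collision a name-only watch list cannot distinguish.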

A new Duke study shows that the motivation behind offshoring of technology
jobs has much more to do with low costs than with a shortage of qualified
U.S. workers. Outsourcing R&D jobs is causing the United States to lose
its global competitive advantage, as China positions itself for future
success and India suffers from the politics involved in its educational
system, according to the study. Also addressed is the perception that the
U.S. graduates approximately one-twelfth as many engineers as either China or
India, which was found to be false; the United States actually graduates a
very similar number. China's National Reform Commission has reported that
the majority of 2006 graduates will not find work, while India is believed
to be experiencing a shortage of engineers. Those responding to the survey
said the advantages to hiring U.S. workers were communication skills,
business knowledge, and strong education and skills, whereas Chinese and
Indian workers were appealing for the lower costs of hiring them. However,
many tech jobs, especially in R&D, will continue to be moved to
China because the country graduates more engineers with master's degrees
and PhDs than the U.S. does. Rectifying this situation will require a
wide-ranging effort by the U.S., according to the study's co-author,
Vivek Wadhwa. "Even if the nation did everything that is needed, it will
probably take 10 to 15 years before major benefits become apparent," Wadhwa
says. "Given the pace at which globalization is happening, by that time
the United States would have lost its global competitive edge. The nation
cannot wait for education to set matters right." The report closes by
recommending that the United States make it easier for foreign workers to
immigrate and start companies here.

This year's quota for H-1B visas was reached on the first day the
applications were accepted, intensifying the debate over the need for the
quota to be raised. Forecasts that the visas would go very quickly caused
many technology companies to submit all of their applications on the first
day possible, rather than spreading them out over several weeks or months.
However, Programmers Guild founder John Miano says the rush was "an
organized campaign to exhaust the quota as quickly as possible" in hopes of
convincing Congress to increase the cap on the visas. "The fact that
industry is now capable of putting through a staggering number of H-1B
applications in just one day is the best illustration yet of why we need an
H-1B quota," he added. "Industry has proved it will not be self-policing
when it comes to H-1B numbers." Industry advocates claim the number of
jobs being created is increasing while the unemployment rate decreases.
The Department of Labor estimates that 100,000 new IT jobs will be created
by 2014. This year's H-1B shortage prevented many companies from securing
visas for soon-to-graduate foreign students, meaning after graduating they
could work for competitors overseas. However, opponents of a cap increase
point out that tech companies are mostly interested in the lower wages they
can pay foreigners, rather than any shortage of U.S. workers. A bill
recently introduced in the Senate would increase the "prevailing wage" paid
to foreigners and would enforce stricter rules for giving U.S. workers a
chance at these jobs.

A study and roundtable discussion by Microsoft Canada has found that IT
education is lacking in communication and design training, or "soft
skills," and that many 11th grade through second-year college students
believe their technical skills are not being sufficiently developed.
Courses that focus on soft skills would be more appealing to students,
especially females, says CISP consultant Margaret Evered, as evidenced by
the growth of social networking sites that focus on creativity and
teamwork. Most IT professionals do not measure success by their soft-skill
level, but the next generation of IT professionals will have to be more
involved in business processes. Training, especially in business
practices, has become very scarce in IT departments. "They fail to
recognize that this could help them," says University of Waterloo student
Neville Samuell. IT education is also harmed by the teaching of outmoded
languages that only cause students to need additional training later. The
student survey showed that although 92 percent see technological experience
as somewhat to very important for success in a career, only 42 percent
believe their school cultivates these skills in them. Samuell says most
students get such experience more from creating blogs and updating their
MySpace page than from the classroom. "There's no way schools can keep up
technologically," says University of Toronto computer science professor
Eugene Fiume. Unless this trend is reversed, a vast divide will emerge,
leaving those without computer skills to be "the illiterates of tomorrow,"
Evered says.

A new report created for ICANN's board suggests that the group "explore
the private international organization model" and "operationalize whatever
outcomes result," meaning that ICANN could become an independent
international organization with immunity from national laws. Many such
groups, including the United Nations and WIPO, are located in Geneva,
Switzerland, and there is speculation that ICANN could seek to move there.
Fueling the speculation is the following sentence from the report:
"ICANN's headquarters may remain in the U.S." This statement is obviously
more equivocal than saying that ICANN will remain in the United States. A
1945 U.S. law gives international organizations "immunity from suit" and
prevents their property and assets from being searched. This law also
gives employees immunity from income taxes and from customs duties.
Analysis produced by ICANN in August 2006 shows that the group finds the
Swiss-based model for international groups appealing. Whatever ICANN's
intentions, the Bush administration is unlikely to allow the Internet
oversight group to stray too far from U.S. control, opening the door for
more debates about Internet governance.

The National Institute of Standards and Technology (NIST) recently
announced that facial recognition technology has improved by a factor of 10
in the past four years. The institute recently held tests called the Face
Recognition Vendor Test (FRVT) 2006 and the Iris Challenge Evaluation (ICE)
2006, which compared the ability of vendor systems to recognize
high-resolution still images, 3D facial images, and single iris images, in
both controlled and uncontrolled lighting. Recognition performance was
found to be comparable for still facial images, 3D facial images, and
single iris images.
"In an experiment comparing human and algorithm [system] performance, the
best-performing face recognition algorithms were more accurate than
humans," according to the institute. Error rates for facial recognition
were found to be around 0.73 in a partially automated 1993 evaluation and
around 0.01 in the fully automated FRVT 2006 evaluation. The time required
for algorithms to process iris images
ranged from six hours to 300 hours. The study tested performance under a
variety of lighting conditions and took into account the resolution of
different images used.

IT industry observers are still trying to figure out why the number of
women in the industry has declined substantially in recent years.
According to the U.S. Department of Labor's Bureau of Labor Statistics, the
number of women in eight IT categories has declined from 984,000 in 2000 to
908,000 last year, or by 7.7 percent. Six years ago, women represented
28.9 percent of all IT workers, compared with 26.2 percent of the industry
in 2006. Lynne Ellyn and Christine Davis have authored a report for IT
advisory firm Cutter Consortium, and they suggest that women have issues
with an intolerant working environment, a field that does not appreciate a
balanced lifestyle, declining opportunities in the industry, and the social
stigma or perception of IT. "I think this trend is an indication of the
often abrasive experience women have in the IT arena," says Ellyn. "As I
reflect on this disturbing trend, I recall countless incidences where women
have been discounted and marginalized while struggling to balance family
and work."

Deb Roy, director of the MIT Media Lab's Cognitive Machines Group, hopes
recording the first three years of his son's life will help his efforts to
teach a robot to understand language and speak by mapping the entire path
of early language acquisition without interruption, under the auspices of
his Speechome Project. Roy's work potentially carries benefits not just
for the field of robotics, but for child psychology as well. Cameras and
microphones installed in Roy's house capture nearly every aspect of his
infant son's daily life, and this data is stored on a disk array. Data
from the disks is backed up to an automated tape library, and every 40 days
Roy uploads the accumulated recordings onto a dedicated 250 TB array in the
Media Lab. Transcripts are generated so that key moments and trends, such
as vocabulary acquisition, can be pinpointed by data mining, while data
visualization is also used to track important patterns. The hope is that
computers will become capable of testing theories about language
acquisition by matching researchers' projections to recorded patterns. Roy
plans to use stimuli generated by Speechome to educate a sensor-equipped
robot named Trisk in order to more deeply explore the balance between
hardwired programming (nature) and learned behavior (nurture).

Imperial College London professor Jeff Kramer believes abstraction skills
are key to computer scientists and software engineers' ability to create
designs and programs that are clear and elegant. He cites Keith Devlin,
who states, "Once you realize that computing is all about constructing,
manipulating, and reasoning about abstractions, it becomes clear that an
important prerequisite for writing (good) computer programs is the ability
to handle abstractions in a precise manner." Following Jean Piaget's four
developmental stages, Kramer says the fourth stage, in which individuals
acquire the ability to think abstractly and scientifically, is only
achieved by 30 percent to 35 percent of adolescents, while some adults
never attain the ability, possibly because of the absence of training and
specific environmental conditions. Ensuring that individuals become
capable of abstract thinking requires effective education and the
evaluation of computing skills, Kramer writes.
Kramer reasons that measuring college students' abstraction skills on an
annual basis would "help to gain confidence that abstraction is a key
indicator of ability ... provide an alternative means for checking
students' abilities ... [and] also help to assess the efficacy of our
teaching techniques." Testing students' abstraction skills at the time of
application to study computing is problematic, because no appropriate tests
exist, according to Kramer. He notes a suggestion by Orit Hazzan of the
Technion's Department of Education in Technology and Science, proposing the
development of specific test questions as well as diverse tasks and
descriptions that support the accumulation of qualitative as well as
quantitative data for the purpose of studying different kinds of
abstractions, different abstraction levels, and different goals for those
abstractions.

Zhejiang University computer science professor De-Ren Chen and PhD student
Wen-Ying Guo detail an e-learning implementation strategy using Semantic
Web technology to imbue e-learning with more flexibility. The authors
write that using ontologies to describe learning materials could overcome
the lack of shared understanding among terms within one vocabulary and
across different metadata vocabularies. Their approach
involves the provision of metadata for describing the content, context, and
structure of learning materials, and Chen and Guo note that one of the most
commonly used metadata schemes currently on the Web is the Dublin Core
Metadata Initiative's Dublin Core Schema. Dublin Core is designed for
metadata of all categories of digital resources, and therefore does not
satisfy the specific requirements the authors cite for describing learning
resources. Thus, an extension of Dublin Core called the Learning Object
Metadata Standard (LOM) was created by the IEEE's Learning Technology
Standards Committee, and LOM allows each learning object to be represented
via a series of 70-plus attributes split up into nine categories. Chen and
Guo define two ontology classes: an application ontology that describes
the individual who wants to choose a course to study, and a domain ontology
that specifies training providers, including courses, location, and time.
Two basic operations--semantic querying and semantic mapping--are employed
to achieve a semantic solution. The approach is designed to address several
problems that can crop up in an e-learning environment: the expression of
semantically identical concepts by dissimilar terms in the domain
vocabulary, and the use by two applications of the same term with different
definitions.
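The semantic-querying and semantic-mapping operations described above can be illustrated with a small sketch. The field names, the mapping table, and the query function below are hypothetical, invented for illustration rather than taken from Chen and Guo's implementation, but they show how mapping different vocabulary terms onto a shared ontology concept lets one query match records whose metadata vocabularies differ:

```python
# Semantic mapping: terms from different vocabularies point to one
# shared ontology concept (here expressed as a Dublin Core element).
ONTOLOGY_MAP = {
    "author": "dc:creator",    # one vocabulary's term
    "creator": "dc:creator",   # Dublin Core's own term
    "lecturer": "dc:creator",  # hypothetical course-catalog term
    "subject": "dc:subject",
    "topic": "dc:subject",
}

def semantic_query(records, term, value):
    """Match records on the shared concept, not the literal field name."""
    concept = ONTOLOGY_MAP[term.lower()]
    hits = []
    for rec in records:
        for field, v in rec.items():
            if ONTOLOGY_MAP.get(field.lower()) == concept and v == value:
                hits.append(rec)
                break
    return hits

courses = [
    {"title": "Semantic Web Basics", "lecturer": "W. Guo"},
    {"title": "Databases", "author": "D. Chen"},
]

# A query phrased with "creator" still finds the first record, even
# though that record labels the same concept "lecturer".
print(semantic_query(courses, "creator", "W. Guo"))
```

The design choice here mirrors the problem the authors describe: without the shared ontology term, a query for "creator" would miss records that say "lecturer", even though the two terms are semantically identical.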