ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM.
To send comments, please write to technews@hq.acm.org.

A UC Berkeley report released Thursday raises questions about the Florida vote in the presidential election, hypothesizing that e-voting could have awarded President Bush as many as 260,000 more votes than he should have received. The researchers analyzed Florida election results in the last three presidential races to measure support for Republican and Democratic candidates and parties in Florida's 60-plus counties over eight years, taking into account such variables as voter population, average voter income, race, age, and voter turnout between 2000 and 2004. The study finds that the number of votes awarded to Bush in the 15 counties using touch-screen voting systems surpassed the number of votes he should have received, while the results in counties using other kinds of voting equipment aligned with projected outcomes, given the variables considered. The discrepancy was most pronounced in heavily Democratic counties, and UC Berkeley sociology professor Michael Hout estimates that the odds of such an anomaly arising by chance were less than one in 1,000. Jill Friedman-Wilson of Election Systems & Software, supplier of the touch-screen machines used in two of the three counties where the discrepancy was most prevalent, insists that rigorous pre-election testing and her company's track record ensure that the equipment is secure, reliable, and accurate. Susan Van Houten of the Palm Beach Coalition for Election Reform cites reports of electronic votes for Kerry being registered as votes for Bush, as well as machines failing to print a final tally tape at the end of the night. The Berkeley researchers say their goal in issuing the report is to spur the public and election officials to study the e-voting machines and address paperless systems' inability to support a viable recount.
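The study's general method, fitting expected results from county-level variables and flagging counties whose actual totals deviate, can be sketched with a toy one-variable regression. This is only an illustration with hypothetical function names and data; the actual study used multiple variables and formal statistical tests.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (a single predictor,
    standing in for the study's multi-variable regression)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def flag_outliers(xs, ys, threshold):
    """Return indices of counties whose actual result deviates from
    the fitted expectation by more than `threshold` votes."""
    a, b = fit_line(xs, ys)
    return [i for i, (x, y) in enumerate(zip(xs, ys))
            if abs(y - (a + b * x)) > threshold]
```

A county whose residual greatly exceeds what the model predicts, as the report claims for the touch-screen counties, would show up in the `flag_outliers` list.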
Click Here to View Full Article
For information regarding ACM's e-voting activities, visit http://www.acm.org/usacm.

Congress authorized a three-year commitment of $165 million to U.S. supercomputing efforts with its approval of the Department of Energy (DOE) High-End Computing Revitalization Act of 2004 on Nov. 17. The bill sets up a supercomputing research and development initiative inside the Energy Department, and empowers the DOE to establish supercomputer facilities that American researchers can use on a competitive, peer-reviewed basis. House Science Committee Chairman Rep. Sherwood Boehlert (R-N.Y.) declared that the bill will bolster American competitiveness and economic health, since supercomputing is playing an increasingly key role in the global leadership of U.S. industry and academia. A House Science Committee representative estimates that the bill will apportion $50 million for supercomputing R&D in 2005, and $55 million and $60 million in the following two years. IBM VP Irving Wladawsky-Berger and Silicon Graphics (SGI) CEO Bob Bishop issued a joint statement describing high performance computing as critical to solving societal problems through the provision of affordable, fast, and usable systems developed via academic, industrial, and federal partnerships. The High-End Computing Revitalization Act's approval "is clear recognition that to out-compete in the 21st Century, the U.S. will have to out-compute," Bishop added. An earlier version of the bill was approved by the House in July, but the Senate held out on passing the measure until an amendment was added to establish at least one R&D facility dedicated to software development for supercomputing applications.
Click Here to View Full Article

Former White House cybersecurity czar Richard Clarke calls cybersecurity "vitally important for our economies," and has no doubts that governments are considering using cyberwarfare against other nations. He takes aim at government cybersecurity initiatives across the globe, arguing that most governments' strategies to protect themselves, the cyberspace that citizens use, and private companies are inadequate. On the other hand, Clarke says IT security companies have progressed by marketing their products as empowerment tools for customers rather than exploiting the pervasive atmosphere of fear, uncertainty, and doubt. He also reports that many companies, such as those in the financial services sector, have improved their security significantly--not just by raising their IT security budgets, but by deploying security measures properly. Clarke attributes the lack of coordination between user groups and security organizations, despite much talk in information-sharing forums, to the difficulty IT security advocates have in convincing corporate executives that better security carries solid returns on investment. "Information-sharing forums are great for technical solutions, but haven't been all that great in helping the [chief information security officer] to tell their story to their superiors," he remarks. Clarke does not regret his much-publicized criticism of the Bush administration's anti-terror policies, though he acknowledges that whistle-blowers should take care not to reveal certain kinds of information, lest malevolent parties exploit it. He explains that government monitoring of Internet activity is limited by technical and legal factors: In the United States, for instance, the amount of online traffic is overwhelming, while individual monitoring cannot be sanctioned without a court order.
Click Here to View Full Article

Google is rolling out a new search service that will index academic research such as books, technical reports, and peer-reviewed papers. The Google Scholar service offers unique access to a number of publications and publishers that have cooperated with the effort, says project leader Anurag Acharya. Participating groups include the Association for Computing Machinery, the Institute of Electrical and Electronics Engineers, and the Online Computer Library Center. Acharya says Google's primary motivation was to give back to the academic community from which it originated and from which it has benefited greatly. Oftentimes, the latest scientific literature is not available in places such as India, but by making normal scientific citations available in one place and publishing previously unavailable works online, Google Scholar will help researchers worldwide make faster and better-informed discoveries. "We don't know where the next breakthrough will come from," Acharya says. "We want everyone to be able to stand on the shoulders of giants." SearchEngineWatch.com editor Danny Sullivan says other Web search providers such as Yahoo! are likely to explore more vertical search opportunities that index particular material soon. Stanford University HighWire Press director John Sack says Google Scholar is another example of how academic research is becoming more dependent on online tools as a first go-to source of information.
Click Here to View Full Article
(Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

The Polish government has said it no longer supports a controversial patent law for "computer-implemented inventions" because of flaws in the text, throwing the future of that legislation into doubt. The European Union has been seeking to unify its patent law concerning computer, and especially software, inventions, and the European Commission submitted the "Patentability of Computer-Implemented Inventions" directive to the directly elected European Parliament in February 2002. The Parliament modified the legislation to disallow software patents, but the Council of Ministers, which represents national governments, introduced its own version that includes software patents. With the reversal by the Polish contingent to the Council of Ministers, the software-patent version of the law could go back to the European Commission. EICTA director general Mark MacGann, whose group is made up of major European and U.S. technology firms such as Microsoft and SAP, admits that the current form of the legislation is not perfect, but says sending it back to the European Commission could result in patent confusion for years longer; he says the EICTA does not want software patents on their own merit, but rather a broad technical patent framework that would protect specific technical innovations. It is not the intention of the EICTA to institute U.S.-style patent law, which allows protection of business methods. MacGann sounded optimistic that Belgian officials could reverse their previous rejection of the patent proposal before the Council meeting next month, thereby allowing the legislation to go forward for review in the Parliament. The NoSoftwarePatents campaign, however, said the Polish decision meant the Council would have to renegotiate its version of the law.
Click Here to View Full Article

MIT professor Anant Agarwal's design of reprogrammable computer chips that can drive multifunctional devices is just one key element of MIT's Project Oxygen, a $50 million initiative to make computing simple and virtually ubiquitous. The project's goals include pervasively embedding computers that users control and interact with through voice command, and combining numerous sophisticated technologies into a unified, mostly invisible system while still supporting access to computing resources. Achieving the second goal requires making speech-recognition software compatible with visual-recognition software and movement-tracking sensor networks. Among the experiments MIT has undertaken to bring about this vision is powering a bank of 1,020 microphones with Agarwal's raw architecture workstation (RAW) chip, employing special software to align sounds with video footage so that the microphones can recognize and track a single speaker in a noisy crowd. Agarwal anticipates that within two to six years a chip's computing capacity will be approximately 1,000 times what it was in 1990. Oxygen is partly sponsored by the Defense Advanced Research Projects Agency, with additional funding from corporate partners such as Hewlett-Packard, Nokia, and Philips Electronics. Core components of the initiative besides the RAW chip include the development of language processing software that can help computers understand meaning; the Cricket indoor sensor network for tracking people and objects through the concurrent transmission of radio and ultrasound signals; and two potential operating systems for integrating the various Oxygen technologies. "Part of what we expect Oxygen to do is push the boundary and show what's possible," notes Fred Kitson with HP's Mobile & Media Systems Lab, who says that the project's early work inspired his company's virtual studio collaboration system.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)

A wireless local area network's (WLAN) actual throughput often turns out to be significantly lower than advertised as a result of increasing user congestion, but WLAN vendors and users are taking steps to alleviate such problems. Wireless transmissions are vulnerable to interference from conflicting WLANs as well as external factors such as physical objects, while vendors' attempts to improve the penetration of WLAN signals through technological augmentation frequently give rise to additional interference. The potpourri of available WLAN equipment also contributes to the interference problem, because some products boast stronger transmission power than others. Congested networks are forced to reduce transmission speeds in order to sustain the connection between two end points. Meanwhile, multiple stations may attempt to broadcast data simultaneously and thus will use a back-off algorithm to regain access in the event of a collision, which means that increasing numbers of collisions add up to longer back-off times. Other variables resulting in lower than advertised throughput include WLANs reducing transmission rates to maintain reliable connections with increasingly distant users; dealing with disruptions more often than wired LANs; and the networks' need to establish users' identities and monitor their movements. Vendors are working to expand the broadcast range of WLAN links, minimize cross-channel interference, and create microprocessors that transmit data concurrently in order to address the congestion problem. Users' efforts include the formation of spectrum alliance groups to manage neighborhood interference by assigning diverse frequency channels to different companies, and keeping disruptive devices such as cell phones or microwave ovens away from WLAN base stations.
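The back-off behavior described above is typically binary exponential backoff, which can be sketched as follows. This is a generic illustration of the 802.11-style rule, not any particular vendor's implementation; the function name and window cap are assumptions.

```python
import random

def backoff_slots(collisions, max_exponent=10):
    """Binary exponential backoff: after the n-th consecutive
    collision, a station waits a random number of slot times drawn
    uniformly from [0, 2**n - 1] before retrying. The contention
    window is capped so it cannot grow without bound."""
    exponent = min(collisions, max_exponent)
    return random.randint(0, 2 ** exponent - 1)
```

Each consecutive collision doubles the average wait, so as a WLAN grows congested, stations spend an increasing share of airtime waiting rather than transmitting, which is one reason measured throughput falls well below the advertised rate.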

Consumers' and employees' increasing use of mobile phones, personal digital assistants, and other portable devices to access online information, carry out Web-based transactions, and communicate via email has spurred the World Wide Web Consortium's (W3C) Mobile Web Initiative to make the process simpler and more convenient. Problems the initiative seeks to address include poor rendering of Web content on mobile gadgets with small displays, and Web sites that are inaccessible or difficult to navigate. The consortium's mobile Web access effort stresses approaches such as the Voice Browser, whose VoiceXML 2.0 spawned standards such as the Speech Interface Framework, a work in progress. The W3C's Janet Daly reports that this week's informal workshop in Spain will be an opportunity for attendees to assess their current progress in easing mobile Web access. "What's left to do to make this a thriving environment for mobile users that's going to give them the convenience, the pleasure and the productivity that they want out of the Web?" she asks. The W3C will collocate the workshop with the Open Mobile Alliance (OMA); the two organizations agreed last year to jointly develop mobile data services standards. In its earlier incarnation as the WAP Forum, the OMA formulated an abortive plan to mobilize Web access by rewriting the protocol stack without employing the Internet.
Click Here to View Full Article

National Institute of Standards and Technology (NIST) engineer Keith Stouffer reports that Supervisory Control and Data Acquisition (SCADA) systems critical to U.S. infrastructure are increasingly susceptible to cyberthreats due to their growing linkage with IP networks. Stouffer also chairs NIST's Process Control Security Requirements Forum, which released the first draft of its System Protection Profile for Industrial Control Systems (SPP ICS) last month. He explains that SPP ICS is an industry specification that addresses such process control security issues as anti-spoofing measures, identification and authorization, logging and auditing, default security, physical security, voluntary encryption, and policies and procedures for secure management practices. "By having a set of products available with configurable security features, end users can select the appropriate off-the-shelf device and configure its security features to match their risk/impact situation," notes Du Pont's Thomas Good, a user representative on the NIST forum. "Companies will consider SPP ICS compliant control systems on modernization projects or new production lines when the risk is sufficiently high." Good projects that the SPP standards will begin to affect industrial process security and management within a few years, but he does not foresee many companies replacing existing SCADA systems because of the associated costs; making plant operations SPP ICS compliant may require a certain degree of retraining, Good adds. Stouffer reports that the issue of certification has only recently become a topic of discussion within the forum, and its usefulness has yet to be determined. In fleshing out the standards, the forum has invited input from the transportation, pharmaceutical, electric power, food, and water utilities sectors, which are typically heavy users of SCADA systems.
Click Here to View Full Article

National defense officials in the United States and Britain are working on neural network software that will use the example of the Cambrian explosion evolutionary period to evaluate potential terrorist threats and countermeasures, writes scientist Andrew Parker. Officials at the Pentagon initiated the parallel programs after reading Parker's book, "In the Blink of an Eye," which details how simple worms and jelly-like organisms responded to the development of sight some 540 million years ago; within just 5 million years, Earth's animal life had vastly diversified and changed. The U.K. Ministry of Defense and the U.S. Defense Advanced Research Projects Agency are working to create neural network software that uses the Cambrian fossil record as a model for how to quickly evolve attack and defense mechanisms; once the model is completed, the system will be fed a broad range of data, including information on how people travel, communications such as the Internet and postal service, use of resources, and the availability of weapons, chemicals, and other dangerous materials. Eventually, the hope is that this system will be able to quickly analyze a variety of information and identify both obvious and inconspicuous threats. Parker suggests this system could play the role of a senior national security chief that collects and analyzes threat information coming from different sources within the government. The lack of such analysis capability before the Sept. 11 attacks prevented the government from acting decisively to stop them. The so-called Cambrian project is ambitious and is unlikely to succeed in its first few prototypes because designers are still identifying relevant data to be fed into the system; but if it works, it would be able to quickly evolve national security measures in the face of an innovative terrorist enemy.
Click Here to View Full Article

Purdue University chemical engineering professor James Caruthers notes that data-mining is usually the research method of choice for computer-aided scientific discovery, but his school's data cave operates on the principle of knowledge discovery. The extraction of data from knowledge, or "discovery informatics," is defined by Purdue researcher Venkat Venkatasubramanian as a process that "enables researchers to test new theories on the fly and see how well their concepts might work in real time via a three-dimensional display." Purdue's data cave is such a display--a physical space where researchers comprehensively visualize complex, software-driven simulations on tiled walls while wearing special eyewear that provides a stereoscopic view of information. There are two stages to discovery informatics, according to Purdue's Emil Venere: A "forward model" that determines the likely behavior of a specific material by combining knowledge with neural network software, and an "inverse process" that extrapolates a structure or formulation that is likely to exhibit properties entered by researchers. Venkatasubramanian calls the inverse process "an advanced method for product design." The data cave was used to visualize data about the predicted behavior of chemical reactions computed by software run on supercomputers. The prototype software required to manage the massive amount of data and simulations was co-created by Purdue chemical engineers and researchers in the school's e-Enterprise Center, according to center director Joseph Pekny.
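The forward/inverse pairing described above can be sketched with a toy example: a forward model maps a formulation parameter to a predicted property, and the inverse process searches for a formulation whose prediction matches a target. All names are illustrative, a simple analytic function stands in for the neural-network forward model, and bisection stands in for whatever search Purdue's software actually uses.

```python
def forward_model(x):
    """Toy stand-in for the forward model: maps a formulation
    parameter x to a predicted material property."""
    return 3.0 * x + 1.0

def inverse_process(target, lo=0.0, hi=100.0, tol=1e-6):
    """Invert the forward model by bisection: find the formulation
    whose predicted property matches `target`. Assumes the forward
    model is monotonically increasing on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if forward_model(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Given a desired property value, the inverse process returns a candidate formulation, which is the sense in which Venkatasubramanian calls it "an advanced method for product design."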
Click Here to View Full Article

Since universities have a wealth of personal information and a large amount of bandwidth, hackers bent on identity theft, control of personal computers for their own uses, or simply bragging rights are targeting them. The University of Colorado, University of Texas, University of California at Berkeley, Boston University, Georgia Tech, Southern Illinois University, San Diego State University, and many other schools have reported recent hacker attacks that compromised personal information. John Denune, San Diego State's technology security officer, says, "It is hard because security and convenience are kind of mutually exclusive. So with a university environment, we always have to keep our education mission in perspective because we can't lock things down like a business would." To make identity data more secure, Colorado lawmakers are forcing schools to discontinue use of Social Security numbers for student identification by 2008, and many schools are already working to comply. Universities are locking down personal information, but complete lockdown is hard with so many computing devices and with dependence on students themselves to run antivirus software and computer scans, says University of Colorado IT head Dennis Maloney. He says, "Security has become the No. 1 agenda item every day for all the IT professionals." Meanwhile, legal experts note that laws against hacking are hard to enforce because the anonymous nature of the Internet makes it difficult to track down offenders. Maloney says that despite stepped-up efforts to boost security, a new security vulnerability appears daily, and he doesn't see "the light at the end of the tunnel for that stopping."
Click Here to View Full Article

Researchers at Melbourne University are using a network of computers to study and map the brain, and intend to collaborate with other researchers around the world over the Web. The NeuroGrid application of the university's Howard Florey Institute is a cluster, but the software layers are part of a grid, explains research leader Dr. Gary Egan. "A grid computer is not here yet but we are aiming to achieve this in 12 months; our focus is on the user application end, but we want to close the loop so when things get processed the results are incorporated into the database," says Egan. Started 18 months ago as a database of images, the grid makes use of 12 G5 PowerMac and 12 Pentium 4 Linux workstations with 3 TB of Xserve RAID for storage; the group chose to work with Apple systems after the introduction of the Unix-based OS X. The group uses the desktops for processing--forming a grid--when they sit idle. The researchers use Sun Grid Engine, Globus middleware, and a Java workflow engine to control the grid software, and the open-source PostgreSQL database allows for greater sharing of data. The group has applied for an Australian Research Council grant for up to $200,000 to further build on its neuroinformatics system.
Click Here to View Full Article

Turbolinux in Japan has joined forces with Progeny in the United States, Mandrakesoft in France, and Conectiva in Brazil to form the Linux Core Consortium in an effort to create a common approach to implementing the Linux Standard Base (LSB) 2.0 guideline. The companies will use the LSB 2.0 software blueprint as they create programs to run on their versions of the operating system. The Free Standards Group (FSG) introduced the software blueprint in January to standardize certain aspects of Linux development. Another Linux standards coalition, Open Source Development Labs, will also back the effort, along with Red Hat, Novell's SuSE, Computer Associates International, Hewlett-Packard, and Sun Microsystems. Expected to be available by next April, the common core will be included in the Turbolinux Enterprise Server, Progeny Componentized Linux, Mandrakesoft Corporate Server, and the Conectiva Enterprise Server. "The Linux Core Consortium takes [existing] support one step further by creating a binary implementation of the LSB that will help in our efforts to secure widespread certification," said FSG executive director Jim Zemlin.
Click Here to View Full Article

In the latest issue of the International Journal of Digital Evidence, University of Florida doctoral student Mark Foster details a new "process forensics" technique that can yield twice as much forensic evidence to unmask the perpetrators of computer crimes. Foster, who co-authored the paper with UF computer science professor Joseph Wilson, says the method combines intrusion-detection and checkpointing technology to provide digital investigators with the most possible data to solve a case. The UF student says the technique puts a new spin on fighting cybercrime by focusing on intruders who want to hack a running program. International Journal of Digital Evidence editor John Leeson thinks Foster's method will help digital detectives contend with hacks as they occur. Process forensics uses an intrusion-detection system that automatically creates checkpoints, or intermittent snapshots, of a running computer program. The method is directed at hackers who target host-based systems using exploits such as buffer overflow attacks, in which the intruder infiltrates the system through a flaw in a running program. Another advantage of Foster's technique is that users do not need to be trained to use the intrusion-detection system. "I like the fact that [Foster is] taking a proactive approach--forensics for years has been a reactive field," Leeson notes.
Click Here to View Full Article

Attendees at the Computer Security Institute's annual Washington, D.C., conference cited a growing need for more strategic data security thinking, even as the IT security agenda continues to be dominated by operational and tactical concerns. Security managers pointed to communication lapses between business units and security organizations, and also to the growing complexity of IT system threats. "We're still fighting a lot of yesterday's battles," stated Yeshiva University information security administrator Fred Trickey, who noted that managers are devoting too much time to contending with flawed code and the latest malware instead of stressing the use of IT security as a vehicle for enabling business initiatives. Trickey said most IT security personnel's inability to take a more proactive stance is attributable to sparse resources or scant management support. First Data's Tony Spinelli said security managers often fail to understand business needs, and frequently lack a long-term IT security strategy as a result. Consultant Roger Fradenburgh argued that security staffs must become familiar with the lingo of business users in order to prove their importance and demonstrate solid returns. Bose information security director Terri Curran warned that shifting business needs and the increasing sophistication of threats can force security managers into a corner where they have little time for anything but tactical and operational chores. She and other attendees reported that growing regulatory requirements and the increased likelihood of intrusions that comes with the adoption of new technologies are bogging them down in operational work.
Click Here to View Full Article

Stanford University computer science professor David Cheriton warns that the Internet will soon play a crucial role in the operation of virtually every critical communication network, and give enemies the opportunity to inflict potentially catastrophic economic damage through cyberattacks. In addition to inadequate security, the Internet is suffering from a shortage of IP address space, a problem that the Internet Engineering Task Force (IETF) has been trying to rectify with the development and deployment of IP version 6 (IPv6); but IPv6 adoption has been stymied by users' preference for Network Address Translation (NAT), an inexpensive measure that allows a single IP address to mask several computers. Cheriton, however, thinks NAT offers better Internet protection than IPv6's encryption scheme, and is attempting to prove its legitimacy through his experimental Translating Relaying Internetwork Architecture integrating Active Directories (TRIAD) initiative. TRIAD aims to solve the Internet address shortage by assigning hierarchical addresses to data packets, an approach that fully links the NAT-concealed computers and allows users to connect as many machines as they want to the Internet. The TRIAD scheme makes finding IP addresses a function of the NAT boxes, which will transparently share not just information about who is looking after a specific name, but also information about rogue data packets--a vital mechanism for ensuring Net security. TRIAD may never be implemented in the real world, but Cheriton is convinced that his work will influence enough people to ensure that the NAT concept will ultimately trounce IPv6. Though Cheriton's concept is undeniably radical, the accuracy of his past predictions about networking's evolution has given him an enormous amount of credibility.
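The NAT mechanism that Cheriton builds on, one public IP address masking several private machines, can be sketched with a minimal port-translation table. The class and field names are illustrative assumptions; real NAT devices also track protocols, timeouts, and per-connection state.

```python
class NatBox:
    """Minimal sketch of Network Address Translation: many private
    hosts share one public IP, distinguished by translated ports."""

    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.table = {}      # (private_ip, private_port) -> public_port
        self.reverse = {}    # public_port -> (private_ip, private_port)
        self.next_port = 40000

    def outbound(self, private_ip, private_port):
        """Rewrite an outgoing connection's source to the shared
        public address, allocating a translated port if needed."""
        key = (private_ip, private_port)
        if key not in self.table:
            self.table[key] = self.next_port
            self.reverse[self.next_port] = key
            self.next_port += 1
        return (self.public_ip, self.table[key])

    def inbound(self, public_port):
        """Map a reply back to the hidden host. Unsolicited packets
        to unmapped ports get None (dropped), which is why NAT
        incidentally shields the machines behind it."""
        return self.reverse.get(public_port)
```

The drop-unmapped-traffic behavior in `inbound` is the protective property Cheriton values; TRIAD's contribution is making the NAT boxes themselves resolve names and share information so the concealed machines remain fully reachable by legitimate peers.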

In his 2004 book, "The Anarchist in the Library," New York University culture and communication professor Siva Vaidhyanathan writes about a technological arms race between "anarchists" that want culture to be free and "oligarchs" who seek to turn a profit from culture by controlling it. In an earlier book, "Copyrights and Copywrongs," the author demonstrates that copyright law is limited in its ability to give creators incentives to produce and profit from their work, and argues that culture's transition to a digital medium has fueled the expansion of copyright and its use as a tool for monopolization and censorship. Vaidhyanathan warns that copyright law is strangling the output of artists, librarians, and academics who see digitally distributed culture as an instrument for innovative work. He posits that people often fear the free flow of information and scramble to plug the leak technologically and legally, using the Digital Millennium Copyright Act as an example. "Those are the sorts of moves that we make, either in policy or technology, that are destined to fail, because we think that we can invent a machine to fix the problem that the last machine caused," Vaidhyanathan explains. He sees the battle between music swappers and entertainment companies as representative of the anarchist-oligarch arms race: Swappers have responded to the companies' attempts to thwart file-sharing through data encryption by finding ways to subvert the protection. The NYU professor's critics include Indiana University law professor Marshall Leaffer, who contends that Vaidhyanathan subscribes to a "libertarian" and "romantic" perception of intellectual property that sometimes misses the digital age's real problems. Vaidhyanathan says his next book will chronicle the "technocultural history of voting," examining the intermingling of "democracy, technology, information control, and software."
Click Here to View Full Article

New anti-spam measures are being called for as spam continues to flourish and costs people and businesses more and more in lost productivity. Brian Whitworth of the New Jersey Institute of Technology and Elizabeth Whitworth of Carleton University postulate that the most effective solution may be to design software to fulfill social requirements, bridging the divide between social expectations and computer system capability. At the root of spam's proliferation is a technology-driven social situation: Not only is it extremely cheap to send spam in bulk, but sending more spam translates into greater sender profit. The authors argue that spam does not constitute legitimate communication because it does not benefit society as a whole and is individually unfair, since it is one-way and wastes public time at little cost to spammers. They write that legitimacy analysis facilitates two-way translation between social statements and social-technical system design, and aims to specify a fair and socially beneficial scheme for online object ownership, thus instilling accountability. Legitimacy analysis is a process for establishing consistency between social expectations and software behavior by defining information system objects and methods; stating legitimate, community-accepted ownership principles; and analyzing information object ownership based on those earlier steps. By this reasoning, the authors propose that the email transmission contract should stipulate senders' acceptance of rejected messages, a requirement that creates costs for spammers. Another recommendation is a conversation request system that lets users refuse new email conversations unless they accept an invitation to converse before delivery of the first message.
The computer copy function undercuts the social idea of online privacy by eliminating users' rights to own their personal address data, and the authors' social-technical solution is software that lets users update their own data and give businesses temporary access to the data via a link rather than a copy of the data.
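The proposed conversation request system might be sketched as follows. This is a hypothetical illustration of the paper's idea: the class and method names are assumptions, and a real deployment would work through mail protocols rather than in-memory objects.

```python
class ConversationGate:
    """Sketch of the authors' conversation-request idea: a message
    from a sender is delivered only after the recipient accepts that
    sender's invitation to converse; everything else is rejected back
    to the sender, who bears the cost under the proposed
    email transmission contract."""

    def __init__(self):
        self.accepted = set()   # senders the recipient has approved
        self.pending = []       # invitations awaiting a decision

    def request(self, sender):
        """Sender asks permission to start a conversation."""
        self.pending.append(sender)

    def accept(self, sender):
        """Recipient approves a pending invitation."""
        if sender in self.pending:
            self.accepted.add(sender)

    def deliver(self, sender, message):
        """Deliver only approved senders' mail; None means rejected."""
        return message if sender in self.accepted else None
```

Because a bulk mailer must issue an invitation and absorb every rejection before any message lands, the economics that make spam profitable, near-zero cost per message, are reversed.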
Click Here to View Full Article