ACM TechNews is published three times a week, on Mondays, Wednesdays, and Fridays.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either Thunderstone or ACM.
To send comments, please write to technews@hq.acm.org.

No outside techniques exist for confirming that the votes recorded by most electronic voting systems have not been tampered with, and the National Institute of Standards and Technology has organized an Oct. 7 conference to investigate technological safeguards. The obvious solution would be to make a voter-verifiable paper trail mandatory for e-voting machines, but election experts say the widespread adoption of such a measure has been held back by political disagreements and a lack of available voter-verifiable products. "The transparency of voting systems is critical to ensuring that the public is supportive of an election, mostly proving that the loser actually lost," says ACM's public-policy director Cameron Wilson. Several bills requiring paper trails have stalled in Congress, partly because leading Republicans see e-voting reform as an attempt by Democrats to call into question the outcome of the last two presidential elections. Other obstacles include federal politicians' reluctance to shell out more money for paper trails after allocating $650 million to state officials to modernize their voting systems through the Help America Vote Act. Activists and computer scientists are also locked in intense deliberation about the effectiveness of paper receipts for detecting election fraud. Carnegie Mellon University computer science professor Michael Shamos thinks the inclusion of a paper trail in all voting machines is a bad move, as it will discourage experimentation with better approaches and infringe on voters' privacy by showing who voted first and last. In addition, paper records themselves are not immune to tampering.
Click Here to View Full Article
Cameron Wilson is the director of ACM's U.S. Public Policy Committee: http://www.acm.org/usacm

A recent executive order from President Bush has revived the erstwhile President's Information Technology Advisory Committee (PITAC) as part of the President's Council of Advisors on Science and Technology (PCAST). "The whole multidisciplinary nature of science and technology these days makes this a very good move and raises the [profile] of information technology to a new level," says PCAST co-chairman E. Floyd Kvamme. Kvamme says issuing an assessment of federal-level IT research and development will be the reorganized council's first priority. Bush's decision to let PITAC disband in June was not welcomed by many observers, who felt that future innovation, the U.S. job market, and America's standing in the global economy could be threatened by a lack of focus on IT research. Former PITAC co-chairman Ed Lazowska believes the restructuring of the council is a step in the right direction. "What [PCAST members] are able to accomplish in the IT space depends upon who is added to the committee and what priorities the co-chairs establish," he notes. Lazowska adds that the group's effectiveness in supervising U.S. IT leaders requires new PCAST members to be as renowned and knowledgeable as current PCAST members.
Click Here to View Full Article

Over 400 students, 25 companies, and 21 speakers will attend the 11th annual ACM student computing conference at the University of Illinois at Urbana-Champaign this weekend. The event will feature a job fair with representatives from such companies as Amazon.com, Google, and Morgan Stanley, while speakers will include Stephen Wolfram of Wolfram Research and PlayStation 3 chip designer Peter Hofstee. There will also be an 18-hour artificial intelligence programming competition between 16 three-person teams at the Digital Computer Laboratory. The ACM Reflections Projections Student Computing Conference is organized to assemble students from throughout the United States and give them a general overview of the computer science field. "The goal of the conference is to be half software and business and half hardware related," says University of Illinois graduate student Sameer Sundresh. The event is open to all students, not just those majoring in computer science.

Sweeping patent reforms are being stymied by differing views concerning the ownership of rights to ideas and artistic works. Many intellectual property experts conclude that long-cherished approaches to producing innovations, such as patent systems, will be retained, with many of the affected businesses preferring to let those systems evolve to catch up with the technologies they currently lag behind. Microsoft Europe's associate general counsel Horatio Gutierrez expects the software industry to experience a convergence of extreme views on "free" patents, shared copyrights, and open-source products, and he sees such convergence in several recent Microsoft deals. One deal involves a partnership with open-source developer JBoss, which Microsoft describes as an effort "to improve interoperability and ensure an optimized experience for customers using JBoss on the Windows Server platform." The other is an agreement with France's Inria computer science research institute to collaborate on research that would be owned by the institute. Making innovations accessible without limitations is the goal of several initiatives, including the GNU General Public License, which allows users to modify and distribute software. Creative Commons licensing, which permits users to copy and distribute copyrighted work as long as proper credit is given, has been adopted in roughly 30 countries. As Sun, IBM, and other companies make more and more of their intellectual property freely accessible, it becomes difficult to keep track of shared and protected rights, so one organization is promoting a database, or "patent commons," to assure users that rights are being properly enforced.
Click Here to View Full Article

Under construction in South Korea is New Songdo, a "ubiquitous city" where data is shared by all key information systems, and where residences, office buildings, and streets have built-in computers. Although South Korea is home to several U-city initiatives, Mike An of the Incheon Free Economic Zone says New Songdo will distinguish itself as "the first [city] to fully adopt the U-city concept, not only in Korea but in the world." John Kim with New Songdo City Development envisions the metropolitan infrastructure serving as a trial platform for new technologies, while the city will provide a template for the "U-life" lifestyle. He says a smart-card house key will be the starting point of U-life: The key can be used to access public transit, process transactions, borrow free materials, and more. Kim also foresees New Songdo residents enjoying universal wireless access to their digital content and property, video on demand, and videoconference calls with neighbors. Kim says people's needs rather than technology will be the central design factor for U-life digital services. New Songdo will present an opportunity to study how smart cards, radio-frequency identification (RFID), and sensor-based devices are employed on a large scale. "New Songdo sounds like it will be one big Petri dish for understanding how people want to use technology," notes B.J. Fogg of Stanford University's Persuasive Technology Lab. Institute for the Future research director Anthony Townsend says an experiment on New Songdo's scale faces fewer social and regulatory challenges than it would in the West, where ubiquitous computing technologies raise concerns about privacy infringement and invasive citizen surveillance.
Click Here to View Full Article

Defense Advanced Research Projects Agency (DARPA) director Anthony Tether says qualifying trials for this year's Grand Challenge contest have exceeded his hopes, especially compared to how poorly the contestants fared in last year's competition. The Grand Challenge is a DARPA-sponsored race between autonomous unmanned vehicles, which will be required to traverse approximately 175 miles of rough desert terrain within 10 hours on Oct. 8. The entry that finishes the race first will win $2 million, but the point of the competition is to encourage the development of robotic vehicles that can operate effectively in war zones and other hostile environments. A mere seven teams successfully completed the course of last year's qualifying event, but at least four times as many vehicles finished an even more difficult course this year. DARPA program manager Tom Strat says the key challenge the competing vehicles face is processing all the data they are receiving from sensors and GPS navigation systems so they can make decisions about how to respond to conflicts, the problem being that a computer can only sense its environment, not understand it. Still, the trials demonstrated that teams have made strides to eliminate the problems that plagued their performance in last year's Grand Challenge; the Ghostrider unmanned motorcycle, for example, was able to pick itself up on its own after tumbling. Stanford University's entry, a sport utility vehicle featuring cameras, laser guidance, an inertial system, and a half-dozen computers, completed the qualifying course with no errors. The exact route of the race course will not be disclosed until two hours before the race begins.
Click Here to View Full Article (Access to this site is free; however, first-time visitors must register.)

The concern that many employers are exploiting the H-1B visa program to replace U.S. tech workers with lower-wage foreigners is rearing its head again with a recent study from the Programmers Guild. The report says many of the companies seeking a minimum of 100 H-1B visas in 2004 intended to pay significantly less than the average U.S. tech worker's annual salary of $62,620, as calculated by the Department of Labor. Employers are required by H-1B rules to pay at least the prevailing wage for the occupation or the actual rate the employer pays to similar professionals, and Programmers Guild President Kim Berry says the study highlights a loophole that allows employers to ascertain the prevailing wage from numerous sources. "The law should specify a minimum salary, above the median wage of comparable U.S. workers," he recommends. The Programmers Guild report revives suspicions that technology staffing firms accused of abusing the H-1B program have not gone away, as well as views that India-based employers operating in the United States are the program's chief beneficiaries. The study estimates that almost 37 percent of H-1Bs approved in 2003 were for Indian-born workers, and Berry reckons that Indian citizens or U.S. citizens of Indian descent run 18 or 19 of the 20 lowest-paying employers, among companies seeking at least 100 visas. U.S.-India Action Committee Chairman Sanjay Puri dismisses the idea of abuse with his argument that Indian H-1B holders now have considerably more opportunities, both in America and back home, than previously. Rochester Institute of Technology professor Ron Hira says Indian-based firms are using guest worker visas to promote offshoring.
Click Here to View Full Article

At the recent Hack In the Box event in Malaysia, security researcher Dave Aitel showed off a demo of a "Nematode" framework for creating a benign computer worm that he believes organizations will employ to reduce the costs of network security. "With this [Nematode] concept, you can take advantage of automating technologies to get protection for pennies on the dollar," he said. Aitel said the nematodes, or nonmalicious worms, can be automatically generated from available vulnerability data, and he envisions a time when ISPs, large companies, and government organizations deploy "strictly controlled" nematodes to make security more cost-efficient. Aitel's concept involves the employment of servers, or "Nematokens," that only respond to requests from networks cleared for assaults, and the Nematode Intermediate Language (NIL), a programming language for creating the worms. Exploits can be rapidly and simply converted into nematodes through use of the NIL. Prior to his current stint at the Immunity security firm, Aitel worked as a computer scientist at the National Security Agency and then as a code-breaker for @Stake. Commercial technology that enables networks to protect themselves automatically will be available within five years, Aitel reckons.

The IST-sponsored NOBEL project set out to achieve widespread broadband implementation throughout Europe at a moderate cost. Broadband adoption has been slowed by concerns about sinking large investments in relatively new technologies, such as networking through optical fiber; the project, which concludes in December, has sought to develop an intelligent and flexible infrastructure, and also to create standards to ensure that implementation can occur throughout the European Union. A broad partnership of telecommunication network operators, research centers, and equipment manufacturers was created to develop a more comprehensive and expedient approach to broadband implementation. The coalition examined broadband development over the next few years, and then looked at progress in the medium term before finally considering development in the long term. In each period they looked at network technologies, services, and the control plane that enables automatic configuration. In the short term, they examined the adoption of Layer 3 VPNs and various IP and Multiprotocol Label Switching technologies; in the intermediate term, network traffic management is likely to become more sophisticated, and Layer 2 VPNs are also expected to become a reality; in the long run, the researchers anticipate burst-switched networks and Layer 1 VPNs. The researchers said that it will be services, rather than technologies, that drive broadband development. The home PC will be broadband's central market driver, followed by ADSL technology, while new online services and emerging video technologies will also play a critical role in the success or failure of broadband in Europe.
Click Here to View Full Article

UC Berkeley alumnus and computing pioneer Jeff Hawkins, whose innovations include the PalmPilot, has endowed a new research center to devise mathematical and computational models of the human brain with a $4 million gift from him and his wife, Janet Strauss. The Redwood Center for Theoretical Neuroscience will be headed by Bruno Olshausen, a former researcher at the Redwood Neuroscience Institute, which was also started by Hawkins. Olshausen and his research team will strive to generate a model of the brain that will ultimately be imitated by a computer. "Our goal is to develop a theoretical framework for the neocortex--the outer layer of the brain involved in conscious perception and action," Olshausen explains. Hawkins believes enough knowledge about neocortical function has been accumulated to create software informed by the brain's algorithms, and thus teach computers to perform operations that would be defined as intelligent; his new startup, Numenta, aims to put this theory into practice. Hawkins says Numenta's technology could enable computers to understand spoken or written content, or facilitate fluid robotic walking, for instance. He says the technology "can understand complex worlds and make predictions about them, and that is the core of what intelligence is all about--forming an understanding about a world and then predicting the future." UC Berkeley neuroscience and psychology professor Bob Knight expects the new center will help campus researchers gain a better understanding of the huge data sets produced by brain studies through innovative theoretical and computational viewpoints.
Click Here to View Full Article

IT careers still hold much promise despite globalization, outsourcing, and other trends that have discouraged people from becoming IT professionals. According to the Bureau of Labor Statistics, computer scientists, database administrators, and computer-system analysts will experience some of the fastest job growth through 2012, while the National Association of Colleges and Employers points to increasing salaries for IT graduates. But the percentage of U.S. students majoring in science and technology is declining precipitously, and more people and companies must start promoting IT's benefits in order to inspire the best and brightest to follow an IT career track. Stanford computer science chairman Bill Dally believes researchers and companies could make a greater effort in demonstrating to students that meaningful innovations can still be realized through IT. MIT lecturer Jack Rockart says the most demoralizing factor for prospective IT employees is apprehension that IT jobs will become scarcer and scarcer because of outsourcing and overseas competition. Web-user interface developer Diane Zhang anticipates plenty of opportunities in IT, provided professionals take the long view of what constitutes an "IT person" and never lose sight of customer and business needs. Microsoft and the Society of Information Management have organized an educational outreach program designed to interest college students in IT. The industry must also directly address the issue of global competition for IT jobs, while taking into account that young people will probably be vying and collaborating with people worldwide.
Click Here to View Full Article

As wireless providers look to expand their industry beyond cell phones, machines are becoming a more appealing target. Machine-to-machine (M2M) communication holds vast potential to automate everyday tasks performed by service technicians, and, ultimately, to improve customer service; for instance, a soda machine could use a wireless connection to notify a local vendor that it has run out of change. Increased bandwidth could enable such M2M applications as utility companies collecting household usage data through sensors and copiers notifying technicians of paper jams. The pervasiveness of cellular technology, and the associated falling costs, has spurred increased interest in M2M technology, while the advent of XML and IP has also made it much easier to establish network connections, reducing the time required to build M2M networks. M2M users are valuable customers for wireless carriers, as they are less likely than individual consumers to switch providers. Still, cellular M2M communication is an emerging field with an underdeveloped infrastructure, and as usage becomes more widespread, it is increasingly difficult to synthesize all the information produced. To address this issue, some companies are exploring smart dust, a self-organizing method of automating data collection and evaluation. Despite the multitude of connectivity options confounding the market, it is estimated that the M2M sector will experience tremendous growth in the near future, as the number of modules sold is expected to grow from 7 million in 2004 to 70 million by 2008.
Click Here to View Full Article

Experts at MIT's Emerging Technologies Conference expect a new era of interpersonal communications to arise from the advent of social networks and applications that assess personal behavior and gauge basic human traits such as honesty and likeability. MIT Toshiba Professor of Media Arts and Sciences Dr. Sandy Pentland noted that MIT and other institutions are developing voice recognition software that can already determine whether a person is lying over the phone, failing to engage an audience's interest, or more likely to break up with their spouse. "By engaging in reality mining, or paying attention to patterns of movement, activity or a person's tone of voice, to examine the way people truly feel, we can build better forms of social software," said Pentland. Dodgeball.com creator Dennis Crowley pointed to the emergence of geographically aware social networks thanks to a resurgence in locative technologies, Dodgeball being a case in point; the social networking application enables people to hook up by sharing personal information through their computers and wireless devices. Meanwhile, Del.icio.us Web site founder Joshua Schachter said the rapid adoption of email, instant messaging, and mobile devices is converging to pave the way for new applications that try to take advantage of human intelligence to some degree. IT.com CEO Mark Cordover said he anticipates the development of new social rules that address challenges to privacy and eavesdropping opportunities presented by social computing applications. Many conference presentations cited socially beneficial technologies: Entrepreneur Nolan Bushnell detailed his latest brainchild, an educational software system designed to make learning more fun and interactive by exploiting kids' love of video games.
Click Here to View Full Article

Researchers at Virginia Tech's Center for Wireless Telecommunications are developing a cognitive radio solution that could enable more effective communication for first responders and safety officials in the event of a disaster by significantly cutting Wi-Fi interference and lowering the incidence of dropped calls and blocked calls in cellular phones. Center director and professor Charles Bostian says cognitive radio was presented as a solution to the problems of communications technology that required broadband transmissions to be bounced off walls and rubble. Electrical and computer engineering Ph.D. candidate David Maldonado says cognitive radio represents an attempt to apply knowledge accumulated through past developments of artificial intelligence to radio. Student contributions have been critical to the project, which involves, among other things, the development of what Bostian terms a "cognitive engine." He says, "It basically is a way of turning the knobs on the radio and then reading the meters...it tries something, sees what happened, decides what was good or bad; if it was good it tries to make it better and if it was bad, it won't do it again."
Click Here to View Full Article
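The "cognitive engine" Bostian describes is essentially a closed feedback loop: try a configuration, measure the result, keep what helps, and abandon what hurts. A minimal sketch of that trial-and-error loop follows; the knob names (`channel`, `power_dbm`) and the `link_quality` scoring function are hypothetical stand-ins for real radio parameters and meter readings, not part of the Virginia Tech system.

```python
import random

def cognitive_engine(evaluate, knobs, rounds=200):
    """Trial-and-error tuner sketching the loop Bostian describes:
    turn the knobs, read the meters, keep changes that improve the
    link, and discard changes that worsen it."""
    # Start from an arbitrary configuration and score it.
    settings = {name: random.choice(list(values)) for name, values in knobs.items()}
    best_score = evaluate(settings)
    for _ in range(rounds):
        trial = dict(settings)
        name = random.choice(list(knobs))               # tries something...
        trial[name] = random.choice(list(knobs[name]))
        score = evaluate(trial)                         # ...sees what happened
        if score > best_score:                          # good: keep it, build on it
            settings, best_score = trial, score
        # bad: the trial is simply discarded ("won't do it again")
    return settings, best_score

# Toy "meter": a hypothetical link-quality score that favors channel 6
# and transmit power near 15 dBm (purely illustrative numbers).
def link_quality(s):
    return -abs(s["channel"] - 6) - 0.1 * abs(s["power_dbm"] - 15)

knobs = {"channel": range(1, 12), "power_dbm": range(5, 31)}
best, score = cognitive_engine(link_quality, knobs)
```

A production engine would score real measurements such as signal-to-noise ratio and dropped-call rate, and would remember past decisions across runs, but the accept-or-discard structure of the loop is the same.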

The Working Group on Internet Governance (WGIG) is calling for more participation from businesses and the technology industry in the development of Internet policies and structures. Markus Kummer, executive coordinator of the WGIG, recently explained the areas covered by the term "Internet governance" at the Global Public Policy Conference in Malaysia. Kummer noted that, in addition to policies regarding Internet names and addresses, Internet governance also encompasses "significant public policy issues, such as critical Internet resources, the security and safety of the Internet, developmental aspects, and issues pertaining to the use of the Internet." Kummer says the WGIG wants all stakeholders, including the private sector, civil society, members of the technical community, and governments, to participate in Internet governance. Karine Perset, policy analyst for the Organization for Economic Cooperation and Development, says the private sector should have a leadership role in Internet governance, with governmental/inter-governmental oversight brought in only for emergency situations and threats. Meanwhile, Ang Peng Hwa, director of the Singapore Internet Research Center, supports the WGIG model, saying, "The government, private sector, and civil society are all essential for good Internet governance."
Click Here to View Full Article

Researchers at MIT, the San Diego Supercomputer Center (SDSC), IBM Research, and other major research facilities offer observations about various technological developments that are likely to be of interest to enterprise IT adopters. Microsoft is working to improve the reliability of systems through the deployment of run-time environments that spot and prevent numerous software errors, while IBM, Intel, and MIT are pursuing the improvement of team interaction. Intel's Cindy Pickering anticipates better interaction via the integration of multiple communications modes into a single environment for collaboration, and IBM's Daniel Yellin points to the incorporation of collaborative features into Eclipse. Automation of complex operations and systems capable of self-repair and self-management are also areas of interest, though Steve White of IBM Research says system managers' expectations must be respected, with emphasis on increasing transparency and keeping false alarms to a minimum. MIT's Jerrold Grochow stresses the need for a robust cybersecurity infrastructure effected via a service-oriented architecture, and many researchers agree that the security problem could be addressed by considering security upfront. SDSC's increasingly heterogeneous computing workload and storage requirements highlight the need for low-cost durability, high-disk duty cycles, and other methods for efficiently managing vaster data streams. Sun Microsystems and others are channeling a lot of energy into developing more cost-effective processing architectures, with Sun Solaris Group Manager Chris Ratcliffe witnessing a migration to multicore architectures among increasing numbers of system builders.

Among Sun's research and development initiatives, chief researcher John Gage counts the work on Opteron architectures as the most interesting, which includes a streaming device and an improvement on RAID arrays that could, in theory, lead to perfect, permanent storage. In the processor area, the conceptual work conducted on multithreading has led to Niagara, an eight-core, 32-thread UltraSPARC engine to be released in servers in 2006. Gage also touts the emergence of Rock, an exercise in simplicity that will dispense with deep pipelines, out-of-order execution, and other factors unnecessarily complicating CPU subsystems. Sun is also interested in content management projects such as WGBH-Boston's metadata initiative, aimed at developing standard descriptions for the content of a digital media repository. Gage anticipates a shift coming in the area of spectrum ownership, as the radio industry gradually evolves away from its proprietary model to embrace the increasing acquisition of towers by wireless providers. There is also a social aspect to Gage's work with Sun, as he founded NetDay, a program to deliver technology to underfunded schools, and was active in Sun's partnership with U2, in which the company provided a Java-based technology to enroll concert patrons in Bono's ONE program for poverty and AIDS relief in Africa. Sun is currently exhibiting its new residential access box with a vast storage capacity and sophisticated content management abilities, though Gage notes that Sun is more interested in broad industry acceptance of the technology than in marketing the individual devices to residential consumers. Gage maintains that Sun's ongoing Java initiative is nothing without innovation, and that the company's DTrace analysis technology has helped catch numerous errors in both hardware and Solaris services.
Click Here to View Full Article

Co-chairman of the President's Information Technology Advisory Committee (PITAC) Ed Lazowska says inaction is the order of the day among government, CIOs, and vendors as far as cybersecurity is concerned. He accuses the Bush administration of undervaluing science, engineering, education, and research, which means that CIOs will be prevented from purchasing desperately needed cybersecurity products unless they pressure the government as well as pay for cutting-edge products as a demonstration of their commitment to cybersecurity. Lazowska says an attack on the nation's IT infrastructure could have serious ramifications for its critical infrastructure, while the military's dependence on commercial vendors for most of its hardware and software makes it highly vulnerable to cyberattacks as well. He cites a PITAC study that singles out three federal agencies as particularly deplorable in terms of cybersecurity funding: The Homeland Security Department, which currently commits a mere $18 million of its approximately $1 billion annual science and technology budget to cybersecurity; the Defense Advanced Research Projects Agency, whose investment in mainly classified cybersecurity programs shuts the door to premier academic researchers and yields products of little immediate value to commercial IT systems; and the National Science Foundation, which could only fund a small portion of its Cyber Trust program. Lazowska says current cybersecurity efforts are all about "Band-Aid" solutions, when what should be developed are new system architectures with long-term applications, static and dynamic vulnerability detection tools, programming languages with basic security functionality, and methods for building trusted software systems from diverse elements.
Click Here to View Full Article

Developing futuristic but practical technologies is the motivation behind Applied Minds, a company whose gadgets are a testament to the childlike imagination of its co-founder, inventor Danny Hillis. Applied Minds' first commercialized product, developed in collaboration with the Herman Miller office furniture company, was Babble, a device designed to increase the privacy of office cubicle workers by masking their conversation with a soundtrack of scrambled, nonsensical vocal fragments. Meanwhile, Hillis' company is developing a personal robot dubbed "the mule" with Northrop Grumman that seeks to relieve soldiers of the burden of carrying their own water, communications equipment, and batteries. The mule, which trails behind the soldier, produces water from air and has built-in broadband. Another fascinating Applied Minds product is the 2.5-D Display, a table that can present topographical information for any location on Earth as a detailed physical model. As a student at MIT, Hillis envisioned the concept of a thinking machine, which inspired him to re-imagine the computer's "brain" as a combination of thousands of interoperating processors rather than one processor. His idea of "parallel processing" inspired a doctoral thesis as well as a company, Thinking Machines. Despite his many accolades and patents, Hillis asserts that "people tend to overestimate the individual inventor and underestimate the system that makes their inventions real."
Click Here to View Full Article