ACM TechNews is published three times a week: Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM.
To send comments, please write to technews@hq.acm.org.

A study commissioned by Open Source Risk Management (OSRM) has found that the Linux kernel could potentially infringe on 283 registered software patents, 27 of which are owned by Microsoft. Free Software Foundation senior counsel and patent attorney Dan Ravicher, who conducted the study, says these results do not imply the end of Linux; rather, he claims, the conclusions are "very similar to the result you would get if you investigated any other software program that's as successful as Linux." The study examined patents issued by the U.S. Patent and Trademark Office, distinguishing those that had been upheld in court from those that had never been tested: Ravicher used automated search tools to whittle the field down to 400 court-tested patents and 1,000 untested patents, and all 283 troublesome patents came from the untested group. Kirkland & Ellis software lawyer Jeffrey Norman explains that open-source software is by its very nature more susceptible to patent claims than proprietary software, and that because software development is still a relatively young field, it is easy to patent methods that, even if unique, probably should not be patentable. Norman believes it is only a matter of time before a patent lawsuit related to open-source software is filed, though Microsoft will not necessarily be the plaintiff. The lawyer also doubts that Ravicher's study is an exhaustive evaluation, given the massive amount of code in the Linux kernel and the sheer number of software patents that have been granted. The majority of patents that could potentially be leveraged against the kernel are held by IBM, a Linux ally; no SCO Group patents were determined to be a threat to the Linux kernel, despite SCO's claims of copyright violation.
Click Here to View Full Article

Government officials and security experts say that a coordinated cyberattack against the United States could do a great deal of damage to infrastructure, government, and business. However, it is not clear who would launch such an attack--al Qaeda, for instance, is focused more on physical threats than electronic ones. A vast array of would-be attackers, including hackers and criminals from other countries, threatens the nation's digital infrastructure, says Amit Yoran, director of the Homeland Security Department's National Cyber Security Division. More than 24 nations have developed "asymmetrical warfare" strategies that target holes in U.S. computer systems, and military experts say those nations see the electronic arena as the best way to get past traditional U.S. defenses. The attacks could take the form of worms, viruses, other malicious programs, or denial-of-service attacks. A Homeland Security report released last year encouraged the use of firewalls, security plans, and antivirus software, but critics say it lacked regulation and funding. "It is a good description of the problem, but doesn't put the onus on the people who can fix it, such as the software developers," says SANS Institute director Alan Paller.
Click Here to View Full Article

Project Columbia is a joint venture between NASA, SGI, and Intel to increase NASA's current supercomputing capacity by a factor of 10 and provide an integrated computing, visualization, and data storage environment known as the "Space Exploration Simulator." The project will integrate 20 SGI Altix 512-processor systems, collectively encompassing 10,240 Itanium 2 processors; the new supercomputing cluster will reside at NASA Ames Research Center in Silicon Valley. Project Columbia is designed to fulfill the White House's mandate to federal agencies and the recommendations of the Office of Science and Technology Policy's (OSTP) High-End Computing Revitalization Task Force, as well as remove the supercomputing resource shortfalls that were exposed during the investigation of the Columbia tragedy and shuttle return-to-flight activities. "NASA is excited to be working with industry in an innovative way to allow the agency to deploy a versatile capability in supercomputing," noted NASA administrator Sean O'Keefe. "This will enable NASA to meet its immediate mission-critical requirements for return to flight, while building a strong foundation for our space exploration vision and future missions." An earlier collaboration between NASA and SGI led to the development of Kalpana, the first 512-processor Linux server, which has significantly advanced shuttle modeling and high-resolution oceanic simulation. The new cluster will enable this work to continue while also supporting new projects in space and life sciences, exploration systems, aeronautics, and mission safety. A segment of the system, which will be constructed and deployed over the next three months, will be made accessible to the U.S. science and engineering community, per OSTP recommendations.
Click Here to View Full Article

In an effort to help simplify the writing of Java applications, IBM will today announce a contribution of over 500,000 lines of proprietary software code to an open source software group. "We hope to spur the further development of the Java community," declares Janet Perna, IBM's general manager for data management software. The Apache Software Foundation will receive the code for the Java-based Cloudscape database, which is valued at $85 million. Any growth in Java applications fueled by this move will almost certainly benefit IBM by providing more potential uses for its WebSphere platform, which runs and manages such applications in direct competition with Microsoft's .Net platform. Cloudscape is designed to be used as a single database contained within a software application rather than a complete database program that functions by itself in corporate data centers; Java specialists believe the Cloudscape code could be very appealing as a basic Java database. Apache will rename the Cloudscape database Derby, and retain licensing and intellectual property rights to the code. Industry analysts call this development a further illustration of IBM's support for open source projects to which it has contributed staff, marketing dollars, and code. Forrester Research analyst Mike Gilpin says, "The Cloudscape code is not a major factor in IBM's overall platform strategy. So this makes sense for IBM."
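Cloudscape/Derby is a Java library, but the embedded-database model it represents -- a full SQL engine running inside the application process rather than as a separate server -- can be illustrated with Python's built-in sqlite3 module, used here purely as a stand-in for Derby:

```python
import sqlite3

# An embedded database lives inside the application process: opening a
# connection creates or attaches the storage directly; there is no
# separate database server to install, start, or administer.
conn = sqlite3.connect(":memory:")  # or a file path for persistent storage
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
conn.commit()

rows = conn.execute("SELECT name FROM users").fetchall()
assert rows == [("alice",)]
conn.close()
```

In Derby the equivalent is a JDBC connection to a `jdbc:derby:` URL inside the same JVM; the point in both cases is that the database ships as part of the application rather than as standalone infrastructure.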
Click Here to View Full Article (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

Approximately 11,000 people are expected to attend the LinuxWorld Conference & Expo in San Francisco this week, where a chief topic will be Linux's penetration of desktop systems. Linux products and implementation strategies will be showcased by over 190 exhibitors, including IBM, Red Hat, Intel, Hewlett-Packard, Dell, and Oracle. IDG World Expo President Warwick Davies observes that Linux is growing in popularity, and IDC analyst Dan Kusnetsky notes that Linux could be seen as a feasible alternative by consumers who want to move away from Microsoft's Windows operating system, provided Linux can support email as well as access to Web-based applications and the Internet. He adds that Linux could also become appealing to developers of platform-neutral software, as long as they can create Java-based applications and Web services using proper tools and at reasonable prices. Kusnetsky reasons that many workers' requirements could be fulfilled with an operating system that supplies common applications (Web browser, email agent, Java virtual machine, etc.), so organizations could give them a system in which Linux functions as an underlying client operating environment for either client/server or Web server applications. "The whole industry is starting to wake up to the possibility of Linux on the desktop," says Illuminata analyst Jonathan Eunice, although he cautions that Linux is at an early stage. Davies says there will be increased emphasis on Linux security at LinuxWorld, as well as discussion on incorporating Linux into corporate environments without jettisoning existing corporate systems. IDC expects Linux's market share to expand from 2.7 percent with 3.4 million paid license shipments in 2002 to 6 percent with over 10 million shipments in 2007, according to Kusnetsky.
Click Here to View Full Article

Cryptographic hash functions are one of the most useful mathematical tools in computing today, because they allow people to easily protect passwords, stored files, and even database information. One of the most recent applications comes from three Stanford University researchers, who created a browser plug-in that scrambles one easily remembered password into different passwords for different e-commerce sites based on those sites' Web domains; this protects people from hackers who could otherwise use a uniform log-in and password to gain access to multiple accounts, while providing users with the convenience of remembering just one set of identifiers. Yahoo! also uses a version of hash cryptography in its registration process: the user's computer is sent a "challenge" sequence that is combined with the entered password and hashed before transmission, protecting people using insecure public terminals from hackers sniffing Web traffic, for example. Hash functions are mainly based on research done in the 1980s by RSA co-inventor and MIT professor Ron Rivest, who developed the system as a way to ensure the integrity of a file; hash codes computed from a set of computer files can tell the owner whether those files were tampered with, because any change in the input produces a different hash code. Hashing is also used in the Surety secure timestamp service to verify that a file existed at a certain time; this involves publishing the hash code in a well-known location owned by a third party, such as the New York Times classifieds. Although the Message Digest #5 (MD5) hash function is the most widely used today, perhaps the most secure is the U.S. government's Secure Hash Algorithm, or SHA-1, which caused some controversy at the time of its announcement because cryptographers theorized it contained a backdoor for U.S. intelligence services.
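The per-site password scheme can be sketched in a few lines: derive each site's password by hashing the master password together with the site's domain. This is an illustrative construction only -- the Stanford plug-in's exact algorithm differs, and the function name here is hypothetical:

```python
import hashlib
import hmac

def site_password(master_password: str, domain: str) -> str:
    """Derive a per-site password from one master password and the
    site's domain, so a breach at one site does not expose the others.
    Keyed hashing (HMAC) keeps the master password out of the output."""
    digest = hmac.new(master_password.encode(), domain.encode(),
                      hashlib.sha1).hexdigest()
    return digest[:12]  # truncate to a manageable password length

# The same master password yields unrelated passwords per domain,
# and the derivation is deterministic, so nothing needs to be stored.
pw_a = site_password("hunter2", "shop.example.com")
pw_b = site_password("hunter2", "bank.example.net")
assert pw_a != pw_b
```

The Yahoo! challenge scheme works the same way in reverse: the server sends a fresh challenge, the client returns `hash(challenge + password)`, and an eavesdropper who captures the response cannot replay it against a different challenge.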
Hash functions continue to be used in innovative ways, and may eventually be used to secure entire databases, as proposed in Peter Wayner's book "Translucent Databases."
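One reading of the "translucent database" idea is to store only hashes of sensitive fields: equality lookups still work, because the same input always hashes to the same value, but the raw data never sits in the database. A minimal sketch, with a hypothetical salt and field values:

```python
import hashlib

def protect(value: str, salt: str = "per-table-salt") -> str:
    """Replace a sensitive field with its salted hash. Equality lookups
    still work by hashing the query value the same way; the raw value
    is never stored. (Salt and values here are illustrative only.)"""
    return hashlib.sha256((salt + value).encode()).hexdigest()

# The table keys on the hash, not the Social Security number itself.
records = {protect("123-45-6789"): {"plan": "premium"}}

# A lookup hashes the query; a stolen copy of the table reveals no SSNs.
assert protect("123-45-6789") in records
assert protect("999-99-9999") not in records
```

The trade-off is that only exact-match queries survive; range queries and partial matches over the protected field become impossible, which is the "translucency."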
Click Here to View Full Article

Researchers at the National University of Singapore (NUS) and the Defense Science and Technology Agency (DSTA) have been working for two years on a system to power portable devices such as MP3 players and cell phones by harnessing electricity generated by the human body. NUS assistant professor Adrian Cheok detailed one such application at the Mobility Conference yesterday: The scheme involves affixing a piezoelectric material onto the soles of a pair of shoes so that electricity is generated when the wearer walks. Cheok theorized that the electrical current produced by the human body can also be used to pass information between people through their skin, making the human body another device incorporated into a personal area network. The DSTA-funded research has almost reached completion, and the project's first applications will be military. Such technology has been the focus of research at MIT, IBM, and elsewhere. The conference where Cheok's discussion took place was the first hosted by ACM's SIGCHI Singapore chapter. Meanwhile, University of Maryland professor Zachary Segall discussed the creation of devices that learn how to cooperate with humans better: "We're trying to see how to make computers more human-literate, rather than making humans computer-literate," he noted. An August 3 presentation from University of Udine professor Luca Chittaro at the conference will detail how to simplify the viewing of data on mobile devices, which is particularly challenging because the amount of information available for viewing is surging while portable devices are shrinking.
Click Here to View Full Article

Researchers at Sweden's Royal Institute of Technology, University College London, the Viataal software company in the Netherlands, and the Belgium-based Babletech voice analysis firm have developed a prototype system that supplies an animated face to help people with hearing difficulties understand what is being said on the other end of a phone line. Synface, as the system is called, runs on a conventional laptop and can be plugged into any kind of phone. The software matches voice to mouth movements using a neural network that identifies phonemes rather than whole words, which is a fast way to match words to animation while also enabling the system to more accurately represent unfamiliar words. Synface can produce animated enunciations in about one-fifth of a second, and incorporates a fractional delay so that the animated face and the voice stay in perfect sync. Synface was tested at Britain's Royal National Institute for the Deaf (RNID), and results showed that 84 percent of hard-of-hearing volunteers were able to recognize words and converse normally over the phone using the system. Synface has been trained to function in English, Dutch, and Swedish, and could be refined to accommodate various regional dialects. The system was designed as a tool for people who have some difficulty hearing rather than those who are severely hearing-disabled.
Click Here to View Full Article

Organizations are increasingly burdened by peer-to-peer (P2P) traffic because it cuts into employee productivity, network bandwidth, and enterprise security. P2P traffic is at an all-time high across the Internet, consuming anywhere from 30 percent to 70 percent of ISP resources, depending on whether they operate on the edge or the core of the Internet; and according to a Blue Coat Systems survey, almost 40 percent of respondents said they used P2P applications while logged onto corporate networks. LexisNexis deals with the P2P threat using several network management technologies that target peer-to-peer traffic, which is often concealed using dynamic ports, hashes, tunneling, and a number of other techniques that elude traditional network management. LexisNexis uses Web-filtering software to set limits for different users depending on their function or location. Employees in the legal and human resources departments do not have restrictions, for example, nor do employees in Europe, where the culture is more lenient. U.S. P2P use is actually less than that in Europe, but still comprises a significant portion of traffic: Websense's Leo Cole says the number of tracked P2P sites has grown from just 434 sites to more than 2,000 sites in the past year. Tools that monitor the underlying P2P protocols are vital, because those applications serve as entry points for malware such as MyDoom/Sober-C, various Trojan horses, and spyware. As much as 56 percent of P2P content is also copyrighted, leaving companies open to lawsuits from organizations such as the Recording Industry Association of America. Not all P2P applications are bad for business, however, and the BBC and Red Hat have separately made content and software distributions available via P2P file-sharing.
Click Here to View Full Article

In an interview, Michael Bove, director of MIT's new Consumer Electronics Lab, says the facility is focusing on five key areas: Power efficiency and tapping new sources of power; sensors and communicating the information collected by sensors; cooperative wireless communications systems; self-organizing ecosystems of intelligent devices; and the design and manufacture of new materials. Bove says the lab is studying the generation of power on an as-needed basis as well as parasitizing power from unusual sources, such as Wi-Fi networks. He notes that the lab is concentrating more and more on collections of smart devices that work together to solve problems, and points out that while the concept of an all-in-one device has its uses, it cannot fulfill every function. Bove says his lab is collaborating with Seoul's Information and Communications University on smart architectural surface tiles that each consist of a computer, display, camera, microphone, speaker, and sensors. The low-cost, power-efficient tiles communicate with each other wirelessly, and can run applications individually or as a unit. Bove says the chief characteristic of an electronic ecosystem is seamlessness: "It would be really sort of nice if, wherever I was and whatever access means I had to the network, all of my information just appeared to be there," he muses. An electronic ecosystem would maintain seamless communications and data regardless of where a user is or what kind of network he is on, even if he switches between contexts. Bove does not expect PCs to be driven into extinction by the advent of electronic ecosystems, but their role will change; he predicts that the PC is "going to be a partner in a bigger enterprise as opposed to being the enterprise."
Click Here to View Full Article

University of Wollongong accountancy lecturer George Mickhail is now available to students anytime and anywhere--as a Web-based version of himself driven by artificial intelligence. Virtual George, as the program is called, started life in 2003 as a text interface, but has evolved into a program that can carry on conversations with users by employing a database of linguistic terms. Its behavior and interaction with students was refined through the study of real chat sessions and their outcomes, and the real George Mickhail says the program is so successful that students thank him for assisting them even though it was Virtual George who actually helped them. Among the problems that Virtual George can help mitigate is an overwhelming workload for lecturers caused by the increasing student population, more demanding fee-paying students, and growing use of casual academic staff, leaving lecturers with less time to focus on research and other high-value activities. In addition, Internet-based training has led to an overload of student email queries, while students themselves are facing increased workloads and subsequently need access to information resources and aid outside of regular university hours. Mickhail says that Virtual George becomes familiar with a user's inclinations and engages in conversations that cover a wide range of subjects the user is interested in; the program can then broaden the discourse to include other subjects that may be of substance. A course query yields definitions of terms, links to course materials, and multilingual video and audio streams. The system, which is available for free, is promoted by Mickhail as a "private tutor for students and a personal assistant for academics."
Click Here to View Full Article

The bulk of the 9/11 Commission's 567-page report focuses on the structural failings of government and intelligence agencies, but the document does contain advisories for technology improvements, most notably a total back-office renovation of information sharing. The report recommends that "information be shared horizontally, across new networks that transcend individual agencies," which would enable agencies to access each other's databases as well as their own, and increase the amount of information available. The study cited a report from the Markle Foundation Task Force outlining an infrastructure comprised of a network of linked databases that employ directory services, metadata standards, encrypted storage systems, search tools, rights management, and authentication technologies that would allow analysts and agencies to access terror intelligence on an entirely new level; but experts such as GlobalSecurity.org's George Smith caution that this deployment may be impossible, or at least very difficult, because of agencies' reluctance to share information. The technical challenge itself is formidable: Not only would the infrastructure have to migrate from a centralized to a decentralized model, but an intimidating amount of information would have to be collated and shared. Furthermore, several agencies have already devised their own analysis infrastructures and are expending precious resources trying to copy each other's initiatives. The 9/11 Commission thinks the CIA's Terrorist Threat Integration Center could be a solid foundation for a new infrastructure, although it still lacks oversight and operational authority; the center is set up so that analysts can access 14 government networks, with 10 more on the way. The Markle Foundation's Todd Glass reports that this overhaul could be immediately instituted because it has broad government backing and the technology to build the infrastructure is already available.
Click Here to View Full Article

Sen. John Kerry (D-Mass.), Democratic presidential nominee, has a mixed record on technology issues, writes Declan McCullagh. In 2002, Kerry ignored Intel executive VP Leslie Vadasz's warning to the Senate Commerce committee that a bill proposed by Sen. Fritz Hollings (D-S.C.), which called for the incorporation of copy protection controls in all consumer electronic devices, was a bad idea. Kerry also supported unpopular legislation that would have imposed tougher data collection rules on Internet firms than on the rest of the American economy. In the 1990s, Kerry came out in favor of key escrow--the insertion of back doors in encryption products so that law enforcement and intelligence agencies would have easy access to private information--several times, but these proposals were killed in the wake of heavy criticism from organizations such as the ACM and IEEE. More positive is Kerry's long-time support for the Internet Tax Freedom Act, which is reflected in his current agenda to renew the moratorium on Internet access taxation, and his position that the 1996 Telecommunications Act should be amended to repeal its ban on distributing information about abortion online. Less encouraging is his support for the Patriot Act, although he is currently sponsoring a bill to revise it so that certain law enforcement practices related to surveillance and search warrants are constrained. Kerry's Broadband Deployment Act of 2001 offers tax incentives to businesses that supply high-speed Internet access to "underserved subscribers," but McCullagh wonders whether such incentives could have been more beneficial to the development of remote medicine, Internet security products, or nanotechnology.
Click Here to View Full Article

Robotics Trends President Dan Kara projects that 4 million personal robots will ship in 2006, while the United Nations Economic Commission for Europe forecasts that over 2.1 million personal robots will be sold between 2003 and 2006. Many of these machines will only function to entertain, but the ones that can perform household chores should experience the fastest growth in terms of popularity. Current domestic robots can only perform one task, but developers are competing to create robots that can multitask. Most of this development is taking place in Japan, where research has yielded robots that can walk, such as Honda's Asimo and the Kawada HRP-2. The HRP-2 is humanoid in configuration and can recover from falls; the motivation for building such machines is largely concern over how to care for the geriatric population. Another notable Japanese robot is the wheeled Wakamaru from Mitsubishi Heavy Industries, which can recognize voices and faces. Analyst Rob Enderle expects about two decades will pass before robots can mimic life so convincingly that people will regard them as living beings. In the more immediate future is inexpensive software that uses artificial intelligence to converse fluently with its owner and function as a multitasking digital assistant. The U.N. study estimates sales of entertainment robots will surge from 545,000 in 2002 to 1.5 million in 2006. Kara observes that robotics development in the United States is chiefly focused on military systems and products that yield immediate profits.
Click Here to View Full Article

Computer security consultant Dan Kaminsky blew the lid off of the domain name system's (DNS) true abilities at the recent DefCon hacker conference. Kaminsky illustrated how DNS can be used to broadcast audio clips across the Internet or as a hacker tunnel into otherwise secure networks. Roughly 2 billion computers worldwide are linked to the Internet via DNS, says co-inventor Paul Mockapetris, making it a prime target for ambitious hackers. The Internet Engineering Task Force is already working on extending the system to translate people's phone numbers into multifaceted identifiers that can allow people to be contacted by email or instant messaging if they are not available to take a call, for example. The ENUM service uses DNS to answer specially coded queries with simple programming commands so that phone numbers can be automatically linked to other modes of communication. By early next year, 20 countries plan to roll out ENUM capability, though privacy advocates say the system will be quickly exploited by spammers and marketers. Kaminsky demonstrated that DNS exploitation does not even require specialized services such as ENUM, but can be easily built using current DNS infrastructure: He set up a DNS server at doxpara.com to return small portions of audio content instead of IP addresses. Queries can specify the audio a user wants; 1234.doxpara.com will return 1,234 half-seconds of an audio file. Because of the distributed nature of the DNS infrastructure, not all the information is retrieved from doxpara.com; it can also be accessed from local nodes once the data is cached there.
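Kaminsky-style DNS tunneling rests on the fact that a query name can carry arbitrary data in its dot-separated labels, and the server's answer can carry data back. A minimal sketch of the encoding side -- the doxpara.com zone is borrowed from the article purely as a placeholder, and nothing here touches the network:

```python
import base64

MAX_LABEL = 63  # DNS limits each dot-separated label to 63 bytes

def to_labels(payload: bytes, zone: str = "doxpara.com") -> str:
    """Pack arbitrary data into DNS-safe hostname labels, the basic
    trick behind DNS tunneling. Base32 keeps the labels within the
    case-insensitive letter/digit alphabet DNS resolvers preserve."""
    encoded = base64.b32encode(payload).decode().rstrip("=").lower()
    labels = [encoded[i:i + MAX_LABEL]
              for i in range(0, len(encoded), MAX_LABEL)]
    return ".".join(labels + [zone])

def from_labels(name: str, zone: str = "doxpara.com") -> bytes:
    """Recover the payload a cooperating DNS server would see in the
    query name for its zone."""
    data = name[: -len(zone) - 1].replace(".", "").upper()
    data += "=" * (-len(data) % 8)  # restore stripped base32 padding
    return base64.b32decode(data)

msg = b"covert payload"
assert from_labels(to_labels(msg)) == msg
```

A real tunnel would send each such name as an ordinary lookup and smuggle the return data in the response records; because recursive resolvers forward unknown names to the zone's authoritative server, the channel rides infrastructure that most firewalls let through.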

Linda Beck, Earthlink's executive VP of operations, believes women make ideal managers in tech organizations based on the strength of their communication and social skills, and she has pursued her role as a mentor to other women in IT careers with enthusiasm. Beck explains that IT careers hold less appeal to women than to men because many women exclusively focus on available entry-level positions when choosing a college major. "They don't see past software engineering and coding and sitting at a computer all day to the more interesting IT roles, such as project manager, where you interact with people," she notes. Key to luring more women to IT positions is taking a more creative approach to the formation of career paths, and coming up with entry points that are more attractive to women. Beck uses her own experiences to illustrate her point, noting that her entry-level job as a programmer was unappealing because of its lack of communication and interaction, and it was only after she took on roles that revolved around coordination and integration that she started to blossom. "I think women do a better job in a lot of the technology management roles because those roles require good communication, mediation and facilitation skills, and lots of women do all of these things all of their lives in their families," she posits. Beck's advice to IT women who wish to advance is to communicate confidence and resist intimidation, and to volunteer for jobs that they are skilled at. She observes that advancement is easy because there is a scarcity of technical people who wish to manage, and who have a talent for it.
Click Here to View Full Article

Russia will see 11 percent more science graduates this year amid a resurgence in student interest, according to Auriga research. Russian computer programmers are well-known for their creativity and grasp of complex mathematical algorithms, and they are unique in that many have backgrounds in chemistry, physics, and mathematics. This year's Association for Computing Machinery International Collegiate Programming Contest was won by a team from a Russian university, making Russians the winners in three out of the last five years; in addition, 10 of the top 30 slots in the contest went to schools in the former Soviet Union. Still, experts warn that paltry government spending, especially on basic science, will see the ranks of Russian researchers thin in the future as current professors retire. New graduates look forward to jobs in the private sector, and have many foreign firms ready to hire them: Intel plans to add 500 researchers to its current staff of 500 employees in Moscow, St. Petersburg, and Nizhny Novgorod. Although a private-sector job or even an academic position overseas pays thousands of dollars per month, many Russian professors make just a fraction of that salary. Moscow State University Rector Viktor Sadovnichiy presented a study to President Vladimir Putin that showed nearly two-thirds of Russian professors are older than 40, and that 42 percent would be over 60 by 2010 if trends continue; Sadovnichiy says one way to boost interest in Russian academia would be to invest in promising areas of research so that scientists involved in those efforts would be able to benefit from commercialization of their technology. Moscow State University researchers have already formed a startup company together with local investors that makes an insulation material called Graflex, and that company employs 700 student researchers and chemists.
Click Here to View Full Article

In a roundtable discussion with CyberDefense Magazine, eBay VP and former White House Special Adviser for Cyberspace Security Howard Schmidt, PatchLink Chairman Sean Moshir, and Foundstone President Stuart McClure talk about the current status of the computer security industry as well as future directions it may take. The panelists provide numerous reasons why the Internet's safety and security is so hard to maintain, among them: The design of the Internet as an open and collaborative environment that supports anonymity; the inability to keep up with new problems, which are being unearthed on a daily basis; and vendors' eagerness to give customers special features and functionalities without considering how they might impact security. Schmidt remarks that America has taken a vanguard position in boosting cyber-defenses through private-sector and international partnerships, and McClure reports good progress in security deployments by American companies and greater security education. A General Accounting Office report indicates that progress has been made in security patch management, but Moshir contends that the narrowing gap between the announcement of a vulnerability and its exploitation means that patch automation is no longer a luxury. Schmidt observes that on-demand Web-based vulnerability evaluation is key to patch management, while McClure says, "The two will go hand-in-hand eventually." McClure raises the need for more knowledge about security requirements among small organizations, while Schmidt calls for better identification of IT systems' interdependencies, developers' prioritization of software quality control over new features and usability, expanded cybersecurity education, and more vigorous investigation and prosecution of cybercrime. The possibility of a cyberattack comparable to 9/11 is debatable: Schmidt says that society's resiliency against network assaults is improving, but this is no reason to relax our vigilance.
Click Here to View Full Article

Electronic voting systems touted as the solution to the election debacle of 2000 have come under fire by local governments, special interest organizations, and others who cite their lack of reliability and security, and are particularly distrustful of the machines' source code. The move to e-voting systems was instigated by the passage of the Help America Vote Act (HAVA) in 2002, which committed almost $4 billion to help states upgrade voting equipment, train voting officials, and adjust electoral policies. However, tests and elections designed to prove e-voting's viability uncovered a host of vulnerabilities and technical malfunctions, eroding confidence in the technology and prompting 13 members of Congress to request a security audit of e-voting by the General Accounting Office. New Mexico Secretary of State Rebecca Vigil-Giron comments that HAVA has transferred election responsibilities from local governments to the state. State governments are charged with the regulation, certification, and approval of e-voting machines, leaving end users with very little say in the matter, because the authority of local boards and commissions is state-controlled and constrained. Stevens Institute of Technology professor and Open Voting Consortium member Arnold Urken explains that state governments are somewhat detached from cities and counties, and argues that "[Governments] should be alerted that they may be better off buying nothing rather than buying something just because it is new." Vigil-Giron recommends that all cities and counties update their administrative security policies so they are more in sync with e-voting technology, while Jeff Zaino of the American Arbitration Association blames inadequate poll worker and machine technician training for most recent e-voting snafus. There is also growing support for the provision of a voter-verifiable paper trail to ensure accurate recounts and facilitate random audits.
Click Here to View Full Article