The task of boosting the defenses of the U.S. government's computer networks has been transferred from the White House to the Department of Homeland Security's (DHS) National Cyber Security Division, a branch of the DHS Information Analysis and Infrastructure Protection Directorate. DHS representative David Wray says the department is actively seeking a director for the division, who would have similar duties to the now-defunct position of White House cybersecurity czar. In addition to reducing security holes in the government's computing infrastructure, the new division will collaborate with private industry to safeguard other areas of cyberspace, create cybersecurity education and awareness initiatives, and manage a consolidated Cyber Security Tracking, Analysis, & Response Center. Business Software Alliance CEO Robert Holleyman praises this development, arguing that "Industry and government can set the example by making sure that this issue is addressed at the top level of every organization." William Harrod of security software vendor TruSecure is skeptical, claiming that the move actually gives cybersecurity a lower profile, indicating that the issue's priority in the Bush administration has also been downgraded. Wray counters that the maneuver will have the opposite effect, and calls it "a natural evolution for going from strategic thinking to execution." The SANS Institute's Alan Paller agrees, and asserts that the new division has far more resources to pursue cybercrime deterrents than former White House cybersecurity czar Richard Clarke had.
http://www.pcworld.com/news/article/0,aid,111066,00.asp

Computer researchers at the University of Illinois at Urbana-Champaign are building out a huge storage area network (SAN) that will link a 256-machine Linux cluster to 110 TB of virtual hard disk storage. National Center for Supercomputing Applications (NCSA) technical program manager Michelle Butler says the SAN is the first of its kind to allow so many computers to tap so much virtual storage. Butler says, "We are on the bleeding-edge," pushing the limits of storage technology and enabling a whole new class of research. The system does not use a traditional network infrastructure, but is connected by Fibre Channel lines, which boosts the speed at which the disks share data. In addition, the single file system afforded by the cluster format allows files to be written in parallel. Later this summer, the NCSA plans to build another supercomputing cluster of 768 Linux machines, sharing 170 TB of storage. This system will connect to the TeraGrid, a pioneering distributed computing collaboration between five U.S. universities and research centers. The TeraGrid will eventually provide more than 1 PB of storage capacity and will enable unprecedented computing applications. At the NCSA, researchers are also working with large companies to ensure that their supercomputing infrastructure also works with commercial applications, such as manufacturing modeling.
http://www.enterpriseitplanet.com/storage/news/article.php/2218261

When Linux developers roll out an upgrade, such as the forthcoming Linux 2.6, there is little hype, which is in keeping with the open-source software community's credo of releasing the software freely to the public and letting the added features stand on their own. This runs counter to the marketing push of commercial software companies, which plug "blockbuster" extras in order to get consumers to upgrade. IBM and other companies are allying themselves with open-source developers such as Linux creator Linus Torvalds to make the Linux operating system a fundamental computing component and break the dominion of Microsoft. Torvalds himself makes little effort to promote the latest Linux upgrades, and has never exploited his invention for personal aggrandizement. He is confident that Linux, with little fanfare, will come to be the operating system of choice for Internet servers and eventually the desktop. However, these predictions may be derailed because of a lawsuit SCO Group has brought against Linux, alleging that the operating system infringes on copyrighted Unix technology, which is supposedly an SCO asset. Although Torvalds says the suit makes little difference in the long run, he warns that "with the U.S. legal system, you'd have to be crazy not to worry about lawsuits." Linux is already benefiting users of desktop "fleets," in which large numbers of professionals use a small number of programs. Linux was also recently selected as the operating system for Munich city employees, who use some 14,000 PCs.

Teams of researchers at the University of Maryland, Johns Hopkins University, and elsewhere have been given a month to devise an information system that can translate between English and another language selected at random, as part of a Defense Advanced Research Projects Agency initiative to rapidly provide translation tools to deal with unexpected scenarios, such as terrorist attacks or a sudden outbreak of war. The scientists working on the project will share linguistic resources--dictionaries, native speakers, religious texts, etc.--in order to build a system capable of constructing statistical models to translate words and phrases in one language--Hindi, in this case--into English; another goal is to create a system that can automatically condense documents and furnish thematic abstracts of texts. The unique problem with Hindi is that there is a huge amount of available data on the language, but no standard technique of encoding Hindi characters. A Hindi-to-English translation system could theoretically be beneficial to the media or the military, who want to keep tabs on the tense situation between India and Pakistan. "You'd be able to read what the Indian newspapers are saying and what Hindi organizations are putting up on their Web sites--whether they are terrorists or high schools, for example," notes Eduard Hovy of the University of Southern California's Information Sciences Institute. A trial exercise for the project was conducted in March, when a small collection of researchers was given two weeks to develop a system that translated the Filipino dialect of Cebuano into English. At this point, there are no plans to continue funding the machine translation system once the exercise ends at the end of June. Still, Hovy acknowledges that commercial or federal agencies may see the value of such systems, while their development will probably help spur new research opportunities.
http://www.wired.com/news/technology/0,1282,59093,00.html
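The statistical approach described above can be sketched in miniature. Real systems estimate word and phrase probabilities from large parallel corpora; the tiny probability table below is invented purely for illustration, but it shows the core operation of picking the most probable target-language rendering for each source word.

```python
# Toy statistical translation: for each source word, choose the target
# word with the highest estimated probability. The probability table
# is hypothetical; real systems learn it from parallel text.

TRANSLATION_TABLE = {
    "namaste": {"hello": 0.7, "greetings": 0.3},
    "duniya": {"world": 0.9, "earth": 0.1},
}

def translate(words):
    """Translate word by word, passing unknown words through unchanged."""
    out = []
    for w in words:
        candidates = TRANSLATION_TABLE.get(w)
        if candidates:
            out.append(max(candidates, key=candidates.get))
        else:
            out.append(w)  # no model entry: leave untranslated
    return out

print(translate(["namaste", "duniya"]))  # → ['hello', 'world']
```

The encoding problem the article mentions sits one layer below this sketch: before any probabilities can be estimated, all the Hindi training text must first be normalized into a single character encoding.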

Privacy and intellectual property advocate Lawrence Lessig says a new legislative effort has been launched after the Supreme Court refused to change a critical copyright policy. Lessig says established copyright holders such as the movie and music industry are preventing a wealth of material from entering the public domain. Since 1976, Congress has passed revisions to copyright law that automatically renew individual and corporate copyrights by extending their lifespan. According to the Sonny Bono bill passed in 1998, no new material will enter the public domain for roughly two decades. Lessig plans to gather enough online signatures to petition Congress to change current policy so that Internet archives and public libraries will have free access to old work that is no longer in distribution. Some old film reels are likely deteriorating, and the thought is that digitizing them and making them available on the Internet will ensure their preservation. Lessig says the wording of his online petition makes it easy for copyright holders to retain their copyrights by paying just one dollar to renew them every 50 years. He says this will protect the estimated 2 percent of copyrighted work created between 1923 and 1942 that is being actively marketed in traditional forms, but would open up the large body of works that no one is interested in any longer. Lessig says technology firms are generally sympathetic to his cause because they understand unnecessarily long copyrights hamper innovation and growth.
http://news.com.com/2008-1082-1013830.html

Millionaire mathematician and musician Hank Risan struck back when Internet users started illegally copying songs off his Museum of Musical Instruments Web site by forming a research venture, Music Public Broadcasting, that claims to have developed a series of products that can effectively protect copyright owners' content from digital pirates. Risan says he used network theory to study the internal pathways data follows across networks and "visualize how to protect the copyrighted material as it transfers through those networks." Music Public Broadcasting alleges that copyright holders can determine what content can and cannot be copied by controlling those pathways via its technology. Risan's museum will launch an online radio service in July that will use the technology to allow users to listen to specific songs while making transmissions invulnerable to digital recording on current computer systems, according to Risan. His company follows the same strategy to make CDs and DVDs copy-proof, boost existing safeguards on music labels' downloadable songs, and prevent the unauthorized downloading of music and movies through file-sharing networks. The viability of this approach rests on Music Public Broadcasting's installation of internal network control software on users' machines, and Risan says users will balk at this move unless the technology can balance out security and freedom. The new radio service will cost consumers double what they pay for other services that do not allow them to request particular songs. Risan explains that sound quality would be significantly improved because the transmissions use less digital compression.
Click Here to View Full Article (Access to this site is free; however, first-time visitors must register.)

The federal counterterrorism Technical Support Working Group has awarded University of California, San Diego, researchers $600,000 to work on automated video systems that can identify possible terrorist activity in crowds. The camera array system would scan crowds and pick out suspicious activity based on computer algorithms. Electrical and computer engineering professor Mohan Trivedi says the benefit of the system is that it replaces human beings with omnipresent computers endowed with artificial intelligence. Trivedi's team has previously developed traffic monitor camera systems that can pick out vehicles in distress on the freeway and alert authorities automatically. Trivedi says the UCSD project involves the "resolution of some challenging research problems in multi-camera, system-based tracking and event recognition." Some of the specific issues the team will deal with are reducing false positives, increasing performance in inclement weather, and improving motion tracking. Trivedi says getting the cameras to work cooperatively is another task, since high-definition cameras will pick up on something identified by other tracking cameras. He also says the university has strong local contacts with both commercial firms and first-responder agencies that could put the technology to work quickly. Trivedi's project grew out of work developed at UCSD's Computer Vision and Robotics Research laboratory, and is one of nine winning proposals selected by the Technical Support Working Group dealing with video for public surveillance.
http://www.lajollalight.com/2003/06/05/n030605new_tool.html

Georgia Institute of Technology researchers have developed an aesthetically pleasing data display designed to minimize distraction. The InfoCanvas system displays data as moveable, abstract components within an electronic painting of a desert, a beach, a mountain camp, an aquarium, or a window view. The data elements change as the information changes, while the software is designed to run on an always-on Internet connection. "We're exploring ways of helping people stay aware of secondary information in a peripheral manner, one that does not distract, interrupt, or annoy them," explains Georgia Tech's John Stasko, who uses InfoCanvas on a dedicated screen in his office. Stasko's display consists of a beach scene where a moving sailboat keeps time, clouds in the sky represent weather conditions where his parents live, and a seagull's position symbolizes the Dow Jones performance. An email from Stasko's wife is represented by the appearance of a towel on a beach chair, and moving the mouse over the picture causes text balloons to pop up; important images or news headlines can also appear in the picture as text on billboards or signs towed by a plane. The abstract elements are customizable, so users can keep track of sensitive data without worrying that everyone who enters the office will also be privy to the information. Georgia Tech researcher Todd Miller says the InfoCanvas prototypes were designed with the input of potential users, and adds that the researchers are building pictorial customization into the system through interactive software.
Click Here to View Full Article
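The data-to-scene binding InfoCanvas performs can be sketched as a simple mapping function. The element names and value ranges below are invented for illustration (they are not the actual system's API); the point is that each data source drives one abstract property of the picture.

```python
# Sketch of an InfoCanvas-style peripheral display binding: each data
# source is mapped to an abstract scene element, and changes in the data
# move or alter that element. All names and ranges are hypothetical.

def seagull_height(dow_change_pct, canvas_height=400):
    """Map a Dow Jones daily change (clamped to -5%..+5%) to a pixel row."""
    clamped = max(-5.0, min(5.0, dow_change_pct))
    # -5% maps to the bottom of the canvas (0), +5% to the top.
    return round((clamped + 5.0) / 10.0 * canvas_height)

def beach_scene(data):
    """Compute the current scene state from the latest data readings."""
    return {
        "seagull_y": seagull_height(data["dow_change_pct"]),
        "towel_visible": data["unread_mail_from_spouse"] > 0,
        "clouds": data["weather"],  # e.g. "sunny", "overcast"
    }

print(beach_scene({"dow_change_pct": 2.5,
                   "unread_mail_from_spouse": 1,
                   "weather": "sunny"}))
```

Because the mapping is arbitrary, only someone who knows the user's chosen bindings can read the data off the picture, which is the privacy property the article describes.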

The United Kingdom will enact the European Union's Waste Electrical and Electronic Equipment (WEEE) Directive in 2008, making electronics manufacturers responsible for recycling 70 percent of their discarded products as well as designing more environmentally friendly items. Life Cycle Services director Jon Godfrey notes that electronics production costs could rise by more than 65 pounds as a result of the directive, while British recycling companies could be stretched beyond their capacities. "One of the big challenges in the recycling market is that there does not seem to be enough demand for the materials that recyclers are generating," he says. Godfrey adds that recycling is not always the best option for old machines, given the effort required to drop them off or pick them up, so the WEEE Directive could give computer makers an incentive to overhaul or re-use more products. Meanwhile, Computer Aid International head Tony Roberts speculates that the directive could lead to greater exports of refurbished electronics into developing countries for use in projects designed to nurture tech skills. Such machines are often installed in schools in nations such as Nigeria, South Africa, Kenya, and Uganda, and Roberts reckons that 99 percent of schoolchildren in third-world countries lack exposure to computers by the time they graduate. "The multiplier effect of putting in a very simple technology that means little to us means an enormous amount to someone who has not used the stuff," asserts DHL's Colum Joyce. Godfrey estimates that ethical and responsible disposal currently covers a mere 35 percent of computers, while 1.5 million discarded units end up in landfills each year.
Click Here to View Full Article

India is currently the offshore tech support outsourcer of choice for U.S. companies, but Forrester Research speculates that preeminence could shift over the next 10 years as American firms look to nations where IT labor is even cheaper. IBM, Boeing, Intel, and other U.S. companies have started awarding more outsourcing contracts to Hungary, Romania, the Czech Republic, and Russia, though most of the jobs involve software testing and development rather than technical support, according to International Data's Traci Gere. This complicates an already difficult situation for Indian workers, who must contend with an American effort to staunch the migration of tech jobs to India. Gere's research indicates that the Philippines has a strong presence in outsourcing "call centers and sweat shops," but political shakiness could undercut the country's attractiveness. Gere notes that Vietnam is particularly eager to gain prestige as a center of IT expertise, which makes it the most probable Asian challenger to India's tech support domination. Indian entrepreneur Farhat Gupta, who owns several call centers in Bangalore, says that local training programs devote more time to learning U.S. culture than to building technical skills. It is not unusual for call center employees in training to spend the first week watching American films and TV programs to familiarize themselves with the many nuances of U.S. lingo. Forrester calculates that more than 3 million American jobs will be exported overseas by 2015.
http://www.wired.com/news/business/0,1367,59126,00.html

Up to now, it has been impractical to transmit quantum-encrypted data over a conventional fiber-optic line 100 kilometers long because the random noise picked up by a photon detector on either end is too high, giving rise to frequent failures in cryptographic key generation. A team of researchers from Britain-based Toshiba Research Europe has tackled this problem by developing advanced photon detection gear that significantly decreases this noise. Lead researcher Andrew Shields says the equipment "is more sensitive, so we can tolerate a lower signal rate." The previous record-holder for fiber-optic quantum communications was Switzerland-based Id Quantique, which deployed quantum links across 60 kilometers of fiber-optic line in May 2002. Shields believes the technology his team has devised is practical, but speculates that about three years will pass before the device is ready for commercial applications. The U.K. scientists will detail their work at the Conference on Lasers and Electro-Optics in Baltimore. "What quantum cryptography will provide is better security, efficiency and will be future-proof," Shields declares.
Click Here to View Full Article
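The key-generation step at the heart of such systems can be illustrated with a highly simplified, classical sketch of BB84-style basis sifting: sender and receiver each choose random measurement bases, and only the bits where their bases happen to match are kept for the shared key. This toy model omits the physics entirely; the detector noise the Toshiba team reduced is what corrupts exactly this sifted key over long fiber runs.

```python
import random

# Simplified BB84-style sifting (classical simulation, for illustration
# only). Sender encodes random bits in random bases ('+' or 'x');
# receiver measures in its own random bases. Only positions where the
# bases agree contribute to the shared key, so on average about half
# the transmitted bits survive.

def sift_key(n_bits, seed=42):
    rng = random.Random(seed)
    sender_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    sender_bases = [rng.choice("+x") for _ in range(n_bits)]
    receiver_bases = [rng.choice("+x") for _ in range(n_bits)]
    return [bit for bit, sb, rb in
            zip(sender_bits, sender_bases, receiver_bases)
            if sb == rb]

key = sift_key(1000)
print(len(key))  # roughly half of the 1000 transmitted bits
```

In a real link, the parties additionally compare a sample of the sifted key: an error rate above what detector noise alone explains signals an eavesdropper, which is why lowering detector noise directly extends the usable fiber distance.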

George Cotter of the National Security Agency's Office of Corporate Assessments told people gathered at an Army High-Performance Computing Research Center luncheon on Wednesday that a larger supercomputing investment is necessary if the U.S. military is to improve vehicle and weapon design, support advanced mapping and intelligence analysis functions, and predict the effects of biological or radiological terrorist attacks, among other things. Such operations rely greatly on simulation and modeling, and no computer system is currently available that can accommodate their processing requirements, Cotter explained. He said the engineering of aircraft and ships would greatly benefit from improved modeling, as would the monitoring of America's nuclear-weapons supply. Cotter added that the targeting accuracy of missiles could be boosted through atmospheric modeling achievable via supercomputing. The need for faster computers to accomplish all these advances was supported by an analysis of high-end computing research and development commissioned by Congress. The Army has poured $4 million into supercomputing research every year for the past two years thanks to a Pentagon mandate to focus more on the military applications of supercomputing. The shift in focus during the 1990s from supercomputing to parallel and distributed computing reduced costs and raised output for high-end computing, but caused computer problem-solving progress to stall, declared Ford Motor's Vincent Scarfino at the luncheon.
http://www.govexec.com/dailyfed/0603/060403td1.htm

Two congressional hearings were recently held relating to two reports, one on the Justice Department's handling of the USA PATRIOT Act and the Foreign Intelligence Surveillance Act (FISA), and the other on the Total Information Awareness program, now called Terrorism Information Awareness (TIA). The reports were produced by the Justice Department and DARPA (Defense Advanced Research Projects Agency), respectively. TIA calls for pursuing terrorists by using data mining techniques. This entails searching through various databases from the Internet and the financial, travel, and health sectors in order to detect irregularities. Hearing witness James Dempsey, executive director of the Center for Democracy and Technology, told the judiciary subcommittee that existing laws "are totally inadequate to deal with the reality of decentralized commercial databases and the new techniques of data mining." At the Government Reform subcommittee hearing, witnesses expressed doubt whether TIA's data mining approach would catch terrorists at all. Paul Rosenzweig, a legal fellow at The Heritage Foundation, said models involving people who rent vehicles and purchase fertilizer set a pattern for "not only Timothy McVeigh, but most farmers in Nebraska." He added that if such a system must be implemented, it should require ample supervision and be minimally invasive. Rep. Michael Turner (R-Ohio) questioned the cost-effectiveness of data-mining, and asked for alternatives, while Barry Steinhardt, director of the American Civil Liberties Union's Technology and Liberty Program, said that such systems, if they don't work, increase the threat potential since they create only the illusion of security.
http://www.hillnews.com/news/060403/antiterrorism.aspx
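The "farmers in Nebraska" objection is at bottom a base-rate argument, and a back-of-envelope calculation makes it concrete. With hypothetical numbers, even a pattern that catches nearly every genuine plotter while flagging only a tiny fraction of everyone else produces flags that are overwhelmingly false alarms, because the targeted behavior is so rare.

```python
# Base-rate illustration of the data-mining false-positive concern.
# All numbers are hypothetical, chosen only to show the arithmetic.

def flagged_innocent_fraction(population, true_positives,
                              detection_rate, false_positive_rate):
    """Fraction of all flagged individuals who are actually innocent."""
    caught = true_positives * detection_rate
    false_alarms = (population - true_positives) * false_positive_rate
    return false_alarms / (caught + false_alarms)

# Suppose 100 actual plotters in a population of 100 million, a model
# that detects 99% of them, and a false-positive rate of just 0.1%.
frac = flagged_innocent_fraction(100_000_000, 100, 0.99, 0.001)
print(f"{frac:.4%} of flagged people are innocent")
```

Under these assumptions roughly 100,000 innocent people are flagged for every 99 plotters caught, which is the arithmetic behind calls for strict oversight of any such system.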

Technology has lost its luster with many corporate buyers because it no longer yields strategic advantages, argues Harvard Business Review editor-at-large Nicholas Carr. His article compares IT to the railroad industry in the late 1800s, which eventually experienced a meltdown due to overcapacity. FN Manufacturing MIS vice president Ed Benincasa says much of the technology used at firms is a business necessity, not a competitive edge. In fact, Sun Microsystems chief scientist Bill Joy, who was quoted in Carr's article, wonders if "people have already bought most of the stuff they want to own." Carr says that Microsoft has already acknowledged that replacing standard infrastructure is no longer imperative and has accordingly set up its annual subscription service for the Office software suite. Still, Washington Group International IT enterprise operation director Gary Bronson says the critical factor in business IT is how it is used. Harte-Hanks executive vice president Randy Wussler says the penetration of many technologies is low among businesses. VPNs, for example, are installed at just 25 percent of companies with more than 500 workers, while wireless LANs are deployed at just 6 percent of such firms, he says. Alinean President Tom Pisello says Carr's recommended minimalist tack on IT spending is baseless. Pisello's firm has found in extensive studies--involving 20,000 firms in 40 countries and 400 vertical sectors--that the success of companies cannot be attributed to IT expenditures, no matter how much or how little they spend.
http://www.eweek.com/article2/0,3959,1115053,00.asp

Government officials are revamping Cyber Corps, a program designed to recruit information security graduates into federal agencies. The government pays tuition and a stipend for Cyber Corps students, who in return participate in a summer internship and work at a government agency for a period of up to two years. But officials say differing guidelines and oversight methods were hampering the hiring process. In addition, agencies had been reluctant to employ less experienced individuals while higher posts were still available. And although the Office of Personnel Management (OPM) was in charge of handling placements, it did not have the power to enforce them. "The people doing the hiring don't seem to have a stake in the process," says Iowa State professor Douglas Jacobson, who gives counsel to 22 students in the program. Currently, officials are modifying the training program to more closely resemble the Department of Defense's Information Assurance Scholarship Program. The first batch of 50 Cyber Corps students graduates this year; the $30 million program currently involves 150 students with 100 more joining in September. Currently, 13 universities have been certified as Centers for Academic Excellence by the National Security Agency. Although the Cyber Corps program has the support of the White House and Capitol Hill, Ernest McDuffie, lead program manager at the National Science Foundation, says getting its graduates hired has become a "daunting logistical and bureaucratic problem."
http://www.infosecuritymag.com/2003/jun/cybercorps.shtml

Convicted hacker Kevin Mitnick, now head of his own computer security consultancy, provides a view into the motivations and methods of hackers. "Condor," as Mitnick was known in hacker circles, says he was not out to steal money or destroy property, but simply addicted to the thrill of compromising secure systems. Early on, at just 16, Mitnick demonstrated his favored "social engineering" tactic by calling a Digital Equipment system manager and posing as a lead developer for the company who needed a password. George Washington University political psychology program director Jerrold Post says hackers, who are often social outcasts, relish their ability to outwit authority. Mitnick was first caught in his mid-20s and spent a year in prison, but later was sought again by the FBI for two years, during which Mitnick traveled across the country and hacked prolifically. Novell systems administrator Shawn Nunley remembers being victimized by Mitnick in 1994, when Mitnick called him in the guise of an out-of-town employee who needed an account. Nunley called the employee's voice mail to check whether the recording matched the voice he heard on the phone; it did, because Mitnick had earlier convinced Novell's help desk to reset the voice mail password and recorded his own message. Through the episode, Mitnick obtained the source code for NetWare, Novell's key software product. Today, the importance of the Internet has dramatically increased the need for computer security and raised the consequences of successful intrusions; computer security spending is expected to hit $13.5 billion this year, double that from 2000. But spending alone won't ease the threat of "social engineering," or the human side of hacking; as Mitnick says, "There is no patch for stupidity."
Click Here to View Full Article

A truly autonomous robot must be capable of extracting a picture of its surroundings and using the map as a tool to navigate, a methodology known as simultaneous localization and mapping (SLAM). Today the majority of autonomous vehicles use dead reckoning, in which they follow lists of directions preset by a person with an existing map. Wheel slippage and other factors give rise to cumulative errors that make the technique imprecise and unreliable, and though the Global Positioning System (GPS) may seem like an effective solution, the technology also suffers from inaccuracy, while GPS signals can be disrupted by jamming devices and cannot penetrate certain obstructions. Furthermore, even the combined use of GPS and a map is pointless if the robot encounters unfamiliar terrain. Whereas previous approaches have focused on robots that handle mapping and positioning functions separately, the SLAM paradigm, as outlined by NASA's Peter Cheeseman and colleagues, proposes the simultaneous tackling of both problems by making robots capable of building self-consistent maps; this is accomplished through the use of a mapping algorithm that not only measures the relative positions of environmental features, but also potential errors in the robot's position as it moves. Thanks to the SLAM approach, MIT's B21 robot, which maps its surroundings by laser measurement, can close loops. Some scientists are also investigating more advanced ways to handle measurement errors, such as recognizing landmarks. The success of SLAM should yield robots that can carry out exploration and reconnaissance of areas too dangerous for humans, furnish military maps without having to worry about GPS jamming, and aid explorers who get confused by complex and repetitive surroundings such as caves.
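The cumulative-error problem that motivates SLAM is easy to demonstrate. The toy simulation below (parameters are illustrative, not from any real robot) has a robot drive a straight line while its heading estimate picks up a small random error each step, as wheel slip would cause; because each step builds on the last estimate, the position error compounds with distance traveled rather than averaging out.

```python
import math
import random

# Toy demonstration of dead-reckoning drift: the true robot drives
# straight, while the odometry estimate accumulates a small random
# heading error each step. Noise parameters are illustrative.

def dead_reckon_error(steps, step_len=1.0, heading_noise=0.02, seed=0):
    """Return distance between true and estimated position after `steps`."""
    rng = random.Random(seed)
    true_x = true_y = 0.0
    est_x = est_y = 0.0
    est_heading = 0.0  # true heading stays 0 (straight line)
    for _ in range(steps):
        true_x += step_len  # true motion: straight along x
        est_heading += rng.gauss(0.0, heading_noise)  # slip/skid error
        est_x += step_len * math.cos(est_heading)
        est_y += step_len * math.sin(est_heading)
    return math.hypot(true_x - est_x, true_y - est_y)

print(dead_reckon_error(100), dead_reckon_error(1000))
```

SLAM counters this drift by re-observing environmental features: each time a previously mapped landmark is seen again (closing a loop), the accumulated position error can be corrected against it.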

Firms such as Nuance Communications and SpeechWorks are making a splash with interactive voice response software that allows automated call centers to more smoothly interact with customers, but this is only the first step in the rollout of language-processing systems. Projects are underway at IBM, the Palo Alto Research Center (PARC), and elsewhere to develop computers that can understand natural speech; International Data's Steve McClure speculates that, "Whereas the GUI [graphical user interface] was the interface for the 1990s, the NUI, or 'natural' user interface, will be the interface for this decade." A truly interactive language processing system must be able to precisely convert human speech into text the computer can read, deduce meaning by studying vocabulary and sentence structure, and supply human-sounding responses that make sense. Breakthroughs in the area of language understanding stem from awareness that people value machines more for their helpfulness and efficiency than for their conversational abilities, and that the best language-processing model combines grammatical structure analysis with statistical analysis. However, though this model has yielded very helpful interactive voice response systems for United Airlines, the U.S. Post Office, and others, it does not represent true language understanding. PARC research fellow Ron Kaplan believes that a natural-language interface would be more effective if it were stripped of the need for system customization, but the chief barriers to this achievement are the smallness of language sample databases and statistical algorithms that eliminate ambiguity, which can rob a sentence of its true meaning. His solution is the development of the Xerox Linguistic Environment, grammar-driven software designed to retain ambiguity. 
Meanwhile, IBM is trying to enhance the management of unindexed, unstructured data on computer networks via natural-language processing software called the Unstructured Information Management Architecture.
http://www.technologyreview.com/articles/roush0603.asp
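The grammar-plus-statistics model, and the ambiguity loss Kaplan objects to, can be shown in miniature. In the sketch below (all readings and counts are invented for illustration), a grammar is assumed to have proposed every reading of an ambiguous sentence and corpus counts rank them; a conventional system commits to the top reading and discards the rest, which is precisely the information the Xerox Linguistic Environment is designed to retain.

```python
# Miniature grammar-plus-statistics disambiguation. A (hypothetical)
# grammar has produced all readings of the ambiguous sentence
# "I saw her duck"; invented corpus counts rank them by probability.

READINGS = {
    "observed the duck she owns": 57,
    "observed her lowering her head": 31,
    "cut her waterfowl with a saw": 2,
}

def rank_readings(readings):
    """Return (probability, reading) pairs, most probable first."""
    total = sum(readings.values())
    return sorted(((count / total, text) for text, count in readings.items()),
                  reverse=True)

for prob, text in rank_readings(READINGS):
    print(f"{prob:.2f}  {text}")
```

Keeping the whole ranked list, rather than only its head, lets later context (say, a following sentence about ponds) revise which reading the system acts on.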