Speaking at the third Networked Economy Summit in Reston, Va., Richard A. Clarke, the president's special adviser for cyberspace security, urged government, industry, and academia to cooperate in safeguarding the country's information networks from cyberattacks originating both at home and abroad. Clarke recommended a strategy in which network security officials find and eliminate security flaws before threats can develop. Another summit speaker, Guy Copeland of Computer Sciences, noted that the fast pace of technological innovation has led to a profound shortage of network security personnel. Clarke said that academic institutions should lead the initiative to educate technicians in order to remedy this shortage, while Copeland added that industry must persuade clients to invest in information security technology. It is up to the government to spread awareness of the security problems and their ramifications, according to Clarke. He said that his office and the Office of Homeland Security will present a "National Strategy to Secure Cyberspace" to President Bush in mid-September. The strategy incorporates the advice of a dozen distinct sectors of society. Other speakers at the summit included FTC Chairman Timothy J. Muris and former New York City mayor Rudolph W. Giuliani.
http://www.washingtonpost.com/wp-dyn/articles/A28471-2002Jun10.html

A new computer science class at Stanford University is teaching students the political aspects of their field, and is indicative of a growing awareness among programmers of the effect of law on their work. The class is taught by Barbara Simons, an IBM researcher and former president of the Association for Computing Machinery, and Edward Felten, a Princeton University researcher who last year was threatened with prosecution because of his work. The class focuses on clarifying the policy and technology behind hot topics such as national ID cards, deep linking, domain names, and cookies. "I really think the profession is on the front lines," Simons says of computer science's clash with intellectual property interests. "If we lose, one of the things we lose is the freedom to speak out and do research." Felten warns that the laws currently on the books will have a chilling effect on research and are increasingly encroaching on activities that used to be legal, such as reverse engineering. Representatives of the tech industry, at both the grassroots and executive levels, have begun to think through their positions and express themselves, giving Felten optimism that programmers will someday have as much sway with lawmakers as the politically savvy entertainment industry.
http://zdnet.com.com/2100-1104-934602.html

Researchers at IBM's Zurich laboratory report in the current issue of IEEE Transactions on Nanotechnology that they have devised a punch card-like technology capable of storing 1 trillion bits of data per square inch, effectively enabling the storage of 200 CD-ROMs' worth of data on a surface the size of a postage stamp. Project leader Dr. Peter Vettiger teamed up with scanning tunneling microscope co-inventor Dr. Gerd Binnig to develop the technology, which is dubbed Millipede. The researchers store data by making indentations in a thin plexiglass film with a heated silicon tip; to read data, the tip is heated to a lower temperature and reinserted into the holes, where it cools. Data can be erased by passing a hot tip over the indentation, thus allowing virtually unlimited rewriting. To make up for the long time it takes a single tip to read and write data, another Millipede chip prototype employs over 1,000 tips working in tandem. Another computer memory technology being developed at Nantero, a startup in Woburn, Mass., involves carbon nanotubes arranged in twos, placed at right angles, and separated by a small space. The tubes are pushed together through the application of a voltage, and remain that way after the voltage is removed, allowing memory to be retained even after a computer is deactivated; reversing the voltage draws the tubes apart. Nantero plans to have a prototype ready by the end of 2003 and start producing the device a year later.
http://www.nytimes.com/2002/06/11/science/physical/11DATA.html (Access to this site is free; however, first-time visitors must register.)
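The density and capacity figures above are easy to sanity-check. A quick back-of-the-envelope calculation, assuming a standard 650 MB CD-ROM and a postage stamp of roughly one square inch (both assumptions, not stated in the article):

```python
# Sanity check of the figures quoted for IBM's Millipede storage:
# 1 trillion bits per square inch vs. "200 CD-ROMs on a postage stamp".
# The CD capacity (650 MB) and stamp area (~1 sq in) are assumptions.

BITS_PER_SQ_INCH = 1e12          # reported areal density
CD_CAPACITY_BYTES = 650e6        # standard 650 MB CD-ROM (assumption)

total_bits = 200 * CD_CAPACITY_BYTES * 8      # 200 CDs, in bits
area_needed = total_bits / BITS_PER_SQ_INCH   # square inches required

print(f"200 CDs = {total_bits:.2e} bits, needing {area_needed:.2f} sq in")
# -> 200 CDs = 1.04e+12 bits, needing 1.04 sq in
```

The numbers check out: 200 CDs come to just over a trillion bits, which at the reported density fits in roughly one square inch.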

The National Security Agency (NSA) is working with open source developers on a Linux module that provides extra security features, such as mandatory access controls. Linux systems equipped with Security-Enhanced Linux (SELinux) can, for example, demand data such as IP addresses to verify a user; discretionary access controls such as user names and passwords, by contrast, can be easily manipulated by hackers. A small group of volunteers is also helping with the project, which is attracting the attention of non-Linux developers as well. Linux security developer Shaun Savage says that SELinux protects against hacker exploits that even administrators are unaware of, because of the inherently more secure architecture it adds to Linux systems. Still, Savage warns that SELinux is not easy to integrate because of the unique nature of security programs. Grant Wagner, technical director for NSA's Secure Systems Research Office, says adoption of the SELinux prototype has exceeded expectations, with both public and private organizations already deploying the Linux add-on successfully. "These reports indicate that SELinux is very effective and has countered actual attacks mounted against systems," he adds.
http://www.wired.com/news/linux/0,1411,53004,00.html
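The distinction between mandatory and discretionary access controls can be illustrated with a toy sketch. This is plain Python, not actual SELinux policy (which has its own policy language); all labels, file names, and rules here are invented for illustration:

```python
# Toy contrast between discretionary (DAC) and mandatory (MAC) access control.
# All labels and the policy table are hypothetical illustrations only.

# DAC: the owner of an object manages its permissions, so a compromised
# owner account can simply open the object to anyone.
dac_perms = {"secret.txt": {"alice"}}          # owner-managed access list
dac_perms["secret.txt"].add("attacker")        # hijacked owner widens access

# MAC: a fixed, system-wide policy maps (subject label, object label) to
# allowed operations; nothing a user does at runtime can alter it.
MAC_POLICY = frozenset({
    ("user_t", "user_file_t", "read"),
    ("sysadm_t", "user_file_t", "read"),
})

def mac_allowed(subject_label, object_label, op):
    """Access is granted only if the immutable policy table says so."""
    return (subject_label, object_label, op) in MAC_POLICY

print(mac_allowed("user_t", "user_file_t", "read"))   # True
print(mac_allowed("user_t", "shadow_t", "read"))      # False
```

The point of the contrast: under DAC a single compromised account can grant away its own objects, while under MAC the policy is enforced system-wide regardless of what individual users or processes attempt.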

Experts say that grid computing, in which complicated applications are split up and distributed across a network of processors, will help government agencies run homeland security scenarios such as terrorist attacks through simulation. A project that harnesses the computing power of two IBM supercomputers at Indiana University and Purdue University will help researchers at the Centers for Disease Control and other government agencies predict the response of U.S. citizens to emergencies such as biological or nuclear attacks. "What we will be able to do is a fire drill for homeland security that is incredibly complicated because it will involve the whole country," explains Alok Chaturvedi of Purdue's Krannert School of Management. He notes that the project involves the creation of 1 million artificial agents programmed to exhibit real-world behavior as if they were actual American citizens. Patterns to be modeled in the synthetic environment include panic fleeing, mood swings, and the lag time between the contraction of disease and the manifestation of symptoms. Chaturvedi says grid computing allows such simulations to be processed in a matter of minutes. The Institute for Defense Analyses co-developed the scenarios with the Krannert researchers. In the next few weeks, they intend to simulate a biological or chemical disaster based in Illinois and Indiana, while later simulations will involve the entire nation.
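The kind of agent model described, with an incubation lag between infection and symptoms plus panic behavior, can be caricatured in a few lines. Everything here (population size, lag length, probabilities) is an invented placeholder, not the Krannert/IDA model:

```python
import random

# Toy agent-based sketch of the simulation style described above: agents
# contract a disease, show symptoms only after a lag, and may panic once
# symptoms appear around them. All parameters are invented assumptions.

random.seed(0)
INCUBATION_STEPS = 3      # lag between infection and symptoms (assumption)
PANIC_PROB = 0.5          # base chance of fleeing near symptomatic agents

class Agent:
    def __init__(self):
        self.infected_at = None   # simulation step of exposure, if any
        self.fleeing = False

    def symptomatic(self, step):
        return (self.infected_at is not None
                and step - self.infected_at >= INCUBATION_STEPS)

agents = [Agent() for _ in range(1000)]
agents[0].infected_at = 0         # seed one exposure at step 0

for step in range(1, 10):
    sick = [a for a in agents if a.symptomatic(step)]
    for a in agents:
        # healthy agents near symptomatic ones may contract or panic
        if sick and a.infected_at is None and random.random() < 0.01:
            a.infected_at = step
        if sick and random.random() < PANIC_PROB * len(sick) / len(agents):
            a.fleeing = True

print(len(sick), "symptomatic;", sum(a.fleeing for a in agents), "fleeing")
```

Scaling this naive loop to a million agents with realistic behavior is exactly what makes grid-distributed processing attractive: each processor can own a partition of the agent population.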

Finding ways to reduce the amount of power consumed and heat generated by computer chips will be a major focus of the VLSI Symposia on Technology and Circuits. IBM Microelectronics' Bijan Davari says that power consumption will likely "become the limiter" of chip design, as more and more transistors are packed onto chips, giving rise to power leakage and interference. The solution is to develop low-power materials or increase the efficiency of transistors. At the symposium, IBM engineers will unveil the CU-8 manufacturing process for application-specific integrated processors; at the core of this process is the voltage island concept, in which designers build system-on-a-chip processors that vary their voltages across the chip, enabling lower power consumption. Pathfinder Research President Fred Zieber says that IBM's voltage island concept "is a very nice addition to IBM's [90 nanometer] process and something I think you'll see increasingly as people go to that dimension." Prototype chips manufactured by the CU-8 process should debut in the third quarter of 2002, while the method should start showing up in commercial products in 2003. Meanwhile, Intel will present several papers at the symposium: Engineers will discuss ways to ramp up PC processor clock speed while preventing leakage, as well as the creation of lower-power on-chip buses, low-power clocks, and leakless caches. Intel research fellow Shekhar Borkar states that mating these new methods with the company's latest transistor design and packaging innovations could lead to chips that consume up to 50 percent less power.
http://news.com.com/2100-1001-934355.html
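The logic behind voltage islands follows from the standard approximation for dynamic CMOS switching power, P = C·V²·f: because power scales with the square of supply voltage, even a modest voltage reduction on part of a chip yields outsized savings. A small illustration (the capacitance, voltage, and frequency values are arbitrary placeholders):

```python
# Back-of-the-envelope: dynamic CMOS power scales as P = C * V^2 * f,
# so running a block on a "voltage island" at reduced supply voltage
# cuts its dynamic power quadratically. Component values are arbitrary.

def dynamic_power(capacitance, voltage, frequency):
    """Classic switching-power approximation for CMOS logic."""
    return capacitance * voltage**2 * frequency

full = dynamic_power(1e-9, 1.2, 2e9)     # block at a nominal 1.2 V supply
island = dynamic_power(1e-9, 0.9, 2e9)   # same block on a 0.9 V island

print(f"power saved: {1 - island/full:.1%}")
# (0.9/1.2)^2 = 0.5625, i.e. a 25% voltage cut saves ~43.8% dynamic power
```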

A number of leading technology companies involved in the development of new wireless operating systems have formed the Open Mobile Alliance to develop a universal wireless standard. The alliance will write new code that will serve as the building blocks for different wireless platforms, including those especially crafted for enterprise applications. The Open Mobile Alliance replaces the WAP Forum, which had coordinated the deployment of Wireless Application Protocol, the disappointing wireless standard now used on handhelds for Internet access. Companies involved in the group include Vodafone, NTT DoCoMo, Nokia, Motorola, and Microsoft, but not Palm, whose operating system still runs on more than half of all handheld computers. The hope is that the Open Mobile Alliance will be able to set the stage for interoperable wireless applications and services, including multimedia messaging and wireless games.

Both Microsoft and the holdout states presented their recommendations to resolve the antitrust case against the software giant to U.S. District Court Judge Colleen Kollar-Kotelly on Monday. Microsoft's Jim Desler reiterated the company's position that its 2001 settlement with the Justice Department and nine other states is "fair and appropriate," while the document the company filed describes the dissenting states' proposed resolution as "unworkable," as well as "hopelessly vague and ambiguous." Microsoft also implied that the states' provisions would be detrimental to consumers and the PC sector, and would furthermore exceed the 12 acts that the U.S. Court of Appeals has deemed anticompetitive. The suing states want Microsoft to issue a streamlined version of the Windows desktop operating system so that hardware manufacturers can ship PCs with products from rival vendors, as well as freely distribute the Internet Explorer code to developers and make the Office desktop software compatible with competing operating systems. The release of the states' recommendations was followed by a statement from Iowa Attorney General Tom Miller, who claimed that "The filing gives the Court a powerful foundation for ordering injunctive relief necessary to repair the systematic harm that Microsoft has inflicted upon consumers and competition in the computer software industry." Closing arguments from both sides are supposed to commence on June 19, while Kollar-Kotelly is expected to make her decision in the next couple of months.
http://www.infoworld.com/articles/hn/xml/02/06/10/020610hnmscourt.xml

The human race has produced about 24 exabytes of unique information, according to a two-year-old study from the School of Information Management and Systems at the University of California, Berkeley; in a recent update, study leader Hal Varian noted that data growth may be accelerating. The Berkeley study finds that individuals are generating an incredible amount of data, most of it stored on computer systems; most of this information comes not from the Web, but from emails and transactions. This has been a boon to the data storage industry, where competition is especially fierce. There is a wide array of data storage systems on the market, ranging from low-end tape drives to more advanced and expensive disc storage systems. Data storage specialists are falling into two camps--hardware and software--with profitability favoring the latter. The flood of data could inundate business managers and complicate their corporate decision-making, but analytics software may help make that information more manageable. One of the key challenges is making the software more accessible to average users--for now, the tool is mainly used by specialists. Varian says that "[Statisticians] are the people who can really extract value from a lot of those databases that companies are developing now," and predicts that a career in statistics will become especially attractive within the next decade.

Industry officials, politicians, and local government leaders will gather this week at a "Massachusetts broadband summit" to discuss the rollout of high-speed Internet infrastructure. Rep. Edward J. Markey (D-Mass.) is expected in his keynote address to shift the emphasis from infrastructure availability to the provision of services. Like FCC Chairman Michael Powell and other leaders, Markey says providers such as Verizon and AT&T Broadband need to give the approximately 70 percent of consumers who are currently able to connect a better reason to do so. Still, Markey suggests that government can intervene through local public-private partnerships to reach the remaining 30 percent without broadband availability; such partnerships would aggregate demand and create anchor subscribers in unconnected areas, for example. He cites the Tech Collaborative's "Berkshire Connect" project, which built up a buyers' cooperative for broadband service in Western Massachusetts, as particularly helpful. Markey also says that Congress could do more to regulate intellectual property rights so that consumers would be able to access rich media such as music and video online through broadband.

The federal government's technology infrastructure--particularly its ability to search and transmit data--is in poor shape and could be putting the country at risk. Communications between the FBI and other agencies are severely limited because the agencies rely on obsolete flat-file databases and specialized, noncommercial software platforms that cost a lot to manage and upgrade. Search capabilities can be dramatically improved by switching to relational databases that can detect information overlaps, and gradually transitioning to such programs via conversion software will help ensure that older data is not lost. A better system to keep track of data and who is accessing it can then be deployed by installing commercial search software from such vendors as Google, AltaVista, and Inktomi. The government should also make an effort to beef up the security of sensitive data, and one possibility is the implementation of a separate, virtual private network-based email system. Improved computerized information analysis methods can be undertaken once updated data search and transmission capabilities are in place--for instance, the FBI could develop a program that automatically searches for overlapping data between intelligence initiatives and newly filed reports. Even if this interagency update is successful, there will still be difficulties in managing the mountain of data that will exist. Paul Roberson of computer-security firm TruSecure notes that "They're already at information overload at the FBI, the CIA, and the NSA."
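The "information overlap" capability attributed to relational databases amounts to a join across tables--something flat-file systems cannot express directly. A minimal sketch using SQLite; the schema, table names, and records are invented for illustration:

```python
import sqlite3

# Sketch of cross-agency overlap detection: two agencies' report tables,
# held relationally, can be cross-referenced with a single join query.
# All names, schemas, and records here are hypothetical.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fbi_reports (subject TEXT, detail TEXT);
    CREATE TABLE cia_reports (subject TEXT, detail TEXT);
    INSERT INTO fbi_reports VALUES ('J. Doe', 'flight training inquiry');
    INSERT INTO fbi_reports VALUES ('A. Smith', 'visa overstay');
    INSERT INTO cia_reports VALUES ('J. Doe', 'overseas contact');
""")

# One query surfaces every subject that both agencies have filed on.
overlaps = conn.execute("""
    SELECT f.subject, f.detail, c.detail
    FROM fbi_reports f JOIN cia_reports c ON f.subject = c.subject
""").fetchall()

print(overlaps)
# -> [('J. Doe', 'flight training inquiry', 'overseas contact')]
```

In a flat-file world the same cross-reference would require custom code to parse and correlate each agency's file format, which is precisely the maintenance burden the article describes.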

Sen. Conrad Burns (R-Mont.) says he is likely to introduce legislation forcing ICANN to allow U.S. government influence in ICANN decisions, and says that if ICANN does not comply, its contract to manage the DNS may be canceled when it comes up for renewal this fall. Burns says ICANN is operating in a closed manner, is exceeding its authority, and is unaccountable to all interest groups, issues that will be addressed during a Senate subcommittee hearing on Wednesday, June 12. Burns believes that ICANN should operate with the same type of internal structure as any federal agency. During its tenure, ICANN has driven domain-name registration prices down from $50 to $10, according to General Accounting Office director Peter Guerrero.

Kenneth Frampton, an assistant professor of mechanical engineering at Vanderbilt University, is using micro-electromechanical systems (MEMS) and distributed computing to overcome the technical limitations of smart sensor networks that have the potential to automate vehicles' response to changing conditions. "Our primary focus is on embedded systems in which many processors are integrated into the system," he notes. Central computers cannot handle smart networks once they exceed about 100 nodes, so Frampton's team has developed a distributed computing environment in which each node controls a subset of sensors and actuators via a low-power microprocessor. Frampton says the nodes can communicate with each other and work together, thus reducing the amount of data to be managed and keeping the processors' workload steady even as the size of the system increases. Embedded systems can also tolerate more faults than centralized systems. Frampton theorizes that an embedded system of 100 nodes will produce the approximate work equivalent of a centralized system of 1,000 or 10,000 nodes. His team is using the embedded system approach to design a smart vibration-reduction system for spacecraft, as well as a system that controls the noise and vibration levels in a rocket payload fairing, which he says will lower "the cost of manufacturing satellites and other equipment boosted into space" because "they can be built lighter."
http://sci.newsfactor.com/perl/story/18138.html
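The architecture Frampton describes--each node reducing its own sensors' data locally and exchanging only summaries with neighbors--can be sketched as follows. The topology and the averaging rule here are illustrative assumptions, not his actual design:

```python
# Sketch of the distributed-node idea: each node owns a few sensors,
# condenses their readings locally, and shares only the summary with its
# neighbours, so per-node workload stays steady as the network grows.
# Topology and the averaging rule are invented for illustration.

class Node:
    def __init__(self, readings):
        self.readings = readings       # raw data from this node's sensors
        self.neighbors = []

    def local_summary(self):
        # Each node condenses its own sensor data before communicating.
        return sum(self.readings) / len(self.readings)

    def consensus(self):
        # A node combines its summary with its neighbours' summaries --
        # never the raw data -- keeping communication volume small.
        vals = [self.local_summary()] + [n.local_summary() for n in self.neighbors]
        return sum(vals) / len(vals)

a, b, c = Node([1.0, 3.0]), Node([5.0, 7.0]), Node([9.0, 11.0])
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]

print([n.consensus() for n in (a, b, c)])   # [4.0, 6.0, 8.0]
```

Because each node talks only to a fixed set of neighbors, adding more nodes adds capacity without adding load on any central coordinator--the scaling property the article attributes to the embedded approach.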

Russia is making rapid gains in the software outsourcing sector, according to attendees of the Second Software Outsourcing Summit in St. Petersburg last week. Brunswick Warburg estimates that Russian software exports this year will almost double from last year, to a total of $300 million, with some $1 billion in shipments expected in 2005. Reaching this goal will require significant improvements: Speaking at the summit, IBM executive Robert Williams explained that Russia needs to aggressively market its programmers, while the government must beef up its support of offshore businesses. India currently leads the software outsourcing industry, but U.S. companies are worried that a war with Pakistan could jeopardize software exports from that country, while rising Indian outsourcing costs only add to the burden. Still, India is expected to export $8 billion in software this year. The majority of Russian firms are limited to the Internet as a marketing outlet, whereas Indian companies boast much higher marketing budgets. Furthermore, Indian software companies may be fewer in number than Russian companies, but they are larger in size.
http://www.sptimes.ru/archive/times/776/news/b_6615.htm

State capitals across the U.S. Rust Belt are racing to build the next major high-tech research hub, snapping up top tech talent and making heavy investments in such technologies as bioinformatics and optoelectronics. New York plans to create four "Centers of Excellence" where industry and academic researchers can connect, and has recruited University of Rochester professor Wayne Knox to organize a team of experts in myriad fields to produce academic research that can be commercialized. The Centers of Excellence, which will receive funding from Eastman Kodak and IBM, among others, will focus on nanoelectronics, information technology, bioinformatics, and photonics. Over 20 years, Michigan will pour $1 billion in tobacco settlement funds into the creation of a life-sciences hub, while Kansas City, Indianapolis, Atlanta, and Pittsburgh are working on similar initiatives. "Even if the research doesn't pan out you're going to have a work force that's better educated, has a higher level of skills and will be able to compete in the global economy," says Dan Berglund of Westerville, Ohio's State Science and Technology Institute. He says the states that offer incentive packages have the best chance of reaping the most rewards.

The Progressive Policy Institute says Massachusetts, Washington, California, and Colorado top the list of states most ready to succeed in the new economy. In a new survey, those states had the best mix of tech skills, entrepreneurship, and technology-friendly policies of all 50 states, while Maryland, New Jersey, and Connecticut also ranked highly. Report author Robert Atkinson says the impact of information technology on the economy has produced the type of sea change that only occurs twice a century. Massachusetts was ranked highest in the survey due to its abundance of both hardware and software firms, as well as its top-grade universities, Harvard and MIT. Meanwhile, Washington state touts a strong entrepreneurial culture and is a hotbed of software development, anchored by companies such as Microsoft. The report said that access to skilled workers was one important factor for new economy regions, and suggested that states think about job training and quality-of-life issues in order to attract talent.

A five-member Board of Technologists consisting of inventor and futurist Ray Kurzweil, Melanie Mitchell of the Santa Fe Institute in New Mexico, Ardesta VP for nanotechnology Sandeep Malhotra, director of Cap Gemini Ernst & Young's Center for Business Innovation Chris Meyer, and IBM's Paul Horn convened in May to discuss technologies that will dramatically impact business models and profitability in the coming years. Horn is leading a crusade in autonomic computing at IBM, and the corporate benefits of self-regulating systems include cost savings. Mitchell's area of expertise is evolutionary computing, in which features of software programs can be recombined to form a hybrid offspring that performs significantly better than either parent. Behavior-based computer simulation of self-organizing systems, as demonstrated by software developed by Meyer's team, can increase a company's efficiency and lower costs by estimating how customers and rivals will react to certain conditions. Nanotechnology companies, which Ardesta incubates, are researching and building self-assembling devices of superior performance and quality; such devices have a wide range of applications--medical, construction materials, appliances, and electronics. Kurzweil predicts that computing power will outdistance that of the human brain within a generation, because "our rate of exponential growth is growing exponentially." All of these technologies are modeled after biological processes: Autonomic computing is the network counterpart to the autonomic nervous system, evolutionary computing takes its cues from cellular evolution, behavior-based simulation uses self-organizing natural systems as a template, nanotechnology is derived from basic physical processes that all matter exhibits, and Kurzweil's anticipations are based upon evolution's tendency to accelerate.
http://www.time.com/time/business/printout/0,8816,257102,00.html
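Evolutionary computing of the sort Mitchell studies is typically implemented as a genetic algorithm: candidate solutions are ranked by fitness, the fitter ones are recombined, and offspring are mutated. A minimal sketch with an invented toy fitness function (maximize the number of 1-bits in a genome); all parameters are arbitrary:

```python
import random

# Minimal genetic-algorithm sketch of "evolutionary computing": candidate
# solutions are recombined and fitter offspring survive. The toy fitness
# function (count of 1-bits) and all parameters are illustrative only.

random.seed(42)
GENOME_LEN, POP, GENERATIONS = 16, 20, 40

def fitness(genome):
    return sum(genome)                       # number of 1-bits

def crossover(p1, p2):
    cut = random.randrange(1, GENOME_LEN)    # splice two parents together
    return p1[:cut] + p2[cut:]

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                 # fitter half survives and breeds
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best), "of", GENOME_LEN, "bits set in the best genome")
```

Because the top half of each generation is carried over unchanged, the best fitness never decreases--the "hybrid offspring outperforming either parent" effect the article describes emerges from crossover plus selection.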

Robotics pioneer Rodney Brooks of MIT's Artificial Intelligence Lab argues in his new book, "Robot: The Future of Flesh and Machines," that there is a critical ingredient missing from robotics and AI that hampers their evolution--"something in all living systems, even a bacterium, which we have tried to emulate but are missing," he suggests. Brooks says we need "new stuff" that will define the distinction between living and non-living systems, and this could involve a new way of describing organization mathematically that could hypothetically transcend traditional computation. Brooks says this new stuff could be discovered by examining "the edges of where we don't quite understand things in biological systems;" the next step is to program robots or AI systems to simulate the biological model "and see where our understanding breaks down." He says his chief goal is to design a living machine, but building such a machine requires a detailed investigation into self-assembly, energy, metabolism, self-repair, and self-reproduction. Such a machine could be built from reformable polymer material. Brooks acknowledges that adaptable robots with human-like intelligence could develop "something like emotions," which could complicate hopes to employ such robots as ethical slaves that people can use without feeling guilty. Even if such robots cannot be used for labor, they can still be used to gain important knowledge about human beings, Brooks attests. He also admits that the development of AI and robots for military use such as weapons systems is a difficult issue, but deciding whether to prohibit them is not up to him.

The Six Sigma quality approach has companies aim to develop products and services with zero defects from the outset, both to stop time and money from being wasted on fixing flawed products and to prevent the refunds and customer losses that defects inevitably bring. The Smarter Six Sigma Solutions (S4) methodology can help achieve the strategic goal of generating cash and customers: Strategies to attain such goals are derived from two business metrics--satellite-level metrics (earnings per share, growth, sales, return on investment capital) and 30,000-foot-level metrics (costs, cycle time, inventory, efficiency, waste). The path from goals to projects that help achieve those goals can be mapped out with an S4 business strategy, which uses a DMAIC (define, measure, analyze, improve, and control) approach to manage those projects. So that managers will know the right questions to ask employees and be able to help them find the right answers, the voice of the customer needs to be addressed and tracked. IT departments can play a critical role in this process by converting information into vital knowledge and giving the supply chain complete access to the data. It is their job to devise effective data capture, storage, and delivery systems for users. Six Sigma techniques are moving out of the manufacturing sector and into all kinds of companies seeking to improve their business.
http://www.optimizemag.com/issue/007/management.htm
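The "zero defects" goal has a conventional quantitative meaning: a process running at six sigma, allowing for the customary 1.5-sigma long-term drift, produces about 3.4 defects per million opportunities (DPMO). That figure falls out of the normal distribution's tail:

```python
from math import erfc, sqrt

# Defects per million opportunities (DPMO) for a given sigma level, using
# the standard normal tail. The 1.5-sigma shift is the Six Sigma
# convention for long-term process drift.

def dpmo(sigma_level, shift=1.5):
    """DPMO implied by a sigma level under the conventional 1.5-sigma shift."""
    z = sigma_level - shift                  # effective distance to the spec limit
    tail = 0.5 * erfc(z / sqrt(2.0))         # P(X > z) for a standard normal
    return tail * 1e6

print(f"{dpmo(6):.1f} DPMO at six sigma")    # ~3.4
print(f"{dpmo(3):,.0f} DPMO at three sigma") # ~66,807
```

The contrast between the two outputs is the whole sales pitch: moving a process from three sigma to six sigma cuts the defect rate by more than four orders of magnitude.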