Though the Blaster worm that began spreading on Aug. 11 is not designed to be especially destructive, it has been enough of a headache to shut down government services such as the Maryland Motor Vehicle Administration, impede corporate networks, and threaten to cripple the Web site where users download the patch against it. Home computer users, who are notoriously slow to apply the latest security updates, are especially vulnerable; by SANS Institute estimates, Blaster has infected up to 100,000 computers. The worm exploits a vulnerability in recent versions of the Windows operating system--Windows NT 4.0, Windows 2000, Windows XP, and Windows Server 2003--to infect machines, instruct them to attack the Windows Update Web site beginning Aug. 16, and then seek out other unprotected systems. Aside from switching to Linux or another non-Microsoft OS, users can thwart the worm by installing the Microsoft patch that fixes a flaw in Windows' Remote Procedure Call software, or by deploying a firewall program to shield their Internet connections. Embedded within Blaster's code are taunts aimed at Microsoft CEO Bill Gates, such as, "Billy Gates why do you make this possible? Stop making money and fix your software!" Microsoft and the Department of Homeland Security warned of the Windows vulnerability almost a month ago. Blaster is programmed to keep attacking until Dec. 31, and thereafter to assault the Windows Update site on the 16th of each month. Critics have seized on Blaster's spread to argue that Microsoft's heightened security focus is ineffective, while Microsoft security VP Michael Nash counters that the incident demonstrates the strategy's viability, given that the company designed and issued a patch soon after the problem was recognized.
Click Here to View Full Article

The continued health of the U.S. supercomputing effort will be determined by a federal strategy that supports the research and development of both custom-built and commercially available supercomputing technologies in equal measure, according to a recent report from the National Academy of Sciences (NAS). "Balance is needed between exploiting cost-effective advances in widely used hardware and software and developing custom solutions that meet the most demanding needs," argues the NAS report. The study is one of three expected to support R&D of supercomputing clusters assembled from off-the-shelf components while beefing up custom technology R&D initiatives that could lead to systems capable of petascale computing. NAS Future of Supercomputing co-chair Susan L. Graham advises researchers to take the emergence of the custom-built NEC Earth Simulator with a grain of salt, arguing that there is still plenty of room for both clustered and custom supercomputers. Meanwhile, a report from the High-End Computing Revitalization Task Force is expected to outline a five-year strategy for providing a new source of funding for petaflops-scale computing and customized architecture R&D. The third report, put together by a group of high-level government consultants known as Jasons, will evaluate the Accelerated Strategic Computing Initiative (ASCI) program, which is supposed to deliver a 100 teraflop system by 2005 and a petaflop system by 2013. The Jasons are advising ASCI planners to allocate funds for multi-user capacity machines rather than focusing exclusively on high-end, single user capability machines. ASCI planners intend to evenly split their annual systems budget of $70 million to $100 million between the two supercomputing types within five years.
http://www.eetonline.com/sys/news/OEG20030812S0011

Columbia University researchers report in the journal Science the confirmation that six degrees of separation do indeed exist between any two people in the world, but contacting far-flung people through this chain of acquaintances is usually a futile exercise. The study involved over 60,000 people who attempted to contact one of 18 targets in 13 countries via email, but fewer than 2 percent of the 24,613 email chains that were initiated reached their targets. The successful chains were able to reach their targets in only four steps, while most unsuccessful attempts were thought to stem from someone in the middle of the chain failing to forward the message, possibly because the email was mistaken for spam; fewer than 1 percent of those who dropped out reported that they could not think of anyone to forward the message to, suggesting that apathy rather than a lack of connections broke most chains. Furthermore, "Just because President Bush is six degrees from me doesn't mean I'm going to be invited for dinner at the White House," notes Columbia sociology professor Dr. Duncan J. Watts. "You can ask a friend of a friend for a favor, but that's about it." The Columbia researchers estimate that 50 percent of the email chains could have reached their targets in five steps or less if the first sender and the target lived in the same country. Interestingly, almost half of the successful chains contacted a Cornell University professor whom Dr. Watts did not think of as highly socially connected; this phenomenon may be attributed to most of the participants being college-educated and more than 50 percent being American. Also notable was the fact that the social networks did not subscribe to the hub-and-spoke architecture of airline routes, and the successful email chains contained more "weak" links than the unsuccessful chains.
http://www.nytimes.com/2003/08/12/science/12MAIL.html (Access to this site is free; however, first-time visitors must register.)
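The short chains the study documents are the signature of "small-world" networks, in which a handful of long-range acquaintances drastically shrink the distance between otherwise far-apart people. A minimal illustrative sketch in Python (toy parameters, not the Columbia data): a ring of people who each know a few near neighbors, plus a sprinkling of random long-range ties, and a breadth-first search for the shortest acquaintance chain.

```python
import random
from collections import deque

def small_world(n=1000, k=4, shortcuts=200, seed=42):
    """Ring lattice: each node linked to its k nearest neighbors,
    plus a handful of random long-range 'acquaintance' edges."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    for _ in range(shortcuts):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    return adj

def chain_length(adj, src, dst):
    """Length of the shortest acquaintance chain from src to dst (BFS)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return dist[u]
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return None  # unreachable

adj = small_world()
lengths = [chain_length(adj, 0, t) for t in (250, 500, 750)]
print(lengths)  # a few random shortcuts keep every chain short
```

Without the random shortcuts, reaching the far side of a 1,000-person ring takes hundreds of hops; with just 200 of them, chains collapse to a handful of steps, which is the effect the experiment probed at world scale.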

Wireless sensor networks, also known as smart dust, are expected to revolutionize scientific research. In one smart dust experiment, small sensors linked to batteries, radios, and computers are scattered throughout a redwood forest in California, where they record temperature, humidity, and lighting levels; this is a far cry from conveying expensive equipment up a tree and extensively wiring the gear to a computer, all to gather data from a single location. The small computers embedded in the sensors can also process some of the data en route so researchers can study only the most critical information, thus streamlining analysis. The redwood experiment is part of a University of California, Berkeley, smart dust effort funded by the Defense Department, which is expecting to use the technology to enhance military operations. Observers say that smart dust technology will not spread until the sensors--which currently cost about $200 individually--come down in price. Other wireless sensor network projects being planned or implemented include Digital Sun's effort to create automated irrigation systems that water crops based on moisture readings; programs to preserve ancient Chinese cave artwork and the fortress of Masada; a National Science Foundation venture to predict the spread of wildfires by studying environmental data gathered by tiny sensors; and a UC-Berkeley project studying the effects of earthquakes on buildings' structural integrity. David Culler, director of UC-Berkeley's Intel Research Berkeley laboratory, forecasts that, eventually, "99 percent of the Internet is going to be billions of these little devices, streaming information." At UCLA's Center for Embedded Network Sensing, scientists aim to complement wireless sensor networks with miniature robots that would enhance research by snapping photos, conducting chemical analysis, and performing other functions.
Click Here to View Full Article

Although the National Conference of Commissioners on Uniform State Laws (NCCUSL) was pressured by broad opposition to stop campaigning for the state-by-state adoption of the Uniform Computer Information Transactions Act (UCITA), both opponents and supporters agree that the act is far from dead. Alan Fisch of Howrey Simon Arnold & White notes that there is little disagreement that existing Uniform Commercial Code is unsuitable for intangible transactions such as software licensing, but cautions that "for any new uniform code to be an effective solution, there must be widespread acceptance of its underlying principles, which is not true of UCITA." UCITA has been passed into law by the states of Maryland and Virginia, and vendors can apply those states' UCITA provisions to a software contract in other states. Furthermore, the lack of uniform governance for software licensing and online contracting gives courts the authority to resolve licensing matters, and UCITA is a template that the courts could use. Additionally, vendors can continue to push for state UCITA adoptions without NCCUSL's patronage. "I think we're going to remain vigilant," says Americans for Fair Electronic Commerce Transactions President Miriam Nesbit. Her organization will convene in September to review its options, one of which is continuing to support the adoption of bomb-shelter legislation that bans the application of Maryland or Virginia's UCITA laws outside those states. Thus far, North Carolina, West Virginia, and Vermont have passed bomb-shelter legislation.
Click Here to View Full Article

American tech workers are frustrated and disgusted that they are being forced to train replacements from India and other foreign lands brought in on visas such as the L-1. The L-1 classification permits companies with U.S. branches to send workers to America for as long as seven years, supposedly for their "specialized knowledge" or to train them in corporate culture--and allows employers to pay these professionals less than prevailing U.S. wages. The number of L-1s approved by the State Department rose almost 7 percent between the first six months of fiscal 2002 and that same period in fiscal 2003; Rep. Rosa DeLauro (D-Conn.) proposed legislation in July calling for an annual cap of 35,000 L-1 visas. Intel's Gail Dundas acknowledges that her company needs domestic workers to train L-1 workers so that the latter can competently run offices in Russia, China, and other emerging markets, but insists that American workers are not displaced. And Dan Larson of Texas Instruments says that decreasing numbers of U.S.-born science engineering graduates is forcing companies to look overseas, where tech talent is booming. Some labor experts advise programmers to quit complaining and redirect their energies to expanding their marketable skills, but out-of-work tech professionals retort that the offshore outsourcing of lucrative positions makes such efforts pointless.
Click Here to View Full Article

The growth rate of hard-drive data density appears to be slowing down, and analysts such as Ashok Kumar of U.S. Bancorp Piper Jaffray attribute this slackening to basic technological constraints. Currie Munce of Hitachi Global Storage Technologies estimates that the annual 100 percent growth rate data density has enjoyed in the last few years could slow to 40 percent to 60 percent over the next several years. Aftereffects of this trend could include price cuts, a shift in market dynamics from technology leaders to followers, and a refocusing on smaller, power-efficient drives for laptops and other popular items rather than increased density. Disk/Trend President Jim Porter says less demand for denser platters in desktop PCs is partly responsible for the falloff, while Munce adds that the read/write heads in hard drives are reaching their size limitations. Another factor Porter calls attention to is the "superparamagnetic limit," which is likely to become problematic as the disk's magnetic domains continue to shrink. In the meantime, companies are developing alternative data recording techniques--such as heat-assisted magnetic recording and perpendicular recording, a successor to today's longitudinal recording--to boost capacity. Still others are pursuing alternative data-storage technologies, such as magnetic RAM. Kumar says Samsung and other companies that devote more energy to low-cost manufacturing are more likely to benefit from the density slowdown than companies that depend on delivering cutting-edge technology for their revenue. On the other hand, the latter approach could give companies a competitive edge if the strategy yields a way to further miniaturize disk drive heads.
http://news.com.com/2100-1008_3-5061923.html

A researcher in the United Kingdom has created an eavesdropping tool to show how vulnerable Bluetooth-enabled laptops, mobile phones, and handheld computers are to someone looking to steal data from the devices. Ollie Whitehouse, who works for the computer security company @Stake, says his "Red Fang" software program can be used in a setting such as a train system to scan devices and determine whether they are unprotected. Whitehouse views his effort as being similar to Wi-Fi "wardriving," where people seek out poorly secured 802.11 wireless networks. Whitehouse says many people do not know that Bluetooth wireless technology is installed on their devices and that their security settings are often turned off. Red Fang was unveiled at the Defcon computer security conference in Las Vegas at the beginning of the month, and Bruce Potter, a security expert with U.S. think tank The Shmoo Group, improved the tool by making it more user-friendly. Potter expects Bluetooth security to become a growing concern, considering the high penetration of Bluetooth-enabled devices and the lack of knowledge about Bluetooth security in corporate security departments. A September 2002 report by Gartner indicates that many people do not activate Bluetooth's security features, which potentially exposes their devices to hackers.
http://www.newscientist.com/news/news.jsp?id=ns99994041

Ultrawideband (UWB) will be ready to move from its consumer electronics niche to the enterprise, once a task force of the IEEE's Wireless Personal Area Networking working group settles on UWB standards. The leading candidate for 802.15.3a standardization is the MultiBand Orthogonal Frequency Division Multiplexing (OFDM) Alliance's proposal supporting the encoding of data with 802.11a and 802.11g standards, and the division of available spectrum into multiple bands that can be employed concurrently to shield against interference, says Stephen Wood of OFDM backer Intel. Another 802.15.3a contender, championed by XtremeSpectrum and Motorola, encompasses the whole frequency band except for the 5 GHz range, in order to circumvent interference with military frequencies. The 802.15.3a standard is expected to operate at a rate of 110 Mbps at 10 meters and 480 Mbps at 1 meter, and some analysts suggest that such a breakthrough would make UWB a viable alternative to Bluetooth, though Wood cautions that at least four years will pass before technology based on 802.15.3a is on a par with Bluetooth's cost and availability. With two distinct flavors of the 802.15.3a standard emerging, Julius Knapp of the FCC's Office of Engineering and Technology promises that his agency "will move quickly to clarify any questions on how the rules are implemented currently." For instance, both proposal camps agree that the UWB compliance testing methodology is still murky. Even if 802.15.3a fails to supplant Bluetooth, the IEEE may accomplish the same goal with either the 802.15.3 or ZigBee specifications.
http://www.infoworld.com/article/03/08/08/31NNwireless_1.html

The two Mars Exploration Rovers currently en route to the red planet rely on mostly old hardware and a fragile data connection, but scientists are planning to make the probes carry out far more sophisticated maneuvers and experiments by swapping old software for new throughout the course of their mission. Dave Kleidermacher of NASA contractor Green Hills Software says the mission is split into smaller missions--one mission is navigation, another is landing, and a third is carrying out experiments on the surface of Mars. "The operational software for one mission will be uploaded and the operational software for the previous mission will be discarded since it is no longer necessary," he notes. The first software swap is slated for November, when the spacecraft will have traveled 100 million miles from Earth. NASA chose to incorporate old technology into the rovers--the same technology used in the 1997 Pathfinder mission, in fact--to increase the chances of success and avoid errors that led to the costly loss of the Mars Polar Lander and Mars Climate Orbiter in 1999. There was also a priority to launch a new Mars mission as quickly as possible, which shortened the development cycle, according to the rover program's flight software development manager, Janis Chodas. The Pathfinder hardware's track record was a plus for the conservative space agency, and boosting memory capacity and processor speed was not an option because power is a very precious--and limited--commodity on the spacecraft. Chodas says the rovers' radio link will give NASA engineers more time to test and polish software patches and fixes before they are transmitted to the probes.
http://www.wired.com/news/technology/0,1282,59983,00.html

A Georgia Institute of Technology project funded by the Defense Advanced Research Projects Agency has yielded detailed computer network simulations that can be accomplished almost in real time. "Our team has created a computer simulation that is two to three orders of magnitude faster than simulators commonly used by networking researchers today," boasts Georgia Tech project leader Richard Fujimoto. "This finding offers new capabilities for engineers and scientists to study large-scale computer networks in the laboratory to find solutions to Internet and network problems that were not possible before." Up to 1,534 processors were simultaneously employed by the Georgia Tech simulators to process the computation and model over 106 million packet transmissions in one second of clock time. Packet-level simulations are often used by engineers and scientists to design and study new networks and gain insights on denial of service attacks and other Internet-based developments. The considerable time investment required for current network simulation restricts most studies to a few hundred network elements such as servers, routers, and end-user machines. The Georgia Tech simulators are capable of modeling network traffic from more than 1 million Web browsers in close to real time. Such research is expected to contribute to the development of a more reliable, higher-performance Internet, according to Fujimoto.
http://www.gatech.edu/news-room/release.php?id=173
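Packet-level simulators of the kind described advance a virtual clock through a queue of timestamped events (packet arrivals and departures) rather than ticking in real time, which is why wall-clock speed depends on how fast events can be processed, not on how long the simulated traffic lasts. A minimal sketch of that event-driven core (hypothetical names and a single FIFO link queue; not the Georgia Tech code):

```python
import heapq

def simulate_link(arrivals, service_time):
    """Event-driven simulation of one router queue: packets arrive
    at the given times and are transmitted FIFO, each occupying the
    link for service_time simulated seconds."""
    events = [(t, "arrival", i) for i, t in enumerate(arrivals)]
    heapq.heapify(events)          # priority queue ordered by virtual time
    link_free_at = 0.0
    departures = {}
    while events:
        t, kind, pid = heapq.heappop(events)
        if kind == "arrival":
            start = max(t, link_free_at)          # wait if link is busy
            link_free_at = start + service_time
            heapq.heappush(events, (link_free_at, "departure", pid))
        else:
            departures[pid] = t                    # packet leaves the link
    return departures

# Three packets arrive faster than the link can drain them,
# so a queue builds and departures space out by the service time.
deps = simulate_link(arrivals=[0.0, 0.1, 0.2], service_time=0.5)
print(deps)  # {0: 0.5, 1: 1.0, 2: 1.5}
```

Scaling this kernel to millions of packets across hundreds of nodes, and spreading the event queue across more than a thousand processors, is essentially the parallelization problem the Georgia Tech team addressed.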

Researchers from Finland demonstrated the feasibility of "Fog Screen" technology at ACM SIGGRAPH's recent Emerging Technologies event. The Fog Screen prototype consists of a non-turbulent, laminar airflow within which a thin layer of dry, non-residual "fog" is injected. The fog serves as a perfectly flat surface that is protected from outside turbulence, and onto which images can be cast using any standard projection technology. The screen can be configured to appear translucent or opaque, and images maintain their quality regardless of whether they are projected from the front or rear. Researchers list television sets and walk-through ads as potential indoor applications for the Fog Screen, while an even more advanced application could be achieved through the deployment of fog walls within mixed reality and immersive computerized automatic virtual environment (CAVE) settings. The Fog Screen is quiet and unbreakable, making the technology suitable for automated public presentations. The Fog Screen's development team intends to unveil an even more sophisticated version of the technology at the Museum Centre Vapriikki and Tampere's Communications museum in 2004.
http://www.electronicsnews.com.au/articles/89/0c018c89.asp

The USENIX Association's recent Security Symposium in Washington was a showcase for federally funded academic research focusing on security threats to wired and wireless networks. Stanford University researchers demonstrated a timing attack, in which a hacker attempts to seize sensitive information by studying how long it takes a system to respond to certain queries; Stanford's David Brumley and Dan Boneh successfully extracted an OpenSSL private encryption key on an Apache Web server in about two hours, which is far less time than it would take a brute-force attack to accomplish the same objective. A process known as blinding, in which a random number is introduced into the encryption computation, can block the attack. The Stanford research was funded by the National Science Foundation. Another demonstration involved a denial-of-service attack launched against 802.11 wireless networks by researchers from the University of California at San Diego. John Bellardo first shut down traffic to a targeted notebook PC that was using the wireless network provided for the symposium, and then blocked traffic to most of the other notebooks in the conference room by spoofing deauthentication packets. The defense strategy against such an attack is to patch access points to delay a deauthentication packet for a few seconds to see if the user who seemingly requested deauthentication immediately transmits data. The UC San Diego project was underwritten by the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology.
http://www.gcn.com/vol1_no1/daily-updates/23053-1.html
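The blinding countermeasure mentioned above multiplies the ciphertext by r^e for a random r before the private-key exponentiation, then divides r back out afterward, so the timing of the slow operation no longer correlates with attacker-chosen input. A toy sketch with an artificially small RSA key (illustrative numbers only; not OpenSSL's implementation, and keys this size offer no security):

```python
import random

# Toy RSA parameters -- far too small for real use.
p, q = 61, 53
n = p * q                             # modulus: 3233
e = 17                                # public exponent
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (Python 3.8+)

def decrypt_blinded(c, rng=random):
    """RSA private-key operation with blinding: the exponentiation
    runs on c * r^e mod n, a value the attacker cannot predict."""
    while True:
        r = rng.randrange(2, n)
        try:
            r_inv = pow(r, -1, n)     # requires gcd(r, n) == 1
            break
        except ValueError:
            continue                  # rare: r shared a factor with n
    blinded = (c * pow(r, e, n)) % n
    m_blinded = pow(blinded, d, n)    # timing now independent of c
    return (m_blinded * r_inv) % n    # unblind: equals c^d mod n

m = 65
c = pow(m, e, n)
print(decrypt_blinded(c))  # 65 -- same result as plain pow(c, d, n)
```

The recovered plaintext is identical to an unblinded decryption; only the intermediate value the exponentiation touches changes from call to call, which is what starves the timing attack of its signal.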

Spintronics could be the key to building smaller and smaller electronic devices by allowing electrical current to pass through conducting materials without any dissipation of energy--and at room temperature. Theoretical physicists from the University of Tokyo and Stanford University report in the Aug. 7 issue of Science Express that they have discovered a spintronic equivalent of "Ohm's Law," the axiom stating that the application of a voltage to many materials generates an electric current, and electric energy dissipates as heat when the current encounters resistance. "Unlike the Ohm's Law for electronics, the new 'Ohm's Law' that we've discovered says that the spin of the electron can be transported without any loss of energy, or dissipation," explains Stanford's Shoucheng Zhang, who adds that well-entrenched semiconductor industry materials such as gallium arsenide exhibit this non-dissipation effect at room temperature. Until now, only superconductors operating at extremely low temperatures were known to sustain this effect. The report's authors anticipate that the application of an electric field will induce a current formed from the collective spins of the electrons; these spins, the spin current, and the applied field line up in three different directions that are all perpendicular to each other. Zhang says the next stage is to confirm the physicists' prediction and demonstrate the non-dissipation effect through close collaboration with experimental laboratories. The Stanford researcher predicts that spintronics and electronics could be evenly matched within a decade.
Click Here to View Full Article

Test-driven development (TDD), which draws on "agile" methodologies such as eXtreme Programming and novel approaches to software testing, can improve software significantly. The adoption of TDD does not make quality-assurance (QA) testers obsolete: Independent TDD consultant Ward Cunningham declares, "If you're the head of QA and you hear that your developers are working test-first, you should think, 'Good for them--now we can focus on the truly diabolical tests instead of worrying about these dead-on-arrival problems.'" TDD prescribes that tests be written before any nontrivial piece of production code--tests that the code will initially fail, and then pass once it is written properly. Because the tests embody specific uses of the software, they help coax out software design in a way that complements other methods; the tests also allow programmers to focus on refactoring without worrying about breaking the code. Furthermore, TDD enables developers to learn the actual rather than advertised behavior of unknown code, while the tests that result can ensure that components or services continue to follow established patterns of behavior as they evolve. TDD is not intended to supplant stress, load, reliability, and acceptance testing, so meshing TDD with these other methods via Cunningham's Framework for Integrated Test or some similar procedure is a wise move. Adopting both xUnit tools and test-first methods is a solution that promotes the preservation of the best software development practices.
Click Here to View Full Article
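The test-first cycle the article describes can be seen with any xUnit-family tool, here Python's unittest: write a test that captures the required behavior, watch it fail for want of an implementation, then write just enough code to make it pass. (The slugify function below is a made-up example for illustration, not something from the article.)

```python
import unittest

# Step 1: write the tests first. Run before the function exists,
# they fail -- proving they can actually detect the missing behavior.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(slugify("  Trim me  "), "trim-me")

# Step 2: write the simplest code that makes the tests pass.
def slugify(title):
    return "-".join(title.lower().split())

# Step 3: re-run the suite; green tests now guard any refactoring.
if __name__ == "__main__":
    unittest.main(exit=False)
```

Each pass through the loop leaves behind an executable specification, which is what lets QA staff skip the "dead-on-arrival" checks Cunningham mentions and concentrate on harder cases.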

As computers become essential research tools in both academia and industry, the value of IT experts is rising, although traditional researchers still get the lion's share of the recognition. GlaxoSmithKline's Andrew Davies notes that "The boundary between the workbench scientist and the IT support staff is increasingly blurred," and the presence of programmers, network administrators, and other IT experts in scientific research and development laboratories is growing. Most insiders concur that R&D IT workers are driven by a fundamental interest in science. Their contribution is critical to "distributed system" programs in which researchers tap into the idle computing power of many interconnected machines to search for intelligent alien life, detect flaws in electronic encryption, and study disease, among other things. The U.S. National Science Foundation's TeraGrid project seeks to build a high-speed supercomputing network distributed across five campuses and capable of 20 teraflops; the project will support subatomic simulation, protein modeling, jet aircraft design, and other initiatives, with expertise and elements such as 3D data visualization systems provided by IT specialists. Other supercomputing efforts of note include the U.S. Department of Energy's Scientific Discovery through Advanced Computing program, which thus far encompasses about 80 projects, and Cambridge University's Cosmos Project, which employs the new CosmoGrid machine. "These projects are all new, so the time is right for a career in this field," says Cosmos Project director Dr. Paul Shellard. Increasing availability of university courses is making it easier to become an IT expert.

IT companies are backing basic research in physics and other fields in order to foster innovations that can help bring smaller and smaller IT components to market faster, so that performance gains are accompanied by price reductions. Companies can only be first to market if they possess a deep knowledge of scientific frontiers, and this can only be attained through exploratory research; at the same time, researchers must familiarize themselves with the quirks and requirements of product development. Successfully reaching these goals depends on organizations undertaking both product development and long-term exploratory research. IBM has a very successful long-term research model, which is currently focused on information processing, storage, and communication through investigation of novel materials, devices, and algorithms. Physics plays a major role in such research efforts: The move toward nanodevices and workable nanomanufacturing methods with effective and economical error-correction processes gives companies opportunities and incentives to delve deeply into the physics of natural pattern formation, or self-assembly. Projects that IBM and others are pursuing in this direction include research into self-assembly to reduce reliance on costly lithography, and the generation of more nanostructured substances and how they could be used to improve IT hardware. Many IT industry products and services are based on algorithms designed to recognize patterns in complex data, and these algorithms are derived from research into a variety of theories, including those governing statistical physics, dynamical systems, information, control, and mathematical optimization. IBM has embarked on computational biology and bioinformatics initiatives, with a heavy emphasis on physics, to help solve problems faced by life sciences customers. Another intense research area is quantum physics, whose theories form the basis of quantum computers and other potentially revolutionary technologies.
http://www.physicstoday.org/vol-56/iss-7/p44.html