A massive cascading power outage extending across the Northeast from New York to Canada and west to Detroit illustrates the vulnerability of U.S. critical infrastructure to both spontaneous failures and premeditated attacks, according to national security experts. Authorities say there is no evidence that either terrorists or the Blaster computer worm is responsible for the incident. However, a fire at a New York-based power facility was reported; should the fire be determined to be the source of the disruption, it would be a vivid reflection of the grid's interdependence, says Roger Cressey, former chief of staff of the President's Critical Infrastructure Protection Board. Paula Scalingi, onetime director of critical infrastructure protection at the Department of Energy, adds that work toward a more resilient infrastructure is underway, but the effort is unfocused and has little sense of urgency. A recommendation for more "intelligent and adaptive" electric-power grid systems was made by the National Research Council in a 2002 report, which concluded that damage to or destruction of key grid elements could trigger a cascading failure and bring down a regional transmission grid. A core component of such an adaptive grid would be adaptive islanding, in which sensors and control mechanisms isolate portions of the grid. Private-sector cybersecurity consultants have long cautioned that linking Supervisory Control and Data Acquisition (SCADA) systems to corporate local area networks--spurred by the power industry's desire to boost statistical tracking and surplus grid capacity sales--increases the grid's vulnerability. A January 2001 white paper from Riptech stated that "The security strategy for utility corporate network infrastructures rarely accounts for the fact that access to these systems might allow unauthorized access and control of SCADA systems."
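The adaptive-islanding idea above can be sketched in miniature. The toy simulation below uses an invented three-line grid with made-up load and capacity figures; real grid control involves power-flow physics not modeled here. It shows only the core logic: when one line trips and redistributing its load would overload a neighbor, the control system deliberately isolates that section rather than letting failures cascade.

```python
# Toy sketch of adaptive islanding. Grid topology, loads, and the
# capacity threshold are all invented for illustration.

def redistribute(lines, tripped):
    """Shift the tripped line's load evenly onto the remaining lines."""
    share = lines[tripped] / (len(lines) - 1)
    return {name: load + share
            for name, load in lines.items() if name != tripped}

def island_if_needed(lines, tripped, capacity):
    """Return (surviving_lines, islanded_lines) after a line trips.

    If redistribution would push a surviving line past capacity,
    isolate (island) that line instead of letting it fail in turn --
    the essence of adaptive islanding.
    """
    after = redistribute(lines, tripped)
    islanded = {n for n, load in after.items() if load > capacity}
    surviving = {n: load for n, load in after.items() if n not in islanded}
    return surviving, islanded

lines = {"A": 80.0, "B": 60.0, "C": 40.0}   # current load per line (MW)
surviving, islanded = island_if_needed(lines, tripped="A", capacity=90.0)
print(sorted(islanded))    # ['B'] -- deliberately isolated before it overloads
print(sorted(surviving))   # ['C'] -- keeps running
```

The point of the sketch is the decision rule, not the numbers: sensors detect that line B would exceed capacity and cut it loose preemptively, so C continues serving its island.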
Click Here to View Full Article

The rapid spread of the Blaster worm has raised fears among security experts that a more malicious form of intrusion is in the offing: The installation of "back doors" in the victimized computers that hackers could use to pillage systems of their sensitive information long after an initial cyber-assault has stopped. "The information security threat is no longer properly characterized by the 'caffeine crazed' hacker out to prove his or her technical prowess," states a 2002 World Bank report. Experts such as John Frazzini of iDefense say that more and more of these intrusions are the handiwork of criminal syndicates that operate primarily out of Asia, Brazil, eastern Europe, and Russia. The Blaster worm seems to be the simple work of an individual, one who appears to be using it as a way of telling Microsoft to improve the security of its software, according to World Bank study co-author Valerie McNevin. However, another analyst warns that a Blaster spinoff has emerged that may contain Trojan-horse code. Such code is the most prevalent way to turn infected systems into "sleeper cells" that can be instructed to wreak mayhem at a hacker's whim. Experts note that cyber-intrusions into financial systems are not usually publicized out of fear of unsettling the financial firms' customers.
Click Here to View Full Article

Despite the blackout that cut power throughout much of the northeastern United States Thursday evening, Internet connections, wireless networks, and communications lines kept working thanks to backup power supplies. Equinix, which operates data centers that carry over 90 percent of the world's Internet routes, lost power in its three New Jersey data centers but immediately switched to backup generators and reported "no disruptions whatsoever." Level3 Communications, a big seller of wholesale dial-up services to U.S. and Canadian ISPs, said backup power generators kicked in when the power went out, ensuring that its large Internet backbone remained online. The generators can keep going until Saturday, after which battery backup power is available for another 45 hours, says Level3's Paul Lonnegren. He says, "I don't see any big disaster with the Internet going down." AT&T also said that its various network services remained online when the power went out; nevertheless, many companies reported that some communications traffic was disrupted by heavy volume.
Click Here to View Full Article

The product of a five-year research effort by Sandia National Laboratories scientists is a cognitive machine that can intuit a user's desires, recall experiences with users, and enhance human decision-making and situational analysis with simulated specialists. The original objective of the research was to create a computer/software program that reasons in much the same way as a human being. Key to this breakthrough was overcoming two limitations of traditional modeling software: it follows a logical processing pattern rather than drawing on experiences and associative knowledge, and it fails to account for organic variables such as emotions, stress, and fatigue that regularly affect human cognition. The participation of Sandia's Robotics department broadened the project's scope to include intelligent machines. "In the long term, the benefits from this effort are expected to include augmenting human effectiveness and embedding these cognitive models into systems like robots and vehicles for better human-hardware interactions," explains John Wagner of Sandia's Computational Initiatives Department. "We expect to be able to model, simulate, and analyze humans and societies of humans for Department of Energy, military and national security applications." Through its Next Generation Intelligent Systems Grand Challenge project, Sandia expects to dramatically boost humans' ability to comprehend and tackle national security problems as information expands exponentially and environments grow more complex, says principal investigator Larry Ellis. The Grand Challenge involves the integration of unique perceptive methods and cognitive computing so that engineers, analysts, decision-makers, and others can better extract meaning from massive amounts of disparate information.
http://www.newswise.com/articles/view/?id=500498

The outcome of SCO Group's $3 billion copyright infringement suit against Linux software distributor IBM has the potential to significantly bolster or hobble the free-software movement. IBM has countered that SCO was also a Linux distributor for a number of years, and therefore essentially a signatory of the General Public License (GPL), which authorizes the free duplication of any software it covers, as well as any works derived from that software. SCO lead attorney Mark Heise says his client will challenge the GPL's validity and allege that the license violates federal copyright law. If the court rules in favor of SCO, free-software advocates worry that the decision will grant software developers the right to demand remuneration from companies that use GPL-licensed software, as well as impede the development of new programs. The GPL has helped nurture the generation of thousands of programs employed in important capacities by government agencies and major corporations. Many computer companies urge their clients to embrace open-source software partly to challenge Microsoft's attempt to dominate the corporate server-computer sector. Should a judge disqualify the GPL, free-software developer Eric Raymond says open-source proponents have devised an alternative license, one supported by Linux creator Linus Torvalds. Some free-software boosters think that IBM's defense of the GPL will help validate the license.

The Oct. 7 recall vote in California could boost the controversy surrounding electronic voting machines, which critics argue are vulnerable to tampering despite reassurances from manufacturers and advocates of such systems. A July report from Johns Hopkins and Rice University researchers concluded that touchscreen voting machines from Diebold Election Systems are insecure: After examining an anonymously published version of the Diebold machines' source code, Johns Hopkins' Avi Rubin determined that a crafty teenager could modify the Windows-based system so he could vote more than once. Furthermore, "smartcards" used by voting administrators feature a simple default PIN that hackers could easily guess, and Rubin also noted that Diebold's machines leave no printed audit trail, negating the possibility of a valid recount. SRI International scientist Peter G. Neumann commented that touchscreen users are required to scroll through numerous screens to view all the candidates, and was puzzled that a search command is not a standard feature in such systems. Riverside County registrar of voters Mischelle Townsend called such criticism irresponsible, considering how commonplace computerized systems are, and insisted that there are enough checks and balances to prevent tampering. However, Stanford University computer science professor David Dill argues that "We should put in the safeguards as soon as possible--especially in an election that's going to be so complicated and difficult." Meanwhile, electronic voting backers such as the Leadership Conference on Civil Rights are reconsidering their position as a result of the Johns Hopkins-Rice report.
Click Here to View Full Article

Researchers from the University of Virginia, Stanford University, the University of California at Santa Barbara, and Microsoft have created prototype software that can provide 3D exploded views of digital architectural models for use in computer games, training simulations, and other areas. The software outlines each layer of the architectural model, such as the individual stories of a high-rise, and then alters the program's graphics output to separate each layer and furnish the exploded perspective. Users can opt for different viewpoints and spacing between layers. The vertical and horizontal dimensions of the 3D objects are drawn to scale, while curves and diagonals are distorted. In this way, the interiors of all the building's floors can be viewed at once, and relative positioning among all the model's details is retained. The software could be used to monitor team interaction thanks to the bird's-eye view that is provided, says University of Virginia researcher Christopher Niederauer. "Military strategists and even police could potentially use our software...to aid in planning group interactions within a building," he notes. The software incorporates Chromium, a program that enables the exploded view visualization component to modify 3D graphics programs on the fly. Michael Ashikhmin of the State University of New York at Stony Brook says the software could be used to improve interactive multiplayer games without having to change the existing code.
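The layer-separation step described above can be illustrated with a small geometric sketch. The floor heights and spacing value here are invented, and the researchers' actual system works by intercepting a running program's graphics stream via Chromium rather than by editing model data; only the underlying idea--translate each storey upward by an extra gap while keeping its dimensions to scale--is shown.

```python
# Illustrative computation of exploded-view layer placement.
# Floor heights and spacing are made-up values.

def explode(layers, spacing):
    """Given floor heights (bottom to top), return (z_offset, height)
    pairs with `spacing` units of empty gap inserted between floors."""
    placed, z = [], 0.0
    for height in layers:
        placed.append((z, height))
        z += height + spacing   # next floor starts above the gap
    return placed

floors = [3.0, 3.0, 4.0]            # storey heights, drawn to scale
placed = explode(floors, spacing=2.0)
print(placed)   # [(0.0, 3.0), (5.0, 3.0), (10.0, 4.0)]
```

Because each floor's footprint and height are unchanged and only the vertical offsets grow, relative positioning within every floor is preserved, which matches the article's description of the exploded perspective.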
Click Here to View Full Article

The Defense Advanced Research Projects Agency (DARPA) boasts an impressive track record of wildly successful, revolutionary technologies (the Internet, stealth technology, the computer mouse) that led to new industries and lifestyle changes, but the flipside to these successes is an awe-inspiring list of astoundingly ridiculous concepts that never panned out (a robot elephant, telepathic agents, and the much-maligned terrorism futures market). All of these ideas are the products of radical thinking, which is essential to DARPA's continued vitality and credibility. Fostering an innovative attitude, recruiting talent from industry and academia, and giving these researchers a long leash have been key to DARPA's most amazing and wide-reaching breakthroughs. But though DARPA is lauded and respected for its triumphs, in recent times the looming threat of terrorism and other factors--and the projects DARPA is considering to counter such dangers--have made many of the agency's biggest advocates anxious. Director of the University of Texas' 21st Century Project Gary Chapman warns that "DARPA has become a scary sandbox for people whose objectives many Americans disagree with." The agency courted controversy and derision for the Terrorism (formerly Total) Information Awareness project, which was designed to extract signs of possible terrorist activity from a vast database of information on both Americans and foreigners; placing Iran-Contra scandal figure John Poindexter at the head of the project added fuel to the critics' fire. Current DARPA director Anthony J. Tether is concerned that the distrust engendered by such projects could prompt Congress to more thoroughly regulate DARPA, which could hurt the agency's enterprising spirit.
Click Here to View Full Article (Access to this site is free; however, first-time visitors must register.)

A Defense Advanced Research Projects Agency (DARPA) program to reduce battlefield casualties by allowing medics to access holographic models of soldiers' bodies via a chip in their dog tags could be extended to all U.S. civilians in perhaps a decade, according to DARPA officials. Virtual Soldier program manager Dr. Richard Satava says the animated, updateable holograms will be based on the person's anatomy and physiology, but many medical technologists are skeptical that the initiative will yield any tangible results, given the formidable funding and processing power requirements of the project. The "holomers" Dr. Satava envisions would be based on magnetic resonance imaging (MRI), computed tomography (CT), X-ray, and ultrasound scans, with each new scan providing an automatic update. Holomers and other medical advances may be able to facilitate instantaneous and automatic diagnoses of wounded soldiers, allowing medical personnel to treat injuries faster and with more accuracy, thus saving more lives. Dr. Satava also thinks holomers could be used to simulate the effects of aging. "You'll be able to do a Dorian Gray for yourself, to show what you'll be like at 65," he states. Dr. Satava believes holomers could complement another DARPA project--LifeLog, an initiative to capture all aspects of a person's life as a sort of computerized diary. However, LifeLog and other controversial DARPA projects have riled privacy advocates in recent months, although Lee Tien of the Electronic Frontier Foundation says the Virtual Soldier effort is in too early a development phase to consider its privacy implications.
http://www.wired.com/news/medtech/0,1286,60016,00.html

Many of the technologies showcased at this year's SIGGRAPH conference were most notable for their film industry applications, particularly in the field of visual effects, where realistic character animation is gaining importance. On hand at ACM's SIGGRAPH 2003 was Armin Bruderlin of Sony Pictures Imageworks, who talked about his company's pioneering efforts to create software for simulating fur and human hair for films such as "Stuart Little" and "Spider-Man 2." Industrial Light & Magic (ILM) gave a presentation about the particle-level-set (PLS) method its artists developed for "Terminator 3," wherein the molten metal effect of the film's villain was rendered using existing technology; PLS was selected because it facilitates explicit fluid control, precise fluid generation, complex collisions, texturing, and a rapid turnaround time. Eyetronics highlighted facial performance capture software used in several summer movies, but the tool can also be applied to archeology and the medical sector. Also represented at the conference were two West Coast institutions, the Bay Area's Ex'pression Center and Oregon 3D of Portland: The former offers a two-year degree program in digital visual arts and sound arts, while the latter offers continuing professional education for industry artists. Bauhaus demonstrated its Mirage software, touted as a cost-effective tool that features the advantages of After Effects and Photoshop while lowering render time. SIGGRAPH's Guerilla Studio allowed kids to try out new technologies, while the Emerging Technologies Exhibit spotlighted fun and innovative tools such as the Fog Screen, a walkthrough display, and Toshitaka Amaoka's +1D, which renders 2D real-time video imagery as 3D computer graphics. Many visual effects firms--ILM, PDI/DreamWorks, and ESC Entertainment among them--used SIGGRAPH to court prospective CG artists, though there may not be enough qualified applicants to fill all the jobs currently available.
http://www.uemedia.com/CPC/article_11071.shtml

Public elementary and secondary schools in the United States are beginning to see the potential of the higher-speed, next-generation Internet2, and projects are underway to establish how educators can avail themselves of the technology's benefits. One such project is the Internet2 K20 Initiative directed by Dr. Louis Fox of the University of Washington. Fox says the goal of the initiative, which would recruit innovative people in a variety of fields to flesh out Internet2-related curricula, is to "figure out which [Internet2] technologies are useful for schools and which ones are not very useful." High-speed digital videoconferencing is one application with a broad appeal: The technology would allow far-flung teachers to convene in a virtual classroom and run a course that would be distributed to other schools. Educators would also be able to take classes remotely on Internet2--in fact, such classes are currently available on the network in the state of California. And the network will help teachers devise multimedia presentations that can be sent to colleagues throughout the country. University of California, Berkeley, research astronomer Carl Pennypacker adds that increasing numbers of Internet2-connected schools will foster the development of Internet2 applications for children. Approximately 8 percent of U.S. public elementary and secondary schools are currently linked to Internet2 through its Abilene network backbone, but interest is growing among educators still smarting from their sector's late adoption of the first-generation Internet.
Click Here to View Full Article (Access to this site is free; however, first-time visitors must register.)

Spam and virus-writing have begun to converge in recent months, according to email security companies and experts such as Trend Micro's David Perry. Chris Miller of Symantec notes that spam and viruses "have a common goal--to do what they're doing without being seen." Evidence suggests that spammers are turning to virus-writing methods to boost their messages' chances of penetrating recipients' systems, while virus authors are adopting bulk emailers' tactics to give their malicious code the widest possible reach in the shortest amount of time. Experts are drawing parallels between the Webber and Sobig.E worms, which could be "spam zombies" that lurk on infected PCs until instructed by spammers to transmit vast quantities of emails that are untraceable; the worms convert victim computers into relay servers, which mask the email's point of origin. Spammers could use such viruses to commandeer a user's address book and use the PC as a launching pad for even more spam. In one documented case, a major aircraft manufacturer was recently sent spam that directed users to Web pages where surveillance software was unwittingly downloaded onto their computers, giving an outside party (or parties) a window onto users' online activities; such software can also collect data about email addresses, credit-card numbers, and passwords. These "drive-by downloads" take advantage of users' tendency to assume such downloads are a routine function of the Web site. Perry and other experts believe that the convergence of spam and malicious code will require any program with malicious intent to be classified as malware.
Click Here to View Full Article

Stanford researchers have launched a search technology startup, Kaltix, that is developing technology to improve upon the PageRank method that made Google famous and could achieve a breakthrough in cutting-edge Web search "personalization" technology. Kaltix technology searches groups of sites instead of the entire Web at once, speeding searches almost 1,000-fold, according to published research by the company's founders. That makes PageRank applicable to individuals, since it otherwise takes days to compute all the Web's linkages, according to Moreover President Jim Pitkow, formerly of search startup Outride, which was bought by Google two years ago. "If they've been able to take a computational block and remove it, that opens up new opportunities," says Pitkow. All the major search companies--MSN, Yahoo!, and Google--are trying to create competitive differentiation through search personalization. However, implementing personalized searches involves notifying users about what information is being collected and how it is used, as well as allowing them to adjust their profiles. Yahoo! already leverages information gathered from its registered users to present them with localized weather and news of interest to them. AltaVista also employs technology that guesses a user's location from their IP address and delivers contextualized search results, returning soccer sites for a British search on "football," for example. Still, Search Day editor Chris Sherman says user demand may not yet justify the costs of personalized search capabilities, which could give a company an edge in the advertising-sponsored search business, expected to be worth as much as $8 billion in five years. Sherman says, "Personalization is one of the holy grails for search...When it comes out of the lab and what flavor it takes are the big questions."
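To make concrete the computation being sped up, here is a standard PageRank power iteration over a tiny invented link graph. Kaltix's actual contribution (computing ranks over groups of sites and combining them, rather than iterating over the whole Web) is not reproduced here; this is only the baseline algorithm whose Web-scale cost the article says runs to days.

```python
# Standard PageRank power iteration on a three-page toy graph.
# The graph and damping factor are illustrative only.

def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]}. Returns {page: rank}; ranks sum to 1.
    Assumes every page has at least one outlink (no dangling nodes)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            for q in outs:          # each page shares its rank among its outlinks
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
rank = pagerank(links)
print(max(rank, key=rank.get))   # "c" collects the most link weight
```

Each iteration touches every link in the graph, so the cost grows with the Web's billions of links times the iterations needed to converge, which is why restricting the computation to a group of sites makes per-user ranking feasible.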
http://news.com.com/2100-1024-5061873.html

The Sensor Web Project at NASA's Jet Propulsion Laboratory (JPL) was established to provide a way to constantly monitor the harsh Martian surface for signs of life. After a five-year development period, a Sensor Web has been deployed in Antarctica in preparation for a Mars mission. NASA's Sensor Web differs from other sensor networks because each pod constantly processes and interprets data on its own instead of simply routing information, as in mesh networks. Each pod in the network is outfitted with cellular communications, acoustic sensing components, and a microcontroller that can complete computing tasks. Together, the pods gather and analyze environmental data. JPL Sensor Web Project leader Kevin A. Delin says a constant presence would be more likely to discover life on Mars, since any organisms there probably emerge only when environmental conditions allow liquid water, for example. Another application would be in agriculture, where a Sensor Web could control irrigation more intelligently than other automated systems. When rain is imminent, a Sensor Web would know from atmospheric readings and refrain from irrigating. Or, a Sensor Web could learn to adjust for terrain features that leave higher portions of fields drier than other parts. Delin says the Sensor Web's intelligence benefits from its parallel-processing architecture, which mimics the processing architecture of the brain.
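The irrigation example above can be sketched as a simple decision rule. Everything here is invented for illustration--the thresholds, the report fields, and the majority-vote scheme--and JPL's pods perform far richer shared analysis; the sketch only shows the flavor of a pod combining its own reading with neighbors' atmospheric data before acting.

```python
# Toy decision rule for a hypothetical irrigation pod. All field
# names and thresholds are made up for illustration.

def should_irrigate(soil_moisture, neighbor_reports,
                    dry_below=0.30, rain_humidity=0.85):
    """Irrigate only if the soil is dry AND the pods do not
    collectively indicate that rain is imminent."""
    if soil_moisture >= dry_below:
        return False                      # soil already wet enough
    rain_votes = sum(1 for r in neighbor_reports
                     if r["pressure_falling"] and r["humidity"] > rain_humidity)
    rain_likely = rain_votes > len(neighbor_reports) / 2
    return not rain_likely                # skip watering if rain is coming

reports = [
    {"pressure_falling": True,  "humidity": 0.90},
    {"pressure_falling": True,  "humidity": 0.88},
    {"pressure_falling": False, "humidity": 0.60},
]
print(should_irrigate(0.20, reports))   # False: most pods say rain is imminent
```

The contrast with a conventional timer-based system is that the decision depends on interpreted data shared across the network, not on a fixed schedule.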
http://www.pcmag.com/article2/0,4149,1213107,00.asp

A number of American research universities have launched electronic databases to give exposure to unpublished scholarly works and to archive data that could otherwise be lost, such as the data accumulated by a professor about to retire. Getting scholarly works published in a journal can traditionally take months or years because the works must undergo peer review, and the journals themselves can cost $5,000 or even $15,000 annually. DSpace is the digital archive at the Massachusetts Institute of Technology (MIT), and the university estimates the system's free software has been downloaded approximately 3,400 times. Each MIT department determines what works will be included in DSpace based on its own criteria; the university expects to collect 5,000 scholarly works by autumn and 7,500 later in 2003. Rick Johnson, a director of the Scholarly Publishing and Academic Resources Coalition, says digital databases offer "the best of both worlds," although he does not believe the digital version will replace the offline format. Johnson believes online databases offer the recognition of a journal while allowing access to a wider audience. The New England Journal of Medicine, meanwhile, is mulling whether to publish digitally archived works, which is currently prohibited, according to Jeffrey Drazen, the journal's editor in chief. A coalition of universities (Columbia, Ohio State, Rochester, Washington, and Toronto) has now teamed up with MIT to test what the database can accomplish, such as bilingual searching.
http://www.nytimes.com/2003/08/03/edlife/03EDTECH.html (Access to this site is free; however, first-time visitors must register.)

In his book, "Black Ice: The Invisible Threat of Cyber-Terrorism," author Dan Verton deals with the vulnerability of the United States' critical infrastructure to cyber-attacks. The book posits that the Sept. 11, 2001 attacks in New York were "the worst cyber-terrorist attack in history"; it also outlines a seamless connection between physical security and cybersecurity, argues that Al-Qaeda is a likely cyber-terrorist candidate, and notes that many U.S. Web sites offer terrorists a wealth of data about potential targets. Verton cites a National Security Agency exercise demonstrating that hackers, using tools available to anyone online, could hobble the U.S. military's Pacific theater command and control system. Worse, the same attack strategy could conceivably disrupt a significant portion of the private-sector infrastructure in the United States. Verton reports that an average large utility company suffers about 1 million cyber break-in attempts annually, many of which appear to be supported by Middle East-based organizations. The key point of vulnerability within many private-sector industries is unintentional Internet connections between corporate networks and Supervisory Control and Data Acquisition systems, and this vulnerability has increased as operations migrate to the Internet to boost the bottom line or because of deregulation. Verton says deregulation is also responsible for increased consolidation in the telecommunications sector, which has resulted in vast numbers of new infrastructure linkage points that cyber-terrorists can take advantage of; furthermore, by concentrating operations in fewer data centers, telecom providers have boosted the Internet and communications infrastructure's susceptibility to physical assaults that would leave many companies crippled.
Another effect of all this deregulation is the creation of multiple points of potential failure in the financial industry infrastructure, which is also being weakened by centralized operations resulting from a flurry of mergers and acquisitions. Verton contends that the private sector is in a state of denial, an attitude fostered by federal policy that has allowed market forces to shape security investments.
Click Here to View Full Article

Falloffs in certain tech markets do not signal an atrophy of IT growth in general, because technology maturation paves the way for new markets, and underlying disruptive technologies such as computer chips, disk-drive capacity, and Internet-connection speed are advancing at an exponential rate. Still, IT spending is unlikely to return to the record-breaking growth rates of the dot-com boom, because many buyers are now demanding products that promise returns within six months. As customers become more value-conscious, they exert more influence on suppliers, which means producers will need to devote most of their energies to core strengths and farm out their remaining operations to outsourcers. Suppliers must also strive to ensure their products are glitch-free and interoperable prior to shipping. The tremendous opportunity consumers represent can only be leveraged if tech companies redirect progress toward making cheaper products. Technologies likely to land the most new customers are those that boast more cost-effectiveness and ease of use--technologies such as Web services, smart phones, radio frequency ID chips, and wireless networks. Weathering the sales downturn will depend on companies embracing new business models and setting up massive economies of scale, but a major challenge lies in making venture capitalists more confident about investing in innovative technologies; many variables--legal and regulatory issues, security and privacy concerns, and other factors--have eroded VC confidence. Recessions such as the current tech downturn are the turning point of the boom-and-bust cycle typical of all technological revolutions.
Click Here to View Full Article

Minute sensors derived from advancements in the fields of microelectromechanical systems (MEMS) and nanotechnology promise to transform the manufacture and maintenance of products, and some vendors see unlimited potential applications when such devices are integrated with diagnostic software and wireless communication infrastructure. MEMS- and nano-based sensors appeal to very different markets: In-Stat/MDR senior analyst Marlene Bourne observes that "Nanotechnology is a material science and MEMS is an engineering technology." No matter what application sensors are used for, their primary purpose is to emit electrical or biological impulses from which data is collected, and the wide-scale industry adoption of sensor technology will act as a catalyst for a manufacturing revolution. Applied Nanotech CEO Dr. Zvi Yaniv reports that "nanotechnology is crucial for...understanding how to magnify and accelerate to get good signal data that can then be exploited by controllers and computers," and he expects most nano-based sensors will be designed to detect indications of chemical, biological, and physical reactions such as changes in smell, humidity, light, and heat. MEMS research is much further along than nanotech research, and MEMS sensors are smaller and less expensive than traditional complementary metal oxide semiconductor-based sensors. The other major advantage of MEMS sensors is their sensitivity to vibrations, which is what makes the technology potentially useful for the manufacturing sector. MEMS sensors could, for example, be integrated with industrial machinery to detect manufacturing defects early in the assembly process. Dale Gee of GE Industrial Systems cautions that entering the nano- and MEMS-based sensor market is a risky proposition, given that the market is already overcrowded.
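The defect-detection use case above can be sketched with a minimal example. The signal values, baseline, and threshold multiplier are invented; a real MEMS-based condition-monitoring system would use calibrated sensors and more sophisticated signal analysis, but flagging machine cycles whose vibration energy departs from a baseline is the basic idea.

```python
# Toy vibration-based defect flagging, as a MEMS sensor on industrial
# machinery might do. All sample values and thresholds are made up.
import math

def rms(samples):
    """Root-mean-square amplitude of one cycle's vibration samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def flag_defects(cycles, baseline_rms, multiplier=2.0):
    """Return indices of cycles whose vibration RMS exceeds the
    baseline by the given multiplier -- candidates for early defects."""
    return [i for i, cycle in enumerate(cycles)
            if rms(cycle) > multiplier * baseline_rms]

cycles = [
    [0.1, -0.1, 0.12, -0.09],    # normal vibration
    [0.5, -0.6, 0.55, -0.48],    # abnormal vibration -> possible defect
    [0.09, -0.11, 0.1, -0.1],    # normal vibration
]
print(flag_defects(cycles, baseline_rms=0.1))   # [1]
```

Catching the anomalous cycle at index 1 during assembly, rather than at final inspection, is the kind of early detection the article attributes to vibration-sensitive MEMS devices.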