The ACLU used an Aug. 25 press conference to question the Transportation Security Administration's (TSA) proposed Computer Assisted Passenger Pre-screening System II (CAPPS II), which remains a magnet for criticism despite revisions the TSA made in response to public outcry. CAPPS II, as it currently stands, would have airline passenger reservation systems collect each traveler's name, address, phone number, and birthday; the TSA would send this data to commercial data aggregators, who would produce an authentication score that determines passenger ID confidence levels. Passengers would then go through a "risk assessment function" and be assigned a risk score. Speaking at the press conference, Laura Murphy of the ACLU asked how passengers misidentified by CAPPS II as terrorists or other criminals would be able to clear themselves and avoid pre-boarding searches, and criticized the TSA for not being more forthcoming about the system's risk score criteria. "Credit records, highly personal medical, business, educational and mental health information could still conceivably be mined, but in the secret world of national intelligence and federal law enforcement agencies," she warned. Former Rep. Bob Barr (R-Ga.) said even an error rate of 1 percent or 2 percent would generate millions of false positives annually, given the huge numbers of air travelers passing through airports daily. TSA administrator Adm. James Loy and Homeland Security Department chief privacy officer Nuala O'Connor Kelly released a joint statement promising that CAPPS II will not trample on individuals' right to privacy. The press conference panelists held differing views about a more effective, less invasive solution: Grover Norquist of Americans for Tax Reform suggested arming airline pilots, while Jim Dempsey of the Center for Democracy and Technology thought the TSA needs to improve its passenger watch lists.
http://www.infoworld.com/article/03/08/25/HNcoalition_1.html

An Aug. 25 decision by the California Supreme Court is considered a clear triumph for the DVD Copy Control Association (DVD CCA), because the ruling overturns an earlier court decision that allowed DVD decryption software to be published under the claim that such code was protected by the First Amendment. The court ruled that a ban on posting DeCSS software does not violate free speech rights, upholding the DVD CCA's argument that the posting infringed its trade secret rights. "Disclosure of this highly technical information adds nothing to the public debate over the use of encryption software or the DVD industry's efforts to limit unauthorized copying of movies on DVDs," the court declared. "We do not see how any speech addressing a matter of public concern is inextricably intertwined with and somehow necessitates disclosure of DVD CCA's trade secrets." The Motion Picture Association of America filed a lawsuit in 1999 against software developer Andrew Bunner for allegedly violating the Digital Millennium Copyright Act by publishing DeCSS online, while the DVD CCA cited trade secrets rights violations when it filed a separate suit against Bunner and others who posted the software. The California Supreme Court ordered the case sent back to the appeals court, where judges will more closely scrutinize the trade secrets issue. Bunner's legal counselors said they will demonstrate to the appeals court that widely disseminated online content such as DeCSS is not entitled to protection under trade secrets law. Some attorneys argued that the earlier court ruling allowing Bunner to post the DVD decryption software threatened to debilitate software companies' power to shield their intellectual property from infringement.
http://news.com.com/2100-1028_3-5067665.html

The extent of the damage caused by the SoBig.F computer worm was limited somewhat thanks to the efforts of security researchers such as F-Secure's Mikko Hypponen, who helped dissect the worm and warned authorities about network weaknesses that could aid its spread. Such experts notified the FBI about these vulnerabilities, and the bureau moved quickly to isolate them. Ilkka Starck of F-Secure's North American operations reports that speed is of the essence when it comes to heading off viruses, while cooperation and data sharing among virus researchers is also critical. F-Secure and other antivirus companies usually capture samples of malicious code through honeypots--online decoys set up as a lure. Based primarily in Finland, F-Secure sells antivirus software that features automatic updates whenever new viruses are identified, and Hypponen says his lab's objective is to issue patches to subscribers within two hours of receiving the first malware sample. He adds, "We believe we have a good chance of being successful in the U.S. because we make [patching] very convenient." Hypponen notes that antivirus experts on his team hail from diverse backgrounds, and that there is no specific academic focus on the field. Perhaps the most frustrating challenge for virus sleuths is tracking down virus authors, who often resort to crafty means to mask their identities and cover their tracks.
http://www.nytimes.com/2003/08/27/technology/27VIRU.html (Access to this site is free; however, first-time visitors must register.)

The rapid spread of computer viruses such as SoBig and Blaster in recent weeks sends a clear message that commercial software makers must design more secure products. Although the damage caused by such viruses has been minor so far, Watts Humphrey of Carnegie Mellon University's Software Engineering Institute theorizes that a bug could conceivably result in a loss of life, given the ubiquity of software in today's world. He says software makers "need to focus on the practices of the individual engineers, and by and large nobody does that." The software industry, which has long prioritized speed over quality, is notorious for churning out products and fixing glitches later. May 2002 estimates from the U.S. National Institute of Standards and Technology indicate that software makers and users pay between $22.2 billion and $59.5 billion a year to correct "inadequate" software. Microsoft is the most frequent target of virus authors: The Blaster worm exploits a hole in the Windows operating system so that it can infect machines through a feature whose original purpose is to facilitate communications between network-linked computers. Microsoft VP Mike Nash notes that many corporate customers have complained that the patch the company released to plug the hole is hard to implement. Microsoft executives report that CEO Bill Gates' 2002 mandate to build more "trustworthy" products has led to significant changes in software development practices, including more intensive security training and accountability among programmers, as well as more documentation. However, critics contend that programming culture, which values freedom, conflicts with such rigorous procedures, which makes changing programmers' attitudes and practices a formidable challenge.

Sandia National Laboratory cognitive psychologist Chris Forsythe is leading an effort to create "synthetic human" technology, which would allow computers to learn and store information about people in order to better interact with them. Forsythe began his research looking for a way to create psychological composites of foreign leaders or groups of people on a computer, in order to bolster national defense. Robotics scientists at Sandia, however, saw promise in Forsythe's work for creating computers that understand users and can interact with them as humans do. Forsythe says Microsoft's maligned Clippy application is exactly what he plans to avoid because it is one-size-fits-all; instead, he is devising an assisting interface that knows and responds to individuals. Humans build knowledge of one another with each interaction, and Forsythe wants to replicate some of that with episodic memories built behind his cognitive programs. Eventually, Forsythe expects that the technology will be so easy to apply that people can quickly create simulations of probable consequences, such as how a traffic accident up ahead will affect a driver's route. The trick is to get the computers away from rigid "if, then" logic and toward more nuanced and informed decision-making similar to humans'. "Humans are certainly capable of logical operations, but there is much more to human cognition," Forsythe explains. He predicts that cognitive technology will be embedded in all user interfaces within 10 years.
http://www.wired.com/news/technology/0,1282,60153,00.html

Although the outsourcing of high-tech jobs to cheaper overseas labor is attractive to U.S. businesses from a financial point of view, opponents argue that cost-benefit studies often overlook important variables. Marcus Courtney of the Communication Workers of America's WashTech affiliate contends that replacing domestic employees with offshore workers results in a drain of knowledge that can put companies at a disadvantage. Software development projects are often divided between groups, and this division causes discontinuity throughout development cycles, according to Courtney. He says that WashTech and other labor organizations are readying a multi-pronged strategy to fight the migration of jobs out of the United States, including educational campaigns that focus on the issues at hand. "One of the things we've found out is that members of Congress are not educated when it comes to the outsourcing of white-collar jobs," Courtney observes, noting that this discrepancy can be corrected by lobbying for new legislation; WashTech in particular is calling for a General Accounting Office-sponsored federal analysis on the overseas outsourcing of technical positions. Courtney also says that his organization is trying to close a loophole in the L-1 visa program that is being exploited to bring in foreign workers while putting their American counterparts out of work. Maria Grant of Deloitte and Touche is doubtful that outsourcing will be halted by labor groups' efforts, and she suggests that individual U.S. workers can improve their job security by becoming competent on multiple levels. Adding analysis and decision-support skills to their technical know-how is one way American professionals can keep their jobs, while IBM's Ray Schreyer notes that a combination of technical skill and leadership qualities can also boost high-tech employees' value.
http://www.newsfactor.com/perl/story/22151.html

A team of Purdue University engineers believes that the incorporation of an Internet-based monitoring system into the North American electric power grid could have forestalled the massive blackout that struck on Aug. 14. The system would gather data via fuzzy logic and neural networking to anticipate overloads in local substations, isolating disruptions before they could spread and lead to a cascading outage. Purdue team leader Lefteri Tsoukalas is a member of a research consortium that has devised Telos (transmission entities with learning capabilities and online self-healing), a system that tracks and forecasts electricity consumption and communicates with substations to arrest problems. Telos employs software agents that use fuzzy logic and neural networks to recall consumption patterns at specific locations in order to surmise subsequent upsurges in power demand, and then collaborate with substation software to balance out the consumption by identifying areas of less demand. The Telos software could initially be connected to power meters via local computer clusters for up to $100,000, while the system could eventually be meshed with small embedded modules with 8-bit microprocessors and Internet capabilities to establish communications with home-based power meters. Engineers note that meters, protective relays, and other digital gear could work with the Purdue team's "neurofuzzy" system by gauging, recalling, and training the software about currents, voltages, power, and frequencies at specific times. The system is being tested at Argonne National Laboratory, where engineers are teaching the software to recognize electrical consumption patterns, while Commonwealth Edison is also trying out the technology on a virtual "data town." The researchers believe setting up such a system is far cheaper than upgrading the national power line network, which could cost as much as $100 billion.
http://www.eetimes.com/sys/news/OEG20030825S0050
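The predict-then-rebalance loop described above can be sketched in miniature. This is an illustrative sketch only, not actual Telos code: the `SubstationAgent` class, its capacity figures, and the simple historical-average forecast are invented stand-ins for the system's fuzzy-logic and neural-network agents.

```python
class SubstationAgent:
    """Toy agent that recalls hourly consumption patterns and flags
    predicted overloads, in the spirit of Telos's learning agents."""

    def __init__(self, name, capacity_kw):
        self.name = name
        self.capacity_kw = capacity_kw
        self.history = {h: [] for h in range(24)}  # hour -> past loads (kW)

    def observe(self, hour, load_kw):
        self.history[hour].append(load_kw)

    def forecast(self, hour):
        # Stand-in for a trained neuro-fuzzy predictor: historical average.
        past = self.history[hour]
        return sum(past) / len(past) if past else 0.0

    def predicted_overload(self, hour):
        return max(0.0, self.forecast(hour) - self.capacity_kw)


def rebalance(agents, hour):
    """Shift each agent's predicted excess demand to agents with spare
    capacity, returning (from, to, kW) transfer plans."""
    transfers = []
    for a in agents:
        excess = a.predicted_overload(hour)
        if excess <= 0:
            continue
        for b in agents:
            spare = b.capacity_kw - b.forecast(hour)
            if b is not a and spare > 0:
                shifted = min(excess, spare)
                transfers.append((a.name, b.name, shifted))
                excess -= shifted
                if excess <= 0:
                    break
    return transfers
```

A real agent would replace the historical average with a trained predictor; the point here is only the shape of the forecast-and-balance cycle the article describes.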

Researchers at Lucent Technologies' Bell Labs facility are looking to the natural world to develop processes that would improve the fabrication and capabilities of their parent company's products. The latest example is the Venus' flower basket sponge, a deep-sea invertebrate with an exoskeleton that features a ring of glassy filaments that anchor it to the ocean floor. Dr. Joanna Aizenberg of Bell Labs, together with researchers from Tel Aviv University and Lucent's OFS spinoff, learned that the sponge glows because its filaments can transmit light from nearby luminescent creatures. In addition, the filaments bear a striking structural resemblance to the commercial, light-carrying optical fibers in telecommunications systems. Aizenberg thinks the glow mechanism is used to lure food, and although the sponge's fibers are not as transparent as commercial fibers, they are highly flexible and tough. The Bell Labs researchers hope to duplicate this flexibility, and possibly improve manufacturing processes by studying how the sponge fuses silica molecules into glass by manipulating proteins. Aizenberg's group discovered several years ago that the vision system of another aquatic creature, the brittlestar, features many minute crystal lenses that focus and direct light, which could conceivably be harnessed for optical computing or telecommunications. Bell Labs researchers claimed earlier this year that they had successfully imitated the brittlestar's facility in generating arbitrarily configured crystals.
http://www.nytimes.com/2003/08/26/science/26SPON.html (Access to this site is free; however, first-time visitors must register.)

Samuel Madden and Wei Hong of the University of California at Berkeley are working on TinyDB software used by a network of minuscule sensors or "motes." The researchers think wireless sensor networks have near-infinite applications, ranging from battlefield surveillance and soldier health monitoring to the anticipation of equipment malfunctions to assisted living to environmental and habitat surveillance. Hong reports that key to the success of network sensors is the development of an inexpensive, renewable power supply, while Madden explains that disconnection--a problem especially prevalent in remote environments--also needs to be resolved. Madden and Hong agree that their research is valuable to the nanotechnology sector, and Madden predicts that motes will have reached the nano-scale within a decade. Hong notes that sensor networks' usability will be dramatically broadened through the incorporation of database software, since there is a vast amount of physical-world data that can be collected by tiny sensors. Hong adds that wired technology will not be phased out by wireless technology, and suggests that wireless networks should only reside on the network edge; although he sees value in a global wireless database, he cautions that such a vision will remain elusive until privacy and security issues are addressed. Hong says the adoption of sensor systems will be hastened by the advent of do-it-yourself kits, and the researchers believe that such kits will eventually be sold commercially. Madden observes that home adoption will hinge on finding useful consumer applications for wireless sensor nets, one of them being electricity bill reduction. Hong concludes that "The database industry should focus on the integration of real-time data that will begin streaming in as we deploy more sensor networks across enterprises and natural habitats."
http://www.svbizink.com/otherfeatures/spotlight.asp?iid=314&naviid=
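The "database software on sensor networks" idea can be made concrete with the partial-aggregation trick that TinyDB-style systems use: rather than shipping every reading to the base station, each mote forwards a small partial state up a routing tree. The tree and readings below are invented for illustration; this is a sketch of the idea, not TinyDB's implementation.

```python
def aggregate_avg(node, readings, children):
    """Each mote merges its own reading with the partial (sum, count)
    states reported by its children, so only two numbers travel up the
    routing tree instead of every raw sample."""
    s, c = readings[node], 1
    for child in children.get(node, []):
        cs, cc = aggregate_avg(child, readings, children)
        s, c = s + cs, c + cc
    return s, c


# A three-mote routing tree: mote 0 is the root linked to the base station.
children = {0: [1, 2]}
readings = {0: 20.0, 1: 22.0, 2: 24.0}  # e.g., temperature samples
s, c = aggregate_avg(0, readings, children)
print(s / c)  # network-wide average temperature -> 22.0
```

In TinyDB itself the user would express this declaratively (an SQL-like query over a virtual "sensors" table) and the system would plan the in-network aggregation automatically; the sketch shows only the aggregation step.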

Johns Hopkins computer security expert Avi Rubin ignited a firestorm of publicity regarding electronic voting systems when he released the findings of his voting machine security study: Although prominent computer scientists such as Stanford University's David Dill protested paperless systems in principle, Rubin's actual study of Diebold Election Systems' source code alerted officials to the serious risk of using the machines. Maryland has since halted plans to buy $56 million worth of Diebold machines as part of a congressionally funded program to upgrade the nation's election infrastructure. After the 2000 national election debacle, Congress allotted hundreds of millions of dollars for new electronic voting equipment--but computer experts skeptical of vendors' security claims warned no computer system should be completely trusted. Dill left his Stanford position to organize a reform movement demanding provisions such as a verifiable paper trail. Still, opponents had no proof of security flaws since the source code for the systems was kept secret. This summer, however, the source code for one of Diebold's machines was accidentally published on a New Zealand-based Web site; an activist alerted Dill, who passed the code on to Rubin, technical director for Johns Hopkins' Information Security Institute. Rubin says the source code had glaring problems and that he decided to notify the media in order to get news to officials as soon as possible. Many state and local authorities are already using electronic systems such as the one studied, and many more plan to use them in the 2004 elections. Rubin's work has prompted a bipartisan group of federal lawmakers to seek changes to the Help America Vote Act out of concern that Congress acted hastily and must consider the security implications more seriously. Computer security experts generally agree that the software code for electronic voting systems should be made public to encourage better testing and reassure users.
http://www.sunspot.net/features/lifestyle/bal-to.vote25aug25.story

New IEEE 802.20 technology promises to replace carriers' still-nascent 3G rollouts, though groups with heavy investments in 3G cellular technology are likely to resist a quick adoption. The new standard is still being developed and leadership of the IEEE working group is undefined, but 802.20, dubbed mobile broadband wireless access (MBWA), touts broadband speeds over cellular coverage areas. Already, Flarion Technologies is hawking an 802.20 variant based on orthogonal frequency division multiplexing (OFDM), which Flarion's Ronny Haraldsvik describes as LAN technology extended to the wide-area network. However, International Data Corp. (IDC) wireless analyst Keith Waryas says it is not clear how 802.20 will impact the cellular space because it is a packet-based system and not compatible with existing cellular services. Carriers and equipment makers have spent billions in licensing and physical infrastructure to realize 3G networks, and rolling out a new infrastructure with new authentication and billing issues is not an easy task. Haraldsvik, however, says the company has already drawn interest from Nextel and Korean and Japanese carriers, and that opposition to 802.20's quick deployment comes from original equipment manufacturers that do not want to lose their entrenched industry positions. Haraldsvik says it is still unclear what constitutes 4G, but that 802.20 does allow the end-to-end packet switching envisioned by major industry players and customers.
http://www.newsfactor.com/perl/story/22160.html

Even if SCO's multibillion dollar lawsuit against IBM for allegedly shunting copyrighted Unix code into Linux is dismissed, vendors and open-source advocates are worried that the action will breed enough fear and uncertainty within the marketplace to jeopardize Linux adoption, especially with SCO threatening to pursue litigation against users. Open Source Initiative general counsel Lawrence Rosen contends that, should SCO win, no penalties will likely be levied against users: First of all, SCO would lose its status as a damaged party if IBM pays the damages, and thus be unable to demand money from anyone else; and second, there is little chance that someone can be sued for infringement merely for using illegal products while unaware of their illegality. Meanwhile, critics argue that SCO's case is weak, particularly because the software company has been less than forthright about the so-called infringing code. SCO seems determined to keep its code under a veil of secrecy, and is only allowing people to view it on the condition that they sign a nondisclosure agreement. SCO opponents are fueled by a Linux Journal article from software developer Ian Lance Taylor, who signed the agreement, viewed the code, and reported that, although there were similarities, he was not made privy to a revision history to establish the code's point of origin. He also noted that the code was not a core part of the Linux kernel, and characterized it as "fairly trivial." On the other hand, Laura DiDio of the Yankee Group also saw the code and felt that SCO could have a valid claim, even though there was no empirical evidence; she added that her suspicions were aroused by the fact that Linux vendors such as IBM are not offering indemnification to their Linux customers. Technology analyst Gordon Haff counters that it would be fiscally careless to offer possibly ongoing indemnification.
http://www.salon.com/tech/feature/2003/08/18/sco_ibm/

The Navigational Assistance for the Visually Impaired (NAVI) system developed by University of Rochester researchers uses radio and passive transponders to help guide blind people, though its potential applications could extend beyond this primary capability. NAVI features a detector that beams a radio signal to the transponders, which are attached to stationary objects such as buildings; the detection of these transponders triggers an audio response burned onto a CD in a portable player that gives the user navigational aid. In addition to helping the visually impaired, the device can be used to enhance museum tours or walkthroughs of important places. "This is a wonderful example of our students taking theory from the classroom, knowledge of some of the difficulties faced by some groups of people, and combining that with existing devices to transform it into a real-world application that is of genuine usefulness to people," declares NAVI project leader Jack Mottley, associate professor of electrical and computer engineering and biomedical engineering at the University of Rochester. The CD player would allow users to change informational CDs as they move to different locations, while future versions of the device could store data in solid-state memory that can be automatically revised upon entering a new locale, or let persons set up their own tags and record relevant data as they desire. The passive tags do not need to be plugged in and require no batteries, and deploying them is a relatively cheap proposition, says Mottley. The development team behind NAVI is applying for a patent on the device, in the hopes of partnering with a manufacturer to maximize the system's user-friendliness.
http://www.eurekalert.org/pub_releases/2003-08/uor-nnt082603.php
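The core NAVI interaction described above (a detected transponder triggering a prerecorded navigational message) reduces to a lookup keyed by tag ID. The sketch below uses invented tag IDs and message text; the real system plays audio tracks from a CD rather than returning strings.

```python
# Hypothetical tag-ID-to-message table; in NAVI each entry would map to an
# audio track on the user's CD for the current locale.
AUDIO_TRACKS = {
    0x01: "Building entrance ahead on the right.",
    0x02: "Crosswalk: press the button and wait for the signal.",
}


def on_tag_detected(tag_id):
    """Return the navigation message for a detected transponder ID,
    with a fallback for tags not on the current informational CD."""
    return AUDIO_TRACKS.get(tag_id, "Unknown tag: no guidance available.")


print(on_tag_detected(0x02))  # -> the crosswalk message
```

Swapping informational CDs, as the article describes, amounts to swapping this table; the solid-state version would update the table automatically on entering a new locale.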

RSA Security researchers claim they can allay consumers' and enterprises' privacy concerns about the use of radio frequency identification (RFID) tags with an inexpensive solution designed to convince RFID readers that all possible tags are present at any given time. The privacy issue stems from the fact that any RFID reader can read the numbers on any tag, regardless of where the tagged items are acquired. With such tags embedded in virtually anything, a department store reader could conceivably record the items in a shopper's cart as well as the amount of money in the shopper's wallet, the shopper's credit cards, etc. RSA Labs' proposed solution would allow enterprises and customers to easily determine what tags readers should be allowed to read, as well as when they can read them. The technology employs blocker tags to simulate all possible tag serial numbers, thus rendering the reader incapable of uncovering the presence of specific tags. "The conceptual basis is reasonably simple, and the blocker tags should cost no more than twice what normal tags cost," notes RSA Labs principal research scientist Ari Juels. RFID tags currently cost about five cents apiece. A paper detailing the blocker tag method, written by Juels, RSA co-founder Ron Rivest, and colleague Michael Szydlo, will be presented at the Association for Computing Machinery's Conference on Computer and Communications Security in October. Large euro notes are expected to incorporate RFID tags in a few years.
http://www.eweek.com/article2/0,3959,1229497,00.asp
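The blocker-tag trick can be illustrated against a simplified tree-walking reader. The assumptions here are 3-bit serial numbers and a toy singulation protocol; the real RFID air interface differs in detail, but the effect is the same: the blocker answers on every branch of the serial-number tree, so the reader cannot tell which tags are actually present.

```python
def tree_walk(prefix, bits, respond, found):
    """The reader descends the binary tree of serial numbers, recursing
    into any branch where at least one tag answers; each full-length
    prefix it reaches is a tag it believes is present."""
    if len(prefix) == bits:
        found.append(prefix)
        return
    for bit in ("0", "1"):
        if respond(prefix + bit):
            tree_walk(prefix + bit, bits, respond, found)


tags = {"101"}  # the one real tag in range (invented serial number)


def honest(prefix):
    # Normal tags answer only when their serial starts with the prefix.
    return any(t.startswith(prefix) for t in tags)


def with_blocker(prefix):
    # The blocker tag answers on every branch, simulating all serials.
    return True


found = []
tree_walk("", 3, honest, found)
print(found)         # -> ['101']: the reader singles out the real tag

blocked = []
tree_walk("", 3, with_blocker, blocked)
print(len(blocked))  # -> 8: the reader sees every possible serial
```

A "polite" blocker, as the RSA proposal suggests, would answer only within a designated privacy zone of the serial-number space, leaving other tags readable.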

The National Institute of Standards and Technology (NIST) recently awarded funding to three advanced IT projects out of a total of 16 qualifying for the Advanced Technology Program. One recipient is InRAD of Knoxville, Tenn., which is developing the Automated Knowledge Discovery System with Sarnoff and Knowledge Based Systems. The system would work based on users' incorporation of an ontology of the research topics and a technology road map that outlines organizational objectives and needs; intelligent agents would use the ontology to carry out searches, with the resulting data organized according to labels assigned by the agents via the technology road map. Meanwhile, Bit 9's Computer Immune System is a security tool designed to shield computers and networks from unfamiliar attacks. Bit 9 President Todd Brennan says the system does not heuristically monitor code behavior, and does not identify patterns in the same way anti-virus software does with established threats; he adds that the system's modus operandi cannot be detailed until Bit 9 receives certain patents, though he says stability, upgradeability, and scalability must be added to the program in order to maximize its effectiveness. The third IT project NIST is funding is Rosetta-Wireless' Wireless Intelligent Personal Server, which would allow mobile workers to access large files and complex data. The server is designed to have a maximum coverage area of 50 feet, and to encrypt and password-protect information as it is being transmitted and stored. The wallet-sized device would enable workers to automatically link to an office network for email and other files, permitting notebooks or other mobile devices to access the server data.

Colleges and universities are changing their computer science and IT programs in order to graduate workers who better fit the needs of industry. During the dot-com boom, employers often signed students before their graduation, but today many demand real-world experience. Deutsche Asset Management Technology programs manager George Voutes says U.S. academia is only doing a middling job in teaching the right skills to students. According to a Computerworld survey of 244 IT professionals, 75 percent believe today's computer science education is not preparing students well enough for the job market. CIOs are working with schools to tailor courses and introduce new ideas directly to students and their professors; among the skills in demand are business acumen, industry-specific knowledge, interpersonal communication skills, and project management. Computerworld columnist and management consultant Thornton A. May says schools are slow to grasp business-world needs because they see industry "as a funding source, not as a place of insight." Partly in response to CIO input, Ohio State University (OSU) computer and information science chair (and former ACM president) Stu Zweben says his group now teaches methodology rather than specific programming languages; OSU is also sending more students to internships and co-op programs where they can gain work experience before hitting the job market. At MIT's Center for eBusiness, teams of students work on projects proposed by CIOs from sponsor companies, such as customer relationship management or business Wi-Fi implementations. The Stevens Institute of Technology in New Jersey is teaching students how to work in an outsourced infrastructure environment and provides special graduate IT tracks for the pharmaceutical and financial services industries.

Peter G. Neumann of SRI International's Computer Science Laboratory writes that the poor state of computer security in the United States is exacerbated by growing reliance on computers, the proliferation of the Internet, the buildup of popular applications that are interrelated, and a general ignorance of security issues and the many factors that can compromise system reliability. Solutions to many of these problems are already available or can be developed relatively quickly, but they are not being widely adopted by the mass market, which has made boosting speed and capacity and other bells and whistles a priority. System development problems that existed four decades ago are still prevalent, yet Neumann argues that many could be solved through regular deployment of hardware safeguards, robust programming languages, carefully crafted system interfaces, and dependable software engineering disciplines. Furthermore, the technology's requirements should be clearly defined at the outset, but Neumann notes that the curricula of most U.S. schools suffer from a noticeable lack of emphasis on systematic system development. The author extols "open-box software" that allows source code to be examined, tweaked, and used in different capacities, and remarks that a community of open-box software developers can significantly augment compatibility, evolvability, security, and reliability. Neumann thinks the development of autonomous systems capable of self-repair and self-maintenance will be critical, but notes that the concept has generated little interest from the commercial sector. It is his opinion that a multidimensional strategy is the only way to solve security problems. 
His suggestion is to raise security and privacy awareness among researchers, development managers, system designers and programmers, system administrators, procurement officers, system evaluators, corporate leaders, government funding agencies, legislators, law enforcement communities, and users, as well as to improve undergraduate and graduate education.
http://www.nap.edu/issues/19.4/neumann.html

Programmers are employing genetic algorithms that follow a Darwinian pattern so that computer programs can evolve to carry out a specific operation or action by passing on their most advantageous traits to subsequent generations. Oxford researcher Torsten Reil has developed a program in which a digital character learns how to walk with the help of a genetic algorithm. The first step is the generation of a stick figure that is assigned gravity, joints, muscles, and a neural network; the algorithm produces 100 animated stick figures with randomly arranged neural networks. The figures then attempt to walk across the computer screen, and the neural networks of the most successful characters are replicated 20 times and tweaked with slight mutations. These individuals, along with 80 new stick figures with randomly assembled neural networks, constitute the second generation; subsequent generations follow the same evolutionary path until a figure that can march across the screen with perfect stride is bred. Reil's program and other genetic algorithms are key components of animation software used to produce more realistic digital characters in films and video games, but Bill Gross of Idealab sees even more advanced applications for the technology in the field of engineering. He thinks that the entire design process could be handed over to such algorithms. "I think this is the way engineering should be done: Instead of defining your part or your circuit board, define your objective and let the software evolve the answer," Gross explains.
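The generation scheme described above (100 candidates, the best one copied 20 times with small mutations, plus 80 fresh random candidates) can be sketched directly. The stick-figure physics is not reproduced here: a toy fitness function that peaks when every gene equals 0.5 stands in for "distance walked," and the genome size and mutation rate are invented for the example.

```python
import random

random.seed(1)  # make the toy run repeatable
GENES = 8       # stand-in for neural-network weights


def fitness(genome):
    # Stand-in for "distance walked": peaks when every gene is 0.5.
    return -sum((g - 0.5) ** 2 for g in genome)


def random_genome():
    return [random.random() for _ in range(GENES)]


def mutate(genome, sigma=0.05):
    # Slight random tweaks, mirroring the mutations applied to the
    # copies of the most successful stick figure.
    return [g + random.gauss(0, sigma) for g in genome]


population = [random_genome() for _ in range(100)]
for _ in range(50):  # breed 50 generations
    best = max(population, key=fitness)
    population = [mutate(best) for _ in range(20)] + \
                 [random_genome() for _ in range(80)]

champion = max(population, key=fitness)
```

After a few dozen generations the champion's genes cluster near the optimum, just as the best-walking stick figure's neural network gradually dominates the breeding pool.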

Technology for rendering people invisible is within the realm of physical possibility, though the power and hardware requirements will be formidable, writes aerospace engineer Wil McCarthy. For an invisibility cloak to be truly effective, it will need to image the scene behind it from all angles simultaneously; McCarthy envisions a system that employs at least six pairs of stereoscopic video cameras and a computer that can model the surrounding environment in three dimensions and synthesize the scene from all perspectives. Displaying the imagery would require the cloak to be covered by 180 x 180-pixel videoscreens behind hemispherical lenses. These so-called hyperpixels would transmit custom-colored light beams to every degree of arc and allow for a maximum of 32,400 different viewing angles, which would be coordinated and distributed by image-warping software. Some of these capabilities could be furnished by commercially available technologies such as small color cameras, 16-bit displays, and light-emitting diodes (LEDs); much more difficult would be getting the cloak to work in all lighting conditions, which would require the display to replicate all brightness levels. In addition, the cloak display must be able to update itself faster than the human eye's ability to register flickering, a property that could be supplied by a grid of ultra-bright LED microarrays. McCarthy estimates that the cloak's CPU would consume about 8 to 10 kilowatts and would have to run at 10 billion GHz, while image capture, stereo vision, 3D scene manipulation, image warping, and deformation correction would likely boost the computing power requirement 100 percent. There is no current technology for eliminating the heat signature of the wearer, while the cloak itself would undoubtedly add to the heat output.
http://www.wired.com/wired/archive/11.08/pwr_invisible.html