Although negotiators announced on Monday that film studios, technology firms, and consumer-electronics companies were close to a deal that could speed up the adoption of digital television, the coalition fell apart yesterday when the participants were unable to reach a consensus on how to protect digital content on the Internet. The Broadcast Protection Discussion Group, comprising representatives of leading high-tech companies, major media firms, and electronics manufacturers, was formed in November to develop a voluntary standard for anti-copying technology that blocks the online trading of digital content. On Monday, negotiators said the group was ready to release a report stating that most industry players agree that digital TVs, recordable DVDs, and other equipment should be able to identify "broadcast flags" that enable consumers to make personal copies but prevent Internet distribution. Such a standard might then be written into law by Congress. But no consensus was reached, as electronics firms as well as Microsoft complained that Hollywood studios wanted too much control over which technologies to use. Royal Philips Electronics President Tom Patton said, "There is no consensus embodied in that report. None." Microsoft's Andy Moss said the film studios were proposing subjective criteria for determining the copy-protection system. Microsoft, Philips, and others are developing their own copy-protection systems. Computer firms have said that building copyright controls into their products would ultimately be futile, while the Electronic Frontier Foundation, another participant in the discussion group, is concerned about the loss of fair-use rights for consumers. Several group participants are now calling for government involvement, and the House Energy and Commerce Committee plans to hold hearings on digital TV next week.
http://www.nytimes.com/2002/06/05/technology/05DIGI.html (Access to this site is free; however, first-time visitors must register.)

A soon-to-be-released white paper about the inherent security risks of open-source software--and the authors' reluctance to reveal the identity of the report's sponsor--has aroused suspicion among open-source proponents that Microsoft may be behind it. A Microsoft spokesman acknowledged in an email that the company contributes funding to the Alexis de Tocqueville Institution, the organization that produced the report. However, institute president Ken Brown and chairman Gregory Fossedal declined to say whether Microsoft sponsored the white paper, prompting experts such as security researcher Richard Smith to wonder "if they have something to hide." The white paper warns governments that open-source software is less secure than proprietary software, but most security experts think that neither type of software is inherently more secure than the other. Bill Wall and Darwin Ammala of Harris Corporation's STAT computer security unit note that proprietary software cannot be studied by hackers because its underlying code is concealed, but they also argue that while open-source code can easily be viewed by hackers, its security is boosted by its exposure to a large audience of software developers. The security issue cannot really be settled until it has been thoroughly studied, according to security specialists, and Wall suggests such a task should be carried out within the software engineering research community by agencies such as the Defense Advanced Research Projects Agency or the Software Engineering Institute. Open-source advocates hint that Microsoft's possible involvement in the report is a response to the growing government and military interest in open-source systems.
http://www.wired.com/news/linux/0,1411,52973,00.html

The National Conference of Commissioners on Uniform State Laws is trying to revise its unpopular legislation governing software licenses. The group's proposal, the Uniform Computer Information Transaction Act (UCITA), has only been adopted so far in Maryland and Virginia, and failed last year in all the state legislatures where it was considered. Critics of UCITA say it gives too much power to vendors and strips end users of many of their rights. The new revisions are intended to assuage some of the opposition by taking away software companies' ability to remotely disable their products in the event of a licensing dispute. Reverse engineering techniques for purposes of making software interoperable are also sanctioned under the new UCITA bill. Some major opponents, including the American Library Association and the Society for Information Management, remain skeptical of UCITA, even after other changes were made earlier this year. The latest revisions are in response to criticisms by a committee tasked to examine UCITA by the American Bar Association.
Click Here to View Full Article

Simile.D, a new cross-platform virus, has been released that is more difficult than most to detect, according to antivirus companies, who say the code could force antivirus developers to rethink their strategies. Simile.D, the fourth variation of the bug, changes technical parameters such as its file size in order to conceal itself from detection. Antivirus companies have already devised multiple-approach methods that can find Simile.D and have incorporated them into their latest updates, but Network Associates researcher Jimmy Kuo says the virus' innovation has escalated the arms race between virus writers and security researchers. Although more standard ways of detecting the virus would probably work, they would bog down computer systems with extensive searching. Simile.D, also known as Etap.D, can also cross from Windows to Linux and back, though only Linux systems running in "superuser" mode are fully infected. The virus carries a relatively innocuous payload--it is scripted to display pop-up dialogue boxes during certain months--showing it is a proof-of-concept virus meant to demonstrate technical feasibility.
http://zdnet.com.com/2100-1105-932447.html

MIT physicist Seth Lloyd reckons that every event that has taken place in the universe since its birth can be reduced to a form of computation; he has essentially modeled the universe after a computer. The link between physics and information science becomes apparent when dealing with quantum-scale events--shifts between the quantum states of particles can be likened to switching a computer bit between 1 and 0. "If one regards the universe as performing a computation, most of the elementary operations in that computation consist of protons, neutrons, electrons and photons moving from place to place and interacting with each other according to the basic laws of physics," Lloyd explains. He has estimated that simulating the universe in its entirety would take a computer that possesses 10^90 bits and is capable of manipulating those bits 10^120 times. However, the universe probably contains only about 10^80 fundamental particles. Lloyd arrived at his figures by calculating the total energy in the universe using Einstein's E=mc^2.
http://www.nature.com/nsu/020527/020527-16.html
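Lloyd's operations count can be sanity-checked with a rough back-of-the-envelope calculation. The sketch below is not from the article: it assumes a mass of roughly 10^53 kg for the observable universe, an age of about 13.7 billion years, and the Margolus-Levitin quantum speed limit of 2E/(pi*hbar) operations per second; with those assumptions it lands within an order of magnitude of the 10^120 figure.

```python
# Back-of-the-envelope check of Lloyd's 10^120 operations figure.
# Assumed inputs (not from the article): mass of the observable
# universe ~1e53 kg, age ~13.7 billion years, and the
# Margolus-Levitin bound of 2E/(pi*hbar) operations per second.
import math

HBAR = 1.055e-34                    # reduced Planck constant, J*s
C = 3.0e8                           # speed of light, m/s
MASS = 1e53                         # assumed mass of observable universe, kg
AGE = 13.7e9 * 365.25 * 24 * 3600   # assumed age of universe, s

energy = MASS * C ** 2                       # E = mc^2, in joules
ops_per_sec = 2 * energy / (math.pi * HBAR)  # Margolus-Levitin limit
total_ops = ops_per_sec * AGE                # elementary operations to date

print(f"total elementary operations ~ 10^{math.log10(total_ops):.0f}")
```

The result depends on the assumed mass and age, so agreement to within an order of magnitude is the most this sketch can claim.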

MIT student Andrew Huang reports in a recently published research paper that he has successfully breached the security system of Microsoft's Xbox game console and extracted software keys that would allow the console to run unauthorized applications. He performed this operation by soldering a customized circuit board between two critical Xbox components; this enabled him to retrieve the keys by intercepting the traffic between the components. The job took three weeks and cost Huang only $50. He says the Xbox security codes are vulnerable because they are not encrypted: Microsoft opted for a third-party hardware design that depends on HyperTransport connections and dummy chips, among other measures. Huang has refused to reveal the key or secret code to others since that would constitute a copyright violation, although he hopes to perform an Xbox break-in whose results can be legally shared. Microsoft has stated that it supports Huang's efforts, which do not represent a threat to partners or customers.
http://news.com.com/2100-1040-931296.html

There will likely be an increase in online congestion and Internet access slowdowns as more people acquire wireless devices and crowd network frequencies. Wireless is becoming more and more popular for companies, universities, and households because of its convenience, its falling costs, and its easy installation. Cahners In-Stat estimates that Wi-Fi sales revenues alone will surge from $2.4 billion to $5.2 billion between 2002 and 2003. Competing wireless protocols such as Bluetooth operate in the same 2.4 to 2.4835 GHz frequency range as Wi-Fi; conflicts can also arise from technologies that use the 902 to 928 MHz range. These frequencies are not being coordinated because the government allows them to be used free of licensing. Author Matthew Gast says interference can be remedied through the coordination and adjustment of settings, while some companies are getting a jump on wireless propagation--for example, Sirius Satellite Radio and XM Satellite Radio have requested that the government institute a requirement that wireless devices reduce their signal strength to forestall conflicts. Other solutions being looked into include technical and regulatory adjustments between Wi-Fi and Bluetooth, devices that can scan airwaves for crowding, the exploitation of newly available frequencies, and automatic signal intensity adjustment.
Click Here to View Full Article

NASA researchers plan to collaborate with each other using new technology when the agency sends a pair of explorer robots to Mars next year, according to an announcement on Monday. They will use IBM's BlueBoard, a plasma screen with touchpad capabilities, enabling scientists to examine and share data and images sent by the robots. In this way, they will coordinate the machines' mission on a day-to-day basis, says Jay Trimble of NASA's Ames Research Center. The boards, which will be tested this summer, will be equipped with a Web browser and a workspace to facilitate data and file sharing. "The key is a group can share and work around a 50-inch plasma display much more easily than they can a 14-inch laptop display," notes Trimble. NASA has a lot riding on the 2003 Mars mission, since previous missions ended in failure: One probe was lost due to a metric miscalculation, while a second probe vanished en route to the planet.
Click Here to View Full Article

Intel's new Itanium 64-bit processor, based on Explicitly Parallel Instruction Computing (EPIC) design, will drive computer developments over the next decade just as today's computer technology is running out of gas, says University of Waterloo computer science professor Peter Buhr. Itanium is getting its start with such high-end computing tasks as digital video editing, but experts predict that the 64-bit computing design will eventually take hold in everyday computing--on PCs and game consoles, for example. The Itanium, co-developed with Hewlett-Packard, makes much more memory instantly available to the processor than the older 32-bit design. It is likely to transform computing just as Intel's bread-and-butter chip design did, but from the top down, instead of from the bottom up, as happened with the PC revolution. IBM's Jay Bretzmann says a question remains as to how quickly the Itanium will be adopted in its target market, the corporate server room. Much of that will depend on the software available to users, because although the Itanium can run software designed for 32-bit systems, corporate buyers are often wary of new technology, even when it means efficiency gains. Meanwhile, Advanced Micro Devices is developing Hammer chips, a line of 64-bit processors that company representatives claim are more compatible with existing software, because their operations are similar to current 32-bit chips.
Click Here to View Full Article

Internet pioneer and WorldCom VP Vint Cerf says the U.S. government needs to be a more active participant in the effort to extend broadband's reach to the masses, which is currently blocked by existing and proposed telecom regulations. He believes the desire for high-speed Internet access could be rekindled if the government institutes a policy that requires phone and cable companies to open up their networks to ISPs with a fair pricing scheme. However, cable and local-phone companies complain that they will never make back the money they invest in new networks and technology if the open-access requirement is passed. Cable companies say they have started sharing their networks with rival ISPs in certain markets and advocate voluntary open access. "Our question is what problem needs fixing that the marketplace itself isn't fixing?" asks Marc Smith of the National Cable and Telecommunications Association. Meanwhile, local-phone companies are trying to get the FCC, Congress, and state governments to eliminate regulations that mandate the sharing of updated DSL networks with competitors. Cerf cites the haziness of the digital copyright protection issue as another factor inhibiting the proliferation of broadband, and suggests that a clear definition of fair use must be communicated. He also thinks broadband growth could be triggered by the implementation of symmetrical Internet services that give consumers equal uploading and downloading capacities.
Click Here to View Full Article (Access to this site is free; however, first-time visitors must register.)

An initiative by University of Buffalo computer scientists has yielded objective criteria for handwriting analysis, which could impact court cases that rely on handwritten documents as evidence. Sponsored by a grant from the National Institute of Justice, the UB researchers have designed a software system that can establish a writer's identity with 96 percent confidence, using 1,500 handwriting samples as the template. The results of such experiments clearly demonstrate that handwriting is unique to each person, according to Sargur Srihari, director of UB's Center of Excellence in Document Analysis and Recognition (CEDAR), the largest handwriting analysis research center in the world. At CEDAR, artificial intelligence researchers enhance handwriting analysis with pattern-recognition methods. Srihari says the system his team devised analyzes handwriting samples by deconstructing them, rather than reading them visually as a human expert would. The system can break down the writing's overall structure into 11 features, such as document layout and line spacing, while extracting 512 features from individual characters--stroke marks and character openings and closures, for instance. The UB research had a bearing on an April 29 decision of the U.S. District Court for the Eastern District of Pennsylvania, in the case of U.S. v. Gricco. The research is detailed in a paper that will be published in the July issue of the Journal of Forensic Sciences.
http://unisci.com/stories/20022/0530026.htm
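The article describes writer identification as matching numeric feature vectors (document-level layout features plus per-character features) rather than visually reading the samples. The sketch below illustrates that general idea only; it is not CEDAR's actual system, and the feature vectors, writer names, and nearest-neighbor matching rule are all hypothetical.

```python
# Illustrative sketch of feature-vector writer identification
# (NOT the CEDAR system): each handwriting sample is reduced to a
# numeric feature vector, and a questioned sample is attributed to
# the writer whose reference vector lies closest to it.
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(sample, reference_samples):
    """Return the known writer whose reference vector is nearest."""
    return min(reference_samples,
               key=lambda writer: distance(sample, reference_samples[writer]))

refs = {  # hypothetical reference feature vectors per known writer
    "writer_a": [0.8, 0.1, 0.5],
    "writer_b": [0.2, 0.9, 0.3],
}
print(identify([0.75, 0.15, 0.45], refs))  # closest to writer_a
```

A real system would use far larger vectors (the article mentions 11 document-level and 512 character-level features) and a statistical confidence measure rather than a bare nearest-neighbor match.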

University of California, Santa Barbara associate professor Tao Yang is helping three graduate students develop software that seamlessly connects computer clusters to the end user, using Yang's Teoma search engine as the primary development tool. Teoma was originally developed to rival top search engine Google, whose popularity is due to its ability to rank keyword-derived pages by relevancy. A keyword or phrase search on Teoma generates three distinct lists: Yang says the first list is similar to Google's results, based on relevancy; a second list displays refined topic groups that can narrow the search; and the third lists "expert" links related to the original search. Teoma can currently scan 200 million Web sites, while Google can reportedly handle 2 billion, and reviewers are waiting to see how the former expands its database and enhances its capabilities. Yang's team's service clustering software initiative, Neptune, aims to apply real-world principles to the project goal of developing software that allows computers to be added or removed from the cluster while maintaining reliability at a low cost. Grad student Kai Shen says the Neptune team is working "to merge a group of machines and make them work in sync."
http://www.newspress.com/business/060202teoma.htm (Access for paying subscribers only.)

Sally Ride, the first American woman in space, has founded a club that aims to help girls in elementary and middle school maintain their passion for science, math, and technology in the hopes that they might one day become scientists and engineers. Studies show that most girls lose interest in math and science in middle school, either out of frustration or simply because they do not consider the subjects to be fun. Ride notes that girls may not receive the same kinds of support from parents, teachers, and peers to pursue math and science as boys do. The club is arranged as a forum where girls can bond with one another through their scientific interests, communicate with professionals and role models, and take part in science-oriented exercises. The club is a key component of Imaginary Lines, a for-profit company that hosts national community science festivals for elementary- and middle-school girls. The company has garnered nearly $1 million in private investments and partnered with Honeywell, IBM, Hewlett-Packard, and International Rectifier to sponsor science-centered events. The Sally Ride Science Club currently has 1,000 members; the benefits of membership include monthly newsletters, Web site access, and email updates about upcoming events.
http://www.eschoolnews.com/news/showStory.cfm?ArticleID=3735

In Mali, delegates from approximately 40 African countries joined media groups and international organizations Tuesday for a conference on the continent's role in the new "information society." Attendees discussed ways in which African nations could bridge the digital divide between their continent and the developed world. The United Nations' Economic Commission for Africa says there is about one Internet user for every 250 people in Africa--4 million in total--most of whom are in South Africa. This compares with a worldwide average of one Internet user for every 35 people. The Mali conference was designed in anticipation of the upcoming World Summit on the Information Society meeting, scheduled to take place in Switzerland at the end of next year under the aegis of the International Telecommunications Union and the United Nations.

An ICANN committee's proposal for reforming the organization's internal structure calls for separating ICANN functions into three departments: technical issues, policy-making, and operational structures. The proposal calls for a study group to be created in order to determine how to delineate ICANN responsibilities to fit into these three categories as well as offer recommendations about how ICANN should function. Outsourcing some tasks while retaining authority over them is a possibility. Under the proposal, ICANN would have supporting organizations for various stakeholders: the address supporting organization (ASO); the generic name supporting organization (GNSO); the protocol supporting organization (PSO); the country-code names supporting organization (CNSO); the government advisory committee (GAC), which would be made up of world government representatives; the root server system advisory committee (RSSAC); and the security advisory committee (SAC). The GAC would have a board seat as well as liaison officers at other committees. This ICANN proposal would create more barriers between ICANN decision-makers and the public, according to ICANN board member Karl Auerbach. The ombudsman who would handle ICANN complaints under the proposal would be appointed by the ICANN board and not be required to make any information public, and while the proposal calls for an independent arbitration process to rule on ICANN violations alleged by others, all arbitration decisions would be non-binding for the ICANN board. The proposal "is an expansion of the ICANN staff to become even more of an empire than it is today...with complete disdain of the public interest," says Auerbach.
http://www.internetnews.com/bus-news/article.php/1182181

The spread of Web applications has spurred enterprises to improve performance for remote users, and edge computing is one such initiative. Content delivery networks (CDNs) and application infrastructure suppliers are teaming up to provide offerings that will enable companies to move "cacheable" application elements out to the network edge in an effort to smooth out the development process. For example, Microsoft and IBM have partnered with companies such as Akamai and Exodus so that hundreds of points of presence (POPs) around the world can take advantage of .Net and Java 2 Enterprise Edition (J2EE) edge servers and infrastructure software. It is a sensible maneuver to move selected application logic toward the edge as Web services migrate outside the firewall. At first, edge computing would be used for presentation layer customization or content transformation, but more back-end business logic and data replication capabilities should be added later on. There are still issues of application manageability and security to be resolved, however. Gartner reports that 50 percent of large companies with clustered, computer-intensive applications will use edge computing in three years.
http://www.infoworld.com/articles/fe/xml/02/05/27/020527feedgetci.xml
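The caching split described above--presentation-layer output served from caches at edge points of presence, back-end business logic forwarded to the origin--can be sketched as follows. All class, function, and path names here are illustrative, not any CDN vendor's actual API.

```python
# Minimal sketch of the edge-computing pattern described in the
# article: "cacheable" presentation-layer fragments are served from
# an edge node's local cache with a freshness TTL, while requests
# that need back-end business logic always fall through to the origin.
import time

class EdgeNode:
    def __init__(self, origin, ttl=60):
        self.origin = origin   # callable: path -> (body, cacheable flag)
        self.ttl = ttl         # seconds a cached fragment stays fresh
        self.cache = {}        # path -> (body, expiry timestamp)

    def handle(self, path):
        entry = self.cache.get(path)
        if entry and entry[1] > time.time():
            return entry[0]    # served from the edge, no origin round-trip
        body, cacheable = self.origin(path)
        if cacheable:          # only presentation-layer output is cached
            self.cache[path] = (body, time.time() + self.ttl)
        return body

calls = []                     # track origin round-trips for the demo
def origin(path):
    calls.append(path)
    if path.startswith("/static/"):
        return f"<header for {path}>", True      # cacheable fragment
    return f"account balance for {path}", False  # dynamic business logic

edge = EdgeNode(origin)
edge.handle("/static/header")  # origin hit, then cached at the edge
edge.handle("/static/header")  # served from the edge cache
edge.handle("/api/balance")    # business logic, always goes to origin
print(calls)                   # ['/static/header', '/api/balance']
```

The article's open questions--manageability and security of logic pushed outside the firewall--are exactly the parts this toy model omits (cache invalidation, authentication, and consistency with the origin).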

More and more companies are requiring both new and veteran IT workers to take extensive tests that assess their technical and business skill levels in order to identify shortcomings, formulate improvement strategies, and organize IT teams. Managers of Cable & Wireless' Global Internet Group say such testing offers a way to expose business employees to Web technologies and technically-oriented workers to industry. This trend is being driven by the Internet, which eases the testing process. But experts warn that companies that employ such testing may meet resistance from staff unless they clearly relay its purpose to them. Brainbench CEO Mike Russiello notes that resistance is common at companies where test results have a bearing on decisions to lay off or terminate personnel. However, he says these protests are fleeting once management assuages employee fears--"The complaints bubble up, but then people realize it's silly to raise them, and they get back to work." Testing can also be encouraged by incentives: C&W, for instance, awards bonuses to workers who reach certain testing milestones. In addition, certifications look good on employee resumes and further a worker's status during employment reviews.
http://www.eweek.com/article/0,3668,a=27353,00.asp

David A. Fisher of the Computer Emergency Response Team (CERT) at Carnegie Mellon University notes that there is no real central control of the unbounded systems that compose the nation's online critical infrastructure, which makes defending it against hackers and cyberterrorists difficult. "High impact incidents"--computer virus outbreaks and other kinds of cyberattacks--are becoming more sophisticated and are doubling in number every year. In February, Fisher's team released Easel, a new programming language that simulates unbounded systems even when information about their states is incomplete. "So I can write programs that help control the power grid or help prevent distributed denial of service attacks," Fisher explains. Easel follows a different logic architecture than previous languages, thus easing the process of abstract reasoning. For example, whereas traditional computation might identify dogs by proper nouns (Fido, Rex, etc.), Easel would define them as common nouns; by limiting descriptions to certain defining elements--the height and basic features of a dog, for instance--instead of specifying particular representations, the simulation is guaranteed not to produce a wrong answer, says Fisher. In this way, the path of havoc wreaked by a new software bug or computer virus throughout a system can be mapped out. The purpose of Easel is to help design unbounded systems imbued with "survivability," in which the primary functions of these systems can still be carried out even in the event of damage to individual parts.
http://www.sciam.com/2002/0502issue/0502profile.html
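Easel code itself is not shown in the article; the Python sketch below is only a loose analogy of the "common noun" idea it describes. Membership in a category is decided by defining features rather than by an enumerated list of named instances, so a simulation built this way also covers entities it has never seen before--the property Fisher credits for Easel's robustness under incomplete information.

```python
# Loose Python analogy (NOT Easel) of the "common noun" idea from
# the article: a category is defined by the features any member must
# have, instead of by hard-coding the known instances by name.

def is_dog(entity):
    """Common-noun style: membership decided by defining features."""
    return (0.2 <= entity.get("height_m", 0) <= 1.1
            and entity.get("legs") == 4
            and entity.get("barks", False))

# Proper-noun style would enumerate known instances instead:
KNOWN_DOGS = {"Fido", "Rex"}

fido = {"name": "Fido", "height_m": 0.6, "legs": 4, "barks": True}
stray = {"name": "?", "height_m": 0.4, "legs": 4, "barks": True}

print(is_dog(fido))                 # True
print(is_dog(stray))                # True: never named, still matches
print(stray["name"] in KNOWN_DOGS)  # False: enumeration misses it
```

The enumeration approach silently fails on the unnamed stray, while the feature-based definition handles it; scaled up, that is the difference between simulating only the system states you anticipated and simulating any state that fits the description.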

In the quest to optimize IT employee performance, focusing strictly on financial advantages does not yield the best results, because workers have inner needs and desires that money alone does not satisfy, writes Richard Florida, co-director of Carnegie Mellon University's Software Industry Center. InformationWeek's annual National IT Salary Survey is a case in point: Intrinsic factors such as challenge and responsibility have repeatedly topped the list, while flexibility is also highly valued. Money, by contrast, came in fourth this year in terms of importance. The company's job culture should exploit IT workers' creativity to the fullest by allowing workers at all levels to voice and flesh out their ideas. Interviews conducted by Florida closely mirror the results of the InformationWeek survey--turn-offs for creative workers include work that is boring, trivial, or insignificant; regimented, inflexible schedules and dress codes; and, paradoxically, a chaotic, unstable work environment. Avoiding these drawbacks and instituting perks such as allowing employees to personalize their workspaces, and building common centers, will help nurture creativity. But the job culture must also adapt to the needs of creative professionals. This can be done in a number of ways, such as giving workers idiosyncratic deals that fulfill their unique job requirements, identifying corporate subcultures and customizing management strategies to suit them, modeling management on intrinsic motivation-driven organizations such as the open-source software community, and treating staff as "de facto volunteers," as defined by management maven Peter Drucker. Florida estimates that 38 million Americans--roughly 30 percent of the U.S. workforce--make up a "Creative Class" that is growing in leaps and bounds.
http://www.optimizemag.com/issue/007/culture.htm