Concerns from environmentalists and other factions threaten to hamper the progress of nanotechnology, the science of manipulating materials at the molecular level. For instance, the Science and Environmental Health Network wants the U.S. government to increase funding for research into nanotech's potentially dangerous effects, while the Action Group on Erosion, Technology, and Concentration (ETC) wants all nanotech research halted until its environmental impact is thoroughly evaluated. "We're concerned that nanoparticles can get into the food chain and drinking water, and that nanotubes will slide in and out of our immune system without us noticing," declares ETC executive director Pat Mooney. Many nanotechnologists believe that talking to the public early about safety issues will prevent public relations debacles, but few laymen understand the technology, and concerns about nanotech remain ill-defined. The wide scope of potential nanotech applications makes it hard to pinpoint the most serious risks, but figures such as Eric Drexler and Bill Joy have envisioned doomsday scenarios in which nanobots either self-replicate and overwhelm the planet or wreak havoc from within the human body. Certain nanoparticles have been documented as toxic, while other supposed threats--such as health risks from carbon nanotubes--have been debunked. Nanoparticles are already incorporated into many products, and the National Science Foundation forecasts that nanotech-based drug delivery systems, energy output, and electronics could be a mere three years away. Some advocates think that nanotech could actually help the environment, but critics such as Carolyn Raffensperger urge that risks be assessed first.
http://search.ft.com/search/article.html?id=020925000399

World Wide Web Consortium (W3C) staff member Danny Weitzner is leading a working group that will assess a task force's proposal to allow working groups, under certain circumstances, to adopt technologies encumbered by intellectual property claims licensed on reasonable and non-discriminatory (RAND) terms; the proposal goes against W3C's long tradition of using only royalty-free technologies. "The critical concern that has led us to push so hard for a royalty-free policy is that for all the different Web software implementers it would be terribly hard to negotiate with the patent holders," he notes. Weitzner says that some opponents of the royalty-free policy are worried that other standards bodies will adopt similar policies, but he believes that each organization has to make its own decision based on its needs. He comments that the need for royalty-free standards does not apply just to the open-source community. Weitzner says that once his group evaluates the RAND exception at the end of September, the proposal will be sent to an advisory committee, and director Tim Berners-Lee will make the ultimate decision on whether to recommend the policy. Weitzner heads the W3C's Technology & Society Domain, which is tasked with developing the Semantic Web; he notes that the project focuses on making Web-based information more accessible. "From a public policy standpoint, the Semantic Web technologies are crucial because they help people understand the social and legal contexts they're operating in," he explains.
http://zdnet.com.com/2100-1106-959235.html

Advocates and opponents of peer-to-peer (P2P) file trading squared off last week at a lunch hosted by the Cato Institute, where a central focus was a bill from Rep. Howard Berman (D-Calif.) that would grant copyright owners the authority to interfere with P2P networks. Internet proponents warn that the legislation would curtail consumers' private-use rights and allow copyright holders to launch hack, virus, and denial-of-service attacks on networks while remaining immune from legal reprisals. Troy Dow of the Motion Picture Association of America claimed that the bill would prevent copyright owners from being crippled by anti-hacking laws, while Alec French, minority counsel for the House subcommittee on courts, the Internet, and intellectual property, declared that the exemption from liability would apply only to owners who impair illegal use of their own copyrighted works. However, Public Knowledge public policy director John Mitchell said the proposal was akin to "copyright law expansionism," and Computer & Communications Industry Association CEO Ed Black argued that the bill's supporters do not truly comprehend the scope of collateral damage such a policy entails. French insisted that Berman is open to collaborating on a better definition of the exemption. Berman is a ranking member of the intellectual property subcommittee, which will hold a hearing on Thursday on how P2P networks are allegedly being used to pirate intellectual property.
http://www.wired.com/news/politics/0,1283,55294,00.html

"Gray hat" hackers are being forced to reconsider their actions, which walk the razor's edge between acceptable and unacceptable activities, in light of new legislation, increased law enforcement, and the ever-changing definition of what constitutes ethical hacking. "We are reaching a crossroad where decisions have to be made as to which way people are going to go: Are they going to continue to function as a security consultant or go to the dark side?" declared Howard Schmidt of the White House's Critical Infrastructure Protection Board. Practices such as publicizing corporate security flaws may now carry the threat of lawsuits, prosecution, and even jail time for most independent security experts and consultants. Not even hackers who intrude in order to warn network administrators about security weaknesses are immune--the Hurwitz Group's Peter Lindstrom, for one, lumps gray hats and black hats in the same category. Although this is a minority view, the Digital Millennium Copyright Act (DMCA) and other statutes are helping to increase its strength. The vagueness and broad terminology of the DMCA is discouraging many hackers who specialize in finding and disclosing software vulnerabilities; the arrest of Russian programmer Dmitri Sklyarov and the Justice Department's case against his company for distributing a program that breaks e-book copy protections sent a troubling message to gray hats. Some think that programmers and hackers are shying away from alerting companies of software holes as a result of today's security-conscious atmosphere. Some hackers and others have responded to these trends by developing new guidelines for ethical hacking.
http://news.com.com/2009-1001-958129.html

Last week's DemoMobile event in La Jolla, Calif., was a platform for companies to demonstrate the latest consumer technologies. Among the products on display were a computing system from Shazam that can identify songs from an audio sample; a "next-generation interface" from GeoPhoenix that allows small-screen device users to enlarge displayed content; and voice badges from Vocera Communications that include LCD readouts, speakers, and microphones, and come with voice recognition, Wi-Fi networking, and Internet calling capabilities. Particularly exciting was Canesta's portable laser-projected keyboard, which can be displayed on any flat surface and typed on using "electronic perception technology." Logitech's Personal Digital Pen can store as many as 40 pages in its memory via an optical sensor that records what is written, and can transfer those stored writings to a computer. Meanwhile, "advanced notes recognition technology" from Pen & Internet can turn handwritten notes into text while leaving pictorial content unchanged and smoothing out scribbled shapes. Laptop users can enjoy more mobility while working on the Tcom International Oyster laptop dock, which was also on hand at DemoMobile. Other breakthrough technologies at the influential conference included wireless Windows Powered Smart Displays from ViewSonic and Philips, as well as IBM's ViaVoice Translator software, which enables text written in one language to be spoken aloud in other languages in real time.

Eleven U.S. colleges currently host the Federal Cyber Corps Scholarship for Service, a program that offers participating students free tuition, books, and accommodations, as well as a $1,000 monthly stipend. In return, students major in cybersecurity and then spend two years in the employ of a U.S. agency after graduation. To qualify for Cyber Corps, universities must fulfill 10 guidelines, such as hiring qualified faculty and setting up information assurance centers on campus. Colleges that offer the scholarship include the University of North Carolina at Charlotte (UNCC), Carnegie Mellon, Georgia Tech, and Purdue University. Congress earmarked $11.5 million this year and last year for Cyber Corps, while the White House injected another $19.3 million into the program when the president signed the supplemental appropriations act; around $8 million will be used to extend the program to four more schools. UNCC's allotment increased from $150,000 to over $2 million between 2001 and 2002, and overseer Bill Chu reports that such funds are channeled into scholarships, course development, educators, and computer labs. With more and more critical infrastructure depending on computers, the demand for information security specialists is likely to skyrocket, experts predict. Most qualifying students must also get a national security clearance, and Chu says that possessing one makes graduates attractive to private companies.
http://www.siliconvalley.com/mld/siliconvalley/news/editorial/4135141.htm

Intel and a joint venture between Motorola, STMicroelectronics, and Philips will soon produce chips with 90-nm features, and experts say such developments, although not true nanotechnology, could indirectly have a positive effect on the nanotech industry. The chipmakers' efforts utilize strained silicon, a material whose precise structural and atomic control gives it properties also desirable for nanomaterials. Pioneering work with strained silicon was conducted by AmberWave and IBM, and David Welch of the Brookhaven National Laboratory believes such research could be applied to the development of nanoscale substances. Strained silicon allows electrons to flow more freely, although it can also give rise to impurities that hamper that flow. CMP Cientifica CEO Tim Harper believes that strained silicon could be a cheaper alternative to gallium arsenide and indium phosphide. "Once you start combining faster, lower-noise technologies with denser integration, you can produce faster devices at a lower cost," he explains. Meanwhile, IBM's Paul Welser says the company plans to incorporate strained silicon into all its PowerPC and server microprocessors within roughly two years. IBM also intends to manufacture chips with 65-nm transistors by combining strained silicon with silicon-on-insulator (SOI) technology.
http://www.smalltimes.com/document_display.cfm?document_id=4694

The Department of Commerce (DoC) extended ICANN's Memorandum of Understanding (MOU) for another year, but has pledged to monitor ICANN more closely. Commerce assistant secretary Nancy Victory wrote on Friday that the DoC "views the one-year term of this extension to be a critical period for ICANN to make substantial progress on the remaining transition tasks" outlined in the MOU, and said the DoC is "disappointed" in ICANN's progress so far. She said the DoC may not extend the agreement further if ICANN cannot create a secure environment for the Internet's 13 primary root servers. The DoC says that "this task [is] absolutely crucial, particularly in today's post-Sept. 11 world." The DoC also wants ICANN to complete other tasks outlined in the original MOU, though it acknowledges that some require international cooperation and other factors outside of ICANN's strict control. ICANN CEO Stuart Lynn says that the focus on root server security does not indicate that the servers are unsecured, but rather that "we must make them as secure as they possibly can be." Securing the root servers will require coordination among ICANN, the root server operators, and the DoC, and will be done somewhat outside the public eye, according to Lynn. The DoC plans to talk to root server operators and work with the ICANN Governmental Advisory Committee (GAC) in order to get ccTLDs and regional Internet registries on board with ICANN. The DoC has also expanded its own responsibilities under the new extension agreement, and will become more involved in DNS operations as well as the DNS's transition to private management.

The Internet and other complex networks run on the same basic principles, according to a recent paper from a Notre Dame team. Notre Dame researcher Albert-Laszlo Barabasi says that the structure of the Internet's network of nodes and hubs, in which each potential link is assigned a 1 or a 0 according to whether the nodes actually connect, is similar to that which drives the economy, the environment, and even evolution. In that sense, Barabasi's study of the architecture of the Internet could lead to a better understanding of how the Internet will look in the future, as well as how it grows and changes. He points to the Internet's resilience to attack as evidence of its evolution: specific virus and hack attacks are eventually mitigated as Internet architecture improves. Stephen Jones, senior researcher with the Pew Internet Project, says Barabasi's map of the Internet's architecture resembles a map of a neuronal network in the human brain; however, he doubts anyone's ability to predict how the Internet will look in the future, especially by mapping the wired Internet architecture as Barabasi is doing, because the Internet is moving toward wireless systems where applications are served remotely. Still, Barabasi says, "Metabolic organization...complies with the design principles of robust and error-tolerant networks. This may represent a common blueprint."
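The "common blueprint" Barabasi describes is often modeled with preferential attachment, in which new nodes link preferentially to nodes that are already well connected. The Python sketch below is an illustration of that general idea, not code from the Notre Dame study; it shows how a few heavily linked hubs emerge from this one simple rule:

```python
import random

def barabasi_albert(n, m, seed=None):
    """Grow a scale-free network: each new node attaches m edges,
    preferring nodes that already have many links ("rich get richer")."""
    rng = random.Random(seed)
    # Start from a small fully connected core of m nodes.
    edges = [(i, j) for i in range(m) for j in range(i + 1, m)]
    # Listing each node once per edge endpoint makes uniform sampling
    # from this list equivalent to degree-proportional sampling.
    endpoints = [v for e in edges for v in e]
    for new in range(m, n):
        targets = set()
        while len(targets) < m:          # m distinct attachment targets
            targets.add(rng.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints.extend([new, t])
    return edges

edges = barabasi_albert(n=200, m=2, seed=1)
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1
median = sorted(degree.values())[len(degree) // 2]
print(f"max degree {max(degree.values())} vs. median degree {median}")
```

Running the model shows the hallmark of a scale-free network: most nodes stay near the minimum of two links, while a handful of hubs accumulate many times that, which is also why such networks tolerate random failures yet depend heavily on their hubs.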
http://www.newsfactor.com/perl/story/19483.html

Officials in Seoul have contracted with a New York University academic to design a new digital city from the ground up on the outskirts of the South Korean capital. Anthony Townsend, of NYU's Taub Urban Research Center, has already received a grant from the National Science Foundation to study the effects of IT and digital communications as vital components of a city. Townsend says his team plans to integrate work, academic, and living spaces in a way that best allows people to use and produce digital media. "Half of designing a city is going to be information spaces that accompany it because lots of people will use this to navigate around," he proclaims. South Korea, which has the highest broadband penetration rate in the world and working 3G networks, wanted to build a technology area similar to Silicon Valley, but realized too many other Asian countries already had such projects. Instead, officials want to differentiate their city as the first built from scratch with digital technologies in mind. Townsend says wireless technologies will play a key role and will have a dramatic impact on city life, similar to the effect of automobiles. He explains that telecommunications, especially wireless, have not made place a less important aspect of life, but have actually made some locations more important, such as wireless access hubs. Townsend foresees consolidation in the telecommunications industry and a greater role for wireless technology in the relationship between citizens and government. He observes that wireless, positioning technologies, and location-based systems are converging.
http://zdnet.com.com/2100-1105-958597.html

Nearly 200 international teams squared off at the RoboCup 2002 event in June, pitting their work in robotics and artificial intelligence against one another on the soccer field. RoboCup has come a long way since its inception: Whereas the first competition involved only a few dozen groups convened in a hotel, this year's event took place in Japan's Fukuoka Dome stadium before an audience of 127,000. Two-legged humanoid robots also participated for the first time this year, highlighting the debate over the use of such machines. Although wheeled robots have performed more efficiently on the playing field, Japanese developers claim that humanoid robots can be useful as well. Honda's Masato Hirose argues that human-looking machines support more natural interaction and can go wherever humans can, including into confined spaces and up and down stairs. Sony's Hiroaki Kitano started RoboCup in 1997, hoping that such a challenge would inspire significant strides in robotics and artificial intelligence. American and European teams have distinguished themselves at RoboCup through artificial intelligence-based teamwork programs, which could eventually be incorporated into humanoid robots. Still, Carnegie Mellon University roboticist Christopher G. Atkeson notes, "Work on humanoid robots in the U.S. has been slow to take off," largely due to an early U.S. focus on factory automation applications for robots.

The enormous, almost limitless potential of nanoscience is prompting universities in North Carolina to build new facilities dedicated to its study in order to gain a competitive edge. University of North Carolina-Chapel Hill (UNC-CH) Chancellor James Moeser recently announced plans to open several nanoscience labs, including the $10 million-a-year Institute for Advanced Materials, Nanoscience, and Technology in Chapel Hill as well as a $3 million Triangle National Lithography Center at N.C. State University in Raleigh. The latter facility will house lasers that researchers and businesses can use to produce smaller chips and devices, and will allow chemist and nanoscale researcher Joseph DeSimone and colleagues to conduct further experiments in shrinking electronics. Duke University and NCSU also intend to construct nanoscience labs. Notable nanoscience innovations produced at Triangle campuses include a patented wax membrane that can deliver drugs through the bloodstream, from Duke's David Needham; and a one-molecule-thick waterproof coating and a gold-based filtering material developed by Jan Genzer of NCSU. Meanwhile, UNC-CH physicist Rich Superfine is developing tools such as a nanomanipulator, which allows users to "feel" nanoscale objects via virtual-reality technology. Although nanotechnology remains a largely theoretical science, fraught with obstacles such as the breakdown of familiar physical laws at scales where quantum mechanics takes over, nanoscience efforts are bringing tens of millions of dollars in funding to Triangle universities.
http://www.smalltimes.com/document_display.cfm?document_id=4642

Content providers such as movie studios and record companies are battling with IT industries over proposals designed to force digital transmission technology to incorporate digital rights management (DRM) that prevents copyright infringement. The entertainment industry worries that the FCC's requirement that digital TV transmissions be unencrypted will threaten sales if users find a way to copy and redistribute such transmissions; its solution involves embedding digital TV signals with watermarks that control whether home entertainment systems can copy programs, but Princeton's Edward Felten believes such controls will be relatively easy to strip out. Content owners' restructuring ambitions go beyond home entertainment systems--they extend to computers and the Internet, which have enormous potential as copying machines and distribution systems, respectively. Sen. Ernest "Fritz" Hollings (D-S.C.) introduced a bill that calls for tech companies to add a federally approved anti-copying standard to new digital products or face criminal charges, and to adopt the standard within 18 months or risk federal intervention. Capitol Hill sources report that the purpose of the bill is to accelerate consumer adoption of broadband services. Although technology companies also advocate copyright protection, many oppose the Hollings bill. At the heart of the fight between content providers and tech companies is their attitude toward their customers: Content providers look upon them as consumers who should not be allowed to take products or content for free; tech companies consider them users who want more features and more power at lower prices, and restricting products' capabilities through DRM curtails that user empowerment. Meanwhile, users have been notably absent from the content-versus-technology debate.
http://www.smh.com.au/articles/2002/09/20/1032054954195.html

Microsoft's research chief, Rick Rashid, oversees 700 employees working in five laboratories around the world, including a Fields Medalist, a Wolf Prize winner, and a Godel Prize winner. Rashid says the labs are run "much more like a university computer science department" than a corporate lab, but because their work is so broad, just about everything they research has some application in one of the company's products. That research has fed into products such as the Windows Media Player and algorithms used in Windows. "We do work closely with the product teams, but you can't have the kind of impact from a basic research group that you want to have unless you are able to be at the forefront in the fields that you work in," he explains. Rashid says that Microsoft is working on a number of advances in how people interact with their computers, and is developing natural input methods involving speech and super-large displays. Such innovations, he says, would give people much faster access to far larger amounts of information. Other related research concerns how people interact with one another using computers, and how computer design and function can facilitate human interaction. "Basic research is best when it's not directed from above, or directed from specific events," Rashid insists. Still, company-wide initiatives such as the Trustworthy Computing program have influenced research.

Anti-globalization protesters in Oakland, Los Angeles, and Portland are building computers out of recycled parts from old ones, and planning to ship them to poor communities in Ecuador, the Amazon, and elsewhere. The activists hope to put together about 300 Linux workstations in time for the Free Trade Area of the Americas meeting in Ecuador in October. After the planned protest, in which the computers will serve as communications tools, the machines will be deployed in the surrounding community along with a wireless network the activists plan to build. By equipping people with email and Internet access, the organizers hope to foster local collaboration and social improvement. This type of activism has been made possible by a confluence of technological opportunities--free open-source software, leftovers from the PC glut, and wireless networking that can compensate for what poor communities lack in wired infrastructure. In Portland, the Free Geek group is harnessing the labor of volunteers, who get to take home a computer after they assemble five working ones, and custom scripts that make short work of installing the Mandrake Linux operating system. Computer components donated to the Alameda County Computer Resource Center are being assembled into machines by members of the Independent Media Center, who also plan to use them in the Ecuador demonstration. Free Geek volunteer coordinator Reverend Phil Sano says that volunteers are drawn to the organization so that they can "remove the mystery" of computer operation. "And that's the thing that prevents many people from interacting with computers, that mystery. It's what keeps them on that side of the digital divide."
http://www.salon.com/tech/feature/2002/09/23/antiglobal_geeks/index.html

The Information Technology Association of America (ITAA) and Dice will issue a report today that maintains a gloomy short-term IT hiring forecast, despite the addition of 85,000 jobs since the year began. Dice CEO Scot Melland says the survey illustrates that the economic situation has diluted the more favorable hiring outlook made in early 2002. However, others complain that foreign IT workers brought into the United States on H-1B visas, as well as offshore programming firms, are taking jobs away from qualified Americans. Aztech Professional Services President Norman A. Lane claims that corporations are taking advantage of offshore outsourcing so that they do not have to pay unemployment taxes when the demand for workers falls, and he suggests that such companies should be charged a levy for each outsourced job. "So much work is going offshore, we're putting ourselves at a substantial [intellectual capital and security] risk," cautions Linda McInnis of BostonSPIN. Meanwhile, ITAA President Harris Miller admits that outsourcing has affected the U.S. IT job market, although he gives the economy the lion's share of the blame; many ITAA members are technology suppliers that have used foreign IT workers extensively. In 2003, the General Accounting Office will release the results of a study analyzing the effects of the H-1B visa program on the American workforce.

The proposed Speech Application Language Tags (SALT) standard promises to improve self-service, call center, and similar applications. SALT is being developed by the SALT Forum, which aims to build a royalty-free, open standard interface between speech and Web content, accessible through PDAs, phones, desktop and tablet PCs, and other devices. These speech interfaces would share space with text, audio, video, and graphics. Dan Hawkins of Datamonitor says that SALT is essential for mainstream penetration of concurrent multimodal applications, but it will be several years before this goal is reached. The Forum's first version of the SALT standard was issued to the public and submitted to the World Wide Web Consortium (W3C) in July; W3C fellow Dave Raggett says W3C working groups will start discussing a development strategy and possible interrelation with the VoiceXML spec this fall. Intel's Peter Gavalakis expects customer-service-intensive sectors that already use speech recognition and interactive voice response systems to be quick to embrace SALT. Microsoft supports SALT, which Hawkins says is especially important for its adoption. Meanwhile, advocates face numerous challenges, such as convincing service providers and firms to invest in speech-enabled Web applications despite the economy, developing industry partnerships with VoiceXML supporters who are not SALT members, and making Web developers more aware of the technology.
http://www.nwfusion.com/buzz/2002/salt.html

Open source is becoming a recognized source of secure and stable software, but users need to be aware of potential dangers--such as backdoor programs--and how to avoid them, writes independent security consultant Rik Farrow. When downloading open-source software (OSS), users should first check the digital signature posted on the download site to verify that the application has not been modified since the author signed it. Although open-source programmers do not get paid and bear little legal liability for their work, they have more incentive to produce secure software than do paid programmers working for large software vendors: allowing a backdoor to be surreptitiously included in their code would be tremendously harmful to their reputation in the open-source world. The situation differed in the early 1980s, when open-source systems such as the Berkeley Software Distribution (BSD) of Unix were being developed. In 1984, Unix designer Ken Thompson surprised the IT community during an award ceremony at an Association for Computing Machinery meeting by revealing that he had included a backdoor in the login program that would give him system access as any user, and that the backdoor had self-propagating characteristics, so it would transfer itself to updated Unix versions. Including backdoor code in software programs is a barrier to commercial acceptance, but nowadays most people would rather trust vendors' assurances that their software is secure than sift through the source code themselves. GNU Privacy Guard signatures, which come with most OSS packages, are considered an excellent defense against altered source code.
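Farrow's first recommendation, confirming that a download matches what the author published, can be illustrated with a short Python sketch. Full GPG signature verification is done with the `gpg --verify` command-line tool; the simpler cousin shown here (a hypothetical helper, not code from the article) compares a file against a published SHA-256 digest, which catches accidental corruption and naive tampering:

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Hash a downloaded file in chunks so large tarballs need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_digest(path, published_hex):
    """Compare against the digest listed on the project's download page.
    A mismatch means the file was corrupted or altered in transit."""
    return sha256_of(path) == published_hex.strip().lower()
```

Unlike a GPG signature, a bare checksum proves nothing if an attacker can also replace the digest published alongside the download, which is why signed packages remain the stronger defense.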
http://www.networkmagazine.com/article/NMG20020826S0005

The MPEG-4 video standard promises improved compression and more interactive images; potential benefits include lower bandwidth costs and video transmission on just about any device, but the technology may not necessarily spark the same spectacular business as its MPEG-2 predecessor did. Competing products and discord over licensing are barriers to the technology's spread, while iSuppli's Jay Srivatsa contends that software video compression will capture most of the PC market, making streaming video delivery on PDAs and cell phones MPEG-4's chief market. Furthermore, MPEG-4 lacks a strong driver, while support for the technology can be easily incorporated into existing chips. Certain companies are opting for programmable processors in order to cope with fluctuating standards, while others are choosing standalone chipsets; Sigma Designs' Ken Lowe believes the market will be split between these two approaches, with the first used to deliver video on cell phones and the second serving a high-end market for specialized equipment such as DVD players and digital televisions. Still another route some companies are taking involves designing hybrid chips with MPEG-4 processing capability embedded within them. Marketing challenges include persuading consumers to exchange set-top boxes that are not enabled for MPEG-4 for units that are, and such economic obstacles have spurred chipmakers to seek other clients, such as overseas phone companies. Srivatsa notes that "there's still really no traction yet" for the standard, which debuted three years ago.