Venture capitalists are once again investing in open-source software technology companies despite the sometimes colossal failures of the late 1990s. This resurgence of interest is partly attributable to two developments: Increased adoption of open-source software by corporate users, and the establishment of a viable business model with the success of Linux support services vendor Red Hat, which generated $125 million in revenue last year and currently boasts a market capitalization of approximately $2 billion. However, some venture capitalists doubt other companies can successfully replicate the Red Hat business model. JBoss founder and lead developer Marc Fleury says some of the companies receiving funding make him anxious, giving rise to fears that they could sully the reputation of the open-source tech business community in general. One company whose business model deviates from the Red Hat paradigm is SugarCRM, which offers both a free product and a more secure and durable professional variant priced at $239 per year for each user. Another unique startup is SpikeSource, which acts as a third party to facilitate interoperability between proprietary and open-source software running in corporate data centers. VentureOne estimates that 20 open-source tech companies collectively raised $149 million in venture capital in 2004, and at least three startups raised $20 million in March alone.

Grid computing for businesses is getting a big boost from new guidelines from the Enterprise Grid Alliance (EGA) and the Globus Toolkit 4 from the Globus Alliance; the guidelines help with technical details such as security and utility pricing schemes, and are part of the year-old group's mission to accelerate grid usage in the private sector, define effective applications, and promote industry standards. The grid computing community is making a concerted effort to commercialize the technology, says Univa co-founder Steve Tuecke. Standardization of the technology will allow grid computing on a much greater scale, according to experts who equate grid technology today to where the Internet was 10 years ago. Illuminata analyst Jonathan Eunice says most grid computing implementations are currently internal projects that use proprietary tools. Though it makes business sense to adopt grid technology now, companies must be prepared to build the systems themselves and not simply buy a ready-to-go solution. The open source Globus Toolkit 4 makes it much easier to build these applications using distributed storage, database, and computing resources: The toolkit leverages a number of specifications, especially Web services. Hooking systems up to a grid helps optimize usage rates, and financial services firm Wachovia is expanding its grid computing efforts to include mainline transaction processing after seeing the benefits of a narrower-scope project; Wachovia's Robert Ortega says old-school software licensing schemes, lack of packaged grid software, and immature business models are currently problems for industry grid projects. MCNC managing director Wolfgang Gentzsch, who led Sun Microsystems' grid efforts until last year, says the most difficult part of commercial grid projects is addressing cultural barriers.

On the heels of heavy criticism from computer professionals and civil libertarians, the U.S. State Department is considering adding previously rejected privacy protections to federally mandated radio frequency identification (RFID) passports, according to deputy assistant secretary for passport services Frank Moss. The new e-passports, which feature a contactless chip carrying unencrypted data about their bearers, were criticized for being easy for properly equipped hackers to read from as far away as 30 feet. Moss says the solution his office is reconsidering involves the provision of a key or password by an RFID reader in order to read the data on the chip, which is encrypted when sent from the chip to the reader. The process thwarts remote readers in close vicinity from accessing passport data--an attack known as skimming--since the passport folder must be physically opened and scanned through a reader to obtain the decryption key; encryption also discourages eavesdropping. The specifications for this process, known as Basic Access Control (BAC), were designed by the International Civil Aviation Organization, but Moss says the government originally dismissed the concept because the U.S. never intended to embed more data in the chip than what could easily be read visually, and was convinced that the read range was sufficiently small to deter hackers. Moss says member governments of the European Union have accepted BAC, and several vendors have already devised and tested BAC-compatible readers, though passports with BAC take longer to read than those without BAC. Barry Steinhardt with the ACLU calls the BAC solution better than the original solution, although he thinks an RFID chip is completely unnecessary in light of security concerns.
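The key-derivation step that makes BAC work is specified in ICAO Doc 9303: the reader optically scans the passport's machine-readable zone (MRZ), hashes the document number, birth date, and expiry date (each with its check digit) into a seed, and derives 3DES encryption and MAC keys from that seed. The Python sketch below illustrates the idea; the `derive_bac_keys` helper and the sample MRZ string are illustrative only, not production code:

```python
import hashlib

def _odd_parity(b: int) -> int:
    # DES convention: every key byte carries odd parity in its low bit.
    data = b & 0xFE
    return data | ((bin(data).count("1") + 1) % 2)

def derive_bac_keys(mrz_info: str):
    """Derive (K_enc, K_mac) for Basic Access Control from the MRZ
    information string (document number, birth date, and expiry date,
    each followed by its check digit), per ICAO Doc 9303."""
    # Seed = most significant 16 bytes of SHA-1 over the MRZ data.
    k_seed = hashlib.sha1(mrz_info.encode("ascii")).digest()[:16]

    def derive(counter: int) -> bytes:
        # SHA-1(seed || 32-bit counter), truncated and parity-adjusted.
        h = hashlib.sha1(k_seed + counter.to_bytes(4, "big")).digest()[:16]
        return bytes(_odd_parity(b) for b in h)

    return derive(1), derive(2)  # counter 1 -> encryption key, 2 -> MAC key
```

Because the seed comes from data printed inside the passport, a reader must physically open and scan the document before it can decrypt the chip's RF traffic, which is exactly the anti-skimming property described above.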

The IST-funded RealReflect project is expected to deliver the first extensive industrial modeling application for Bidirectional Texture Function (BTF) image acquisition. "RealReflect is a major advancement over traditional virtual reality modeling, which basically relies on simplifications of reality by describing optical properties of a surface by a 2D matrix of data that does not show the real effects of lighting," notes project coordinator Attila Neumann of the Technical University of Vienna. The RealReflect system can acquire and render even the most precise textures virtually by taking illumination and viewing orientation into account; textures are captured from physical samples and rendered onto 3D models. "It is a much more powerful and demanding system than traditional virtual reality modeling, making it look real instead of simply believable," Neumann remarks. RealReflect's texturing data requirements multiply those of other VR modeling tools a thousandfold, so the project partners devised compression methods for the BTF data, as well as techniques to seamlessly replicate a small acquired sample of material on a 3D model. The immersive reality facilitated by the tool is particularly palpable when visualized in a cubic CAVE VR simulator. The RealReflect system has been specifically developed for the automotive industry, where it could substantially reduce vehicle development cycles and time-to-market, and also augment safety by allowing designers to see how vehicle surface reflectivity changes under different lighting conditions. The tool also has potential architecture and graphics applications.

Silicon Valley is currently dominated by a mind-set that firmly believes technology should be protected as a wealth-generating asset, but a second culture advocating the unrestricted sharing of technology has even deeper roots that can be traced back to the counterculture movement of the 1960s and 1970s, writes John Markoff, author of the new book, "What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer Industry." The original PCs were products of the free-thinking, anti-establishment counterculture attitude. Many of Silicon Valley's computing pioneers were steeped in the counterculture movement, including Fred Moore, co-founder of the Homebrew Computer Club, where enthusiasts tried out and shared early PC technologies; Stanford Artificial Intelligence Laboratory (SAIL) founder and Lisp programming language creator John McCarthy, a left-leaning participant in the Mid-Peninsula Free University; and Stanford Research Institute (SRI) computer scientist Douglas Engelbart, who helped invent the mouse, hypertext, and ARPAnet, and who advocated "est" training as a performance-enhancing measure. Another key figure in the PC movement was writer Stewart Brand, who coined the term "personal computer" and played a key role in demonstrating to the world at large the tech breakthroughs researchers such as Engelbart were producing. In a 1972 Rolling Stone article, Brand characterized the free-thinking attitude at the heart of the emergent computer world with statements such as, "Half or more of computer science is heads [psychedelics]." He also accurately described the conflict between free tech advocates and tech entrepreneurs with the conclusion that "Information wants to be free, and information wants to be expensive." This tension has shaped debates over technologies such as open source software, peer-to-peer file-trading networks, and the Internet.
These opposing cultures are symbiotic, and preserving them both will help preserve the mechanisms of innovation that comprise the core of technological success.

DAFCA object architect and software design expert James Coplien said in an interview at the ACCU conference that companies' rush to bring software to market is fueling a decline in product quality, noting that the open source community subscribes to higher quality software standards. "The one glimmer of hope is the people who've said, 'Screw the industry, we're going to write excellent software and give it away,' in other words, the open source movement," he declared. Coplien argued open source software is more secure than closed source or proprietary software because it is born of a collaborative effort between contributors and a central community of developers. "The complementary, independent, selfless acts of thousands of individuals [in the open source community] can address system problems--there are thousands of people making the system stronger," he said. Coplien also said open source software is better tested than proprietary software because it is scrutinized by more people, who are encouraged to look for glitches. Several industry experts at the ACCU conference disputed Coplien's arguments: Texas A&M University professor and C++ inventor Bjarne Stroustrup said not all open source software is of high quality, adding that some of the best code in existence is closed source. Meanwhile, Cambridge University security engineering professor Ross Anderson argued that the easy detection and patching of vulnerabilities in open source software, which relies on the wide availability of code, can be exploited by hackers.

The AeA's annual Cyberstates 2005 report reflects a generally positive development in U.S. high-tech industry employment, with job losses slowing down considerably in 2004: The high-tech industry lost 25,000 jobs last year, compared to 333,000 jobs lost in 2003 and 612,000 lost in 2002. The study estimates that the software services industry added 30,300 jobs while the engineering and tech services industry added more than 30,000--a first for both sectors since 2000, notes AeA President William Archey. There was a 2% decline in high-tech manufacturing industry employment, with 31,900 jobs lost between 2003 and 2004, while the communications services sector experienced a loss of 54,000 jobs. Forty-six U.S. states lost high-tech jobs in 2003, according to Cyberstates 2005; California, Texas, New York, Florida, and Virginia were the five states with the highest concentration of high-tech professionals, with Virginia displacing Massachusetts, which held fifth place a year earlier. California and Texas also suffered the most high-tech job erosion with 67,800 and 32,900 lost jobs, respectively. The study indicates a rise in tech industry venture capital investment for the first time in four years, for a total of $11.8 billion--10% more than 2003 VC investments. U.S. high-tech exports, which accounted for 23% of all U.S. exports last year, rose 12% from $171 billion in 2003 to $191 billion in 2004. "The good news is that the technology industry looks to have turned a corner," says Archey, who adds that AeA's report, "Losing the Competitive Advantage?," investigates drivers of tech innovation such as research and development and a skilled work force, which must be of particular interest as international competition to the U.S. technological lead heats up.

Data encryption technology is now a mature market with infrequent updates, but the failure of public key infrastructure (PKI) to take off in the commercial sector has left a gaping hole in the encryption framework. Encryption comes in two flavors: Traditional symmetric encryption and asymmetric encryption that uses public and private keys. Asymmetric encryption popularized by RSA Security protects traditional symmetric encryption by adding another encrypted piece of data, which dramatically increases the difficulty of code-breaking; elliptic curve cryptography is a niche application of asymmetric encryption that uses fewer resources and is more suitable for PDAs and smart phones, for instance. Digital signatures protected by hashing functions, which ensure the message package is unmolested while in transit, allow parties to authenticate one another. Recently, the SHA-1 hashing algorithm was shown to be vulnerable to certain methods of attack and could prompt the industry to move to another, more secure, standard. PKI was created in order to protect against the fraudulent creation of encryption keys and involved the top-down issuance of certificates through organizations such as VeriSign, but PKI was pushed too hard, too fast, says Capgemini global chief technical officer Andy Mulholland. When PKI was promoted heavily five years ago, the bulk of online transactions was done by consumers, not businesses. If PKI were launched today, its commercial success would be far greater, says Mulholland. Encryption also faces the problem of complexity where ordinary users find even PGP encryption difficult to use, while another challenge is government involvement, especially governments' ability to obtain and decrypt keys.
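The way asymmetric encryption "protects" symmetric encryption is usually a hybrid scheme: a random session key encrypts the bulk data symmetrically, and only that small key is wrapped with the recipient's public key. The Python sketch below illustrates the idea with textbook-sized RSA numbers and a hash-based keystream purely for demonstration; real systems use vetted libraries (RSA or elliptic curves plus a cipher such as AES) at full key sizes, never hand-rolled primitives like these:

```python
import hashlib, secrets

# Textbook-sized "RSA" parameters (p=61, q=53): n = 3233, e = 17, d = 2753.
N, E, D = 3233, 17, 2753

def keystream(session_key: int, length: int) -> bytes:
    # Stand-in for a real stream cipher: hash key || counter into a pad.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(f"{session_key}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def hybrid_encrypt(msg: bytes):
    session_key = secrets.randbelow(N - 2) + 2       # random symmetric key
    wrapped = pow(session_key, E, N)                 # asymmetric: wrap the key
    pad = keystream(session_key, len(msg))
    cipher = bytes(m ^ p for m, p in zip(msg, pad))  # symmetric: bulk data
    return wrapped, cipher

def hybrid_decrypt(wrapped: int, cipher: bytes) -> bytes:
    session_key = pow(wrapped, D, N)                 # unwrap with private key
    pad = keystream(session_key, len(cipher))
    return bytes(c ^ p for c, p in zip(cipher, pad))
```

Only the short session key pays the cost of the slow asymmetric operation, which is why the hybrid split dramatically raises the code-breaking bar without slowing bulk traffic.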

TCP/IP co-author Vint Cerf compares the introduction of TCP/IP to the then-embryonic Internet in 1983 to the ongoing switch from IPv4 to IPv6, though the latter move is vastly more complicated because of the hundreds of millions of computers now connected to the Internet; back in 1983, TCP/IP ran in parallel with the old NCP protocol for about a month until problems were resolved. As ICANN chairman, Cerf sees the move to IPv6 as a critical priority, especially as the Internet grows to include all six billion people. Router vendors built equipment that supports IPv6, but much of the existing software infrastructure has to be updated so that applications are not confused when both IPv4 and IPv6 addresses are returned for a domain name request, for example; this requires augmentation for the domain name system. Rapid adoption of Internet technology for household appliances, mobile phones, or automobiles could hasten the need for IPv6, says Cerf. Parts of the Internet can now deliver voice conversations with higher quality than the traditional telephone network, but because of the Internet's distributed structure, that capability is not uniform. Cerf says downloaded entertainment content will become more important in the near term than real-time transmissions, and notes that as much as 50% of Internet traffic is currently dedicated to BitTorrent file-sharing. Illegal activity on the Internet is often more innovative than legal activity, and there needs to be greater governmental cooperation in order to combat Internet crime; at the same time, Cerf is extremely wary of introducing mechanisms that could allow governments to censor political speech online, and says filtering portions of the Internet where they have jurisdiction is more palatable. In general, he says existing laws should be applied to the Internet and existing organizations given appropriate scope, such as the World Intellectual Property Organization and the World Trade Organization.
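The application-level confusion Cerf describes--code that must cope when a name resolves to both IPv4 and IPv6 addresses--shows up even in a basic lookup. A small Python sketch of a dual-stack query using the standard `socket.getaddrinfo` call (the `lookup` helper is illustrative):

```python
import socket

def lookup(host: str):
    """Query both IPv4 (A) and IPv6 (AAAA) records for a hostname and
    return whatever address families the local resolver knows about."""
    results = {"IPv4": set(), "IPv6": set()}
    for family, label in ((socket.AF_INET, "IPv4"), (socket.AF_INET6, "IPv6")):
        try:
            # getaddrinfo yields (family, type, proto, canonname, sockaddr);
            # sockaddr[0] is the address string for both families.
            for *_, sockaddr in socket.getaddrinfo(host, None, family):
                results[label].add(sockaddr[0])
        except socket.gaierror:
            pass  # no records of this family, or no local support for it
    return results
```

An application that blindly takes the first result and assumes IPv4 can break once AAAA records appear, which is why the surrounding software infrastructure, not just the routers, needs updating.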

Local governments in Europe and the U.S. are building municipal wireless networks, which advocates support for their quick and easy deployment for use by city employees and safety personnel, lowering telecom costs overall. Philadelphia government officials recently announced the Wireless Philadelphia initiative to make broadband available to everyone in the city with a 3,000-node wireless network, which will be set up at a cost of $10 million and maintained for the first two years for $5 million. However, legislatures in some U.S. states are pushing for a ban on such networks, as cable companies and wireless service providers apply pressure in attempts to halt the establishment of unrestricted wireless and WiFi networks that could lure away paying customers. One opponent, Tony Katsulos of InnerWireless, says the "wireless clouds that are being discussed for Philadelphia, Minneapolis, and other cities will not only be very difficult to implement, but they won't provide in-building coverage, for the same reason that cell phone signals can't penetrate buildings." He also warns that municipal wireless networks could disrupt corporate networks, although a representative for Wireless Valley counters that such challenges will be addressed through rigorous network design and planning. "The increased interest in municipal wireless broadband networks is driving innovation among software developers--back office, network security and hotzone management--and vendors of mesh networking equipment and antennae," writes wireless analyst Esme Vos in the MuniWireless.com March 2005 Report. CurrentOfferings.com analyst Dave Mock, author of "Tapping Into Wireless," says he has experienced no problems when using the free municipal wireless network in Fullerton, Calif.

With a $12,000 grant from Texas Instruments and the Dallas Women's Foundation, the University of Texas at Arlington has developed the Metroplex Area Gender Equity Institute with the goal of boosting the number of girls pursuing math and science careers. Attaining this goal requires reforming the teaching habits of middle school educators, which set up a gender inequity in the classroom that can discourage female students' interest in math and science. Teachers convene at the institute to work out strategies for giving both genders equal representation in the classroom and to share technology and other resources to help maintain girls' interest. Cultural expectations often dampen girls' ambitions for math and science careers: Whereas boys are taught to be independent and aggressive at an early age, girls are more often sheltered by parents and educators. Director of Texas Woman's University's Science and Mathematics Center Cathy Banks says attracting girls to science and math in middle school and retaining them through high school is critical, and adds that cultural gender bias toward males can have a negative effect on the economy and the business world, as it cuts out an entire segment of the work force. Texas Instruments VP Tegwin Pulley says the socially meaningful and family-friendly aspects of technical careers must be played up if more women are to be drawn to them. There are signs that the gender gap is shrinking in Texas public schools and colleges, although girls are still underrepresented in more advanced subjects such as physics and computer science. Experts again blame cultural expectations for this shortfall.

Indiana University is a hub for higher education cybersecurity efforts: In addition to hosting the Indiana Higher Education Cybersecurity Summit this week, the school is home to the Center for Applied Cybersecurity Research (CACR), an expanding information assurance program committed to improving the integrity and security of information systems, technologies, and content via a variety of disciplines, including computer science, informatics, organizational behavior, criminal justice, law, and public policy. CACR is driving the development of an interdisciplinary cybersecurity curriculum. "The whole nation is talking about cybersecurity, especially in higher education," says CACR director and Indiana University School of Law-Bloomington law professor Fred Cate. Computer hacking and identity theft incidents are becoming more sophisticated, severe, and frequent across the government, nonprofit, business, and higher education sectors. No educational institution is completely cyberattack-proof given the complexity and highly distributed management of schools' IT infrastructures. But Cate thinks the impact of such attacks can be minimized through a "highly coordinated" initiative involving the top leadership echelons. "Engagement in the discussion is a critical step in developing strategies that will deter attacks, reduce vulnerabilities, and help to ensure that disruptions are infrequent, of minimal duration, and cause the least damage possible," he says. Cate says CACR not only has the improvement of cybersecurity in mind, but also the improvement of cybersecurity efficiency, cost, and its effects on individuals, the economy, and the public.

The Disability Discrimination Act requires U.K. Web sites to be accessible to visually impaired users, and a recent Disability Rights Commission (DRC) survey found that 97% of major online organizations were cognizant of the Web accessibility problem; but although more than two-thirds of respondents reported that they were taking positive steps to resolve the issue, another DRC report estimated that 81% of Web sites do not comply with basic accessibility needs. Theoretically, organizations with inaccessible Web sites could be open to prosecution. Julie Howell with the Royal National Institute of the Blind (RNIB) says awareness of the Disability Discrimination Act has increased significantly among Web designers, and her organization is a major promoter of the issue as well as a consultancy service for companies that wish to make their sites more accessible to the visually impaired. Guardian Unlimited chief technical strategist Stephen Dunn says his staff is trained to consider Web accessibility, and he has consulted with RNIB and witnessed demonstrations of screen-reading software. He feels guidance is needed with regard to best practice, noting that "we need people to move from saying there is a problem to saying what standards they could comply with." Best-practice guidelines for Web site accessibility are being composed by the DRC with assistance from the British Standards Institute. Meanwhile, the pan-European Enabled Initiative seeks to improve Web access through screen readers and other technologies as well as give visually impaired users a better understanding of visual content with equipment such as on-screen touch displays.

NASA computer scientist William Clancey has long advocated the view that cross-disciplinary research with emphasis on the humanities is critical to the development of artificial intelligence, and is putting his theories into practice as he tests the use of robots in expedition scenarios astronauts stationed on Mars might face. The researcher, chief scientist for human-centered computing at NASA's Ames Research Center, believes live study of the needs of geologists carrying out exploratory work at the Mars Desert Research Station in Utah is the only way to accurately evaluate astronauts' requirements. NASA envisions robots assisting the Martian explorers, for instance by performing tasks that may be beyond the abilities of astronauts on a mission. One Utah expedition Clancey followed demonstrated the need for robots to sport an antenna so that the explorers can maintain contact with home base. The robot currently under development at NASA consists of a four-wheeled machine equipped with a pair of laptops: one receives simple commands, while the other communicates over a wireless network linked to laptops carried in the astronauts' backpacks. Clancey says the robots must be properly contextualized to be effective. "I still believe that we will build machines that have human-like intelligence, but it's not going to be soon," he says.

Robots modeled after animals are migrating from the laboratory to the wild to participate in previously unworkable animal behavior research and experimentation. University of California, Davis, behavioral biologist Gail Patricelli studied the courtship rituals of satin bowerbirds using a stuffed, joystick-controlled robot bird, while a more advanced robot controlled by radio is being developed to monitor the mating signals of the sage grouse. Birds can easily mistake a robot for one of their own since their movement and the robot's jerky motions are so similar. UC Davis graduate student Aaron Rundus is testing the mechanisms squirrels use to ward off rattlesnakes using a stuffed squirrel simulacrum that swings its motorized tail aggressively to generate heat that snakes are sensitive to. Rundus says the robot, which he built with mechanical engineer Sanjay Joshi, permits a controlled experiment that could not be carried out in the wild. Joshi has also teamed up with psychologist Jeffrey Schank to build autonomous robotic rats that follow simple behavioral rules. "We are trying to quantify the lowest level of cognitive ability necessary for certain types of behavior," which can then be related to theories about animal behavior in the wild, explains Schank. Meanwhile, Frank Grasso, director of the Biomimetic and Cognitive Robotics Lab at the City University of New York in Brooklyn, is studying how lobsters sniff out prey with a mechanical lobster sensitive to a scented plume in water. He says, "The robots allowed us to improve on ideas that had been around for a long time."

HP Laboratories receives just 5% of Hewlett-Packard's research budget, but is working on important breakthrough technologies that aim to revolutionize IT. Unlike in the past, many of the projects pursued today are focused on high-level system optimization and not improving discrete component technology, such as CPU architectures, says HP Labs strategic planning director Robert Waites; the goal is to dramatically reduce the cost of IT infrastructure maintenance through automation, and grid computing, self-managing systems, virtualization, and smart data center technology are being developed in order to reach this goal. IT systems maintenance involves many tasks that need to be done with precision, such as provisioning disk arrays or configuring networks, says HP Labs storage systems research manager Beth Keer: "And because these tasks are repetitive and complex, they are not a good fit for human cognitive skills," she says. Eventually, HP Labs research could help halve the 80% of IT budgets spent on hardware and software maintenance. Software-based Utility Data Center virtualization technology would create a single logical layer through which various hardware assets could be managed, while Federated Array of Bricks (FAB) storage technology uses brick components that can be added ad hoc, each with disks, a CPU controller, and Linux-based software. Statistical Learning, Inference, and Control provides system intelligence to identify abnormal behavior, and Smart Data Center technology allows for automated dynamic cooling that could cut thermal management costs by 70%. HP Labs is also developing a novel crossbar latch molecular computing technology that would augment silicon technology as it nears fundamental physical limits a decade from now. HP researchers want to build ultradense memories on top of regular CMOS chips, for example.

IBM's Watson Laboratory, Microsoft Research, Intel, and the Xerox Palo Alto Research Center (PARC) are all engaged in artificial intelligence research that could pave the way for computer systems capable of learning from their users as well as their surrounding environment. IBM Watson's neurocomputing project seeks to add complexity to the artificial neural network model by developing layered networks whose behavior is dictated by the biological properties of vertebrate nervous systems; the project involves figuring out how these multilayered networks can circumvent the problem of writing programs with advanced knowledge of all the unknown factors of each task they may be faced with, and a March paper demonstrated that such networks could avoid the superposition catastrophe problem through pattern recognition. Microsoft Research scientist Eric Horvitz is focusing on how computers can serve as a memory aid to people through projects such as MemoryLens, which employs software to analyze a user's activities and schedules to recognize important events that can later function as reference points to pinpoint information. Horvitz's team is also working on prototype "streaming intelligence" software for relaying AI rules to smart phones. Intel is exploring various applications for "machine learning" techniques, including a system that uses a "learning engine" to project what chips on a silicon wafer are of good quality, and user interfaces that can more proactively predict user requirements via statistical methods. PARC's user-interface group is developing software such as ScentHighlights, which helps users scan through information by highlighting key sentences in a document that relate to keywords entered by the user. PARC research fellow Stuart Card says the software illustrates an emergent type of interface that responds to things that capture a user's interest.

The dynamic nature of many business processes necessitates adaptable support systems, writes Delphi Group chief analyst Nathaniel Palmer, noting that business process management (BPM) software derives its optimal value from its ability to insert a layer between users and IT infrastructure to provide process adaptability without expensive coding and IT development. Next-stage BPM calls for the delivery of real-time reporting on process states and performance indicators for improved visibility and flexibility. The first phase of BPM's development was integration-centric and focused on event-driven processes, while the second phase marked a shift to orchestration and goal-oriented processes. "Orchestration-focused BPM frameworks support business performance by applying goals, policies, and rules, while adjusting process flow to accommodate outcomes that can't be predicted," writes Palmer. He characterizes the third phase of BPM development as a blending of integration and orchestration with real-time reporting to confirm and polish business users' comprehension of performance drivers on a continuous basis. The most recent BPM trend, driven by the transition to less rigid architectures and goal-driven process models, is to incorporate more advanced reporting capabilities into the BPM environment. Next-stage BPM systems ought to deliver key performance indicators oriented around metrics such as process throughput and application availability, and also facilitate end-to-end information flow, integration point, and performance analytics transparency. Palmer expects the increased utilization of business process execution language to ramp up the need for improved reporting.

For wireless broadband communications to equal wired communications in terms of reliability and coverage, three key innovations must take place: An increase in bandwidth, better modulation schemes and associated elements, and more advanced wireless network protocols. Short-range, low-bandwidth technologies that could enable broadband personal area networks (PANs) include radio-frequency identification (RFID), Zigbee, Bluetooth, and near-field communications, while ultra-wideband (UWB) promises to deliver broadband PAN enablement by combining higher bandwidth with low power. The architectural implementation of wireless local area networks (WLANs) is evolving, in which dense access point deployments allow WLANs with terrific coverage, capacity, price/performance and total cost of ownership to be established in nearly any domain. Multiple-input, multiple-output (MIMO) radio technology can further augment WLAN reliability, while proprietary fixed wireless broadband is expected to be supplanted by IEEE 802.16-2004-based WiMAX over the next 10 years. However, mobile WiMAX based on the in-development IEEE 802.16e standard should be wildly successful, given its appeal to mobile broadband users. Anticipation is building over a hybrid Wireless Wide Area Network (WWAN)/WLAN handset as the technology of choice for enterprise users, although such a breakthrough requires a fluent two-way handoff of both voice and data communications and the unflagging dedication of cellular operators. The advantages for cellular carriers include the ability to enhance their services' coverage and capacity with their WLAN networks, and the opportunity for achieving a new level of enterprise penetration and customer lock-in.