To raise user awareness of online scams designed to trick them into revealing sensitive information to data thieves and other miscreants, organizations such as the U.S. Military Academy are conducting exercises in which people are sent phony emails disguised as official requests to link to Web pages and enter confidential data, and then upbraided if they do so. Through this strategy, defenders hope to teach users to be more cognizant of "spear phishing" scams in which attackers craft email messages that seem to originate from the recipient's company or organization. Last June, over 500 West Point cadets were sent mock emails from a fictitious colonel instructing them to click on a link to confirm that their grades were correct, and more than 80 percent of recipients complied; the cadets were gently reprimanded via email and advised to be more cautious in the future. In recent months, almost 10,000 employees of New York state were sent emails that were supposedly official notices asking them to access sites and enter their passwords and other personal details, and those who did were sent a note explaining the purpose of the exercise. "The bottom line lesson was: Even if the request comes from legitimate individuals, never give out personal information," said New York CIO William Pelgrin. However, such methods could potentially erode employees' trust in their organizations' information-security personnel. Still, SANS Institute research director Alan Paller called such exercises "a key defense against large-scale theft of confidential information."

The bulk of the documents comprising the Web cannot be retrieved by most search engines because they reside in databases that are inaccessible to Web crawlers, but Glenbrook Networks claims to have developed a search engine that can successfully mine this "deep Web." Julia Komissarchik, who runs Glenbrook with her father Edward, says most of the information on the Web requires human intervention to be accessed, and Web pages stored in databases are a prime example: Many such pages are nonexistent until an individual fills out a form on a Web site and requests the information. Julia and Edward Komissarchik say their technology can analyze the forms on the Web pages and understand what kind of information the sites are looking for, after which Glenbrook's Web crawlers employ artificial intelligence to negotiate these forms, answering questions in the same way a person would. Glenbrook has built a search engine that can extract job listings from the databases of various Web sites, which the company claims is beyond the capability of most search engines. Maryland research librarian Gary Price, author of "The Invisible Web," says the concept underlying Glenbrook's technology has been around for some time, but Glenbrook could successfully market the data it has compiled. Edward Komissarchik believes his company could retrieve the numerous job listings stored in databases on corporate Web sites. Meanwhile, Yahoo! has forged an alliance with the New York Public Library, the Library of Congress, and other parties to index the content in their databases, and Google has augmented its search results with WorldCat, a bibliographic database that could previously only be accessed via libraries.

The energy bill that President Bush signed last week authorizes the Federal Energy Regulatory Commission (FERC) to establish mandatory standards to shield electric power systems from cyberattacks and other disturbances that could lead to instability or failures. FERC is required to set up an "electric reliability organization" to draft such standards, and Ellen Vancko of the North American Electric Reliability Council believes the commission will award that title to her organization. She says many power grid operators follow voluntary cybersecurity guidelines drafted by the council, which has already been designated the electric sector's infrastructure protection coordinator by the U.S. Energy Department. Three months ago, the Government Accountability Office (GAO) released a study pointing to officials' growing concern that systems controlling utility infrastructures are at risk of attack. The report partially attributed the control systems' vulnerability to their increased linkage to private networks that use the Internet in order to facilitate remote management. Clarence Morey with Internet Security Systems says the Internet was never a factor in the design of the computer systems currently used by utilities and public transportation facilities. "As companies connect these systems to the Net to allow remote access or drive efficiency, they're opening themselves up to risk," he warns.

The migration of robots from indoor to outdoor environments is spurring a global trend to develop machines with greater autonomy, capable of making sound decisions independently. Robotics is a critical element of ground warfare and other defense applications, where autonomy is essential. Frost & Sullivan analyst Amreetha Vijayakumar expects robotics to play a key role in reducing the need for humans to participate in high-risk activities such as detecting ordnance, while the U.S. military is eager to fund academic and corporate efforts that could enhance battlefield medical treatment with robotics. Fulfilling the need for a versatile, multifunctional robot requires the provision of a standardized platform to combine the various software modules. "Varied software components are available for synthesizing voice to make robots respond to vocal commands, or for processing the images captured by the robot's camera eye," notes Vijayakumar, who reports that the simultaneous function of these tools is blocked by the absence of a common platform. There are industry-wide projects to address this problem, while social issues are another obstacle. Vijayakumar says most workers are afraid of being phased out in favor of more efficient machines that require no compensation. Still, rising demand for miniaturization driven by advances in artificial intelligence, along with technological milestones such as neural networks and nanobots, will probably fuel the growth of robotics technology and ensure its place in everyday life.

University of Tokyo researchers led by Takao Someya have developed a thin plastic film embedded with sensors that could serve as a flexible electronic skin for robots that imparts human-like tactile sensations. Pressure- and temperature-sensitive transistors are arrayed in a matrix, and sensor readings are provided at wire intersection points. Fluctuations in pressure or temperature are signified by changes in current. "We really want to develop new technologies which make it possible to entirely cover the surface of robot bodies with e-skins," says Someya. He and his colleagues write in Proceedings of the National Academy of Sciences that future e-skin could be sensitive to light, humidity, ultrasound, and strain. University of Tokyo robotics expert Max Lungarella believes the proposed sensor's size can be shrunk even further, although he is concerned that the e-skin might be sensitive to electromagnetic disturbance.
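As a rough illustration of how a matrix-addressed sensor sheet of this kind is read (a sketch of the general row/column scanning technique, not the Tokyo group's actual design), one row line is activated at a time and each column line then carries the current through the transistor at that row/column intersection:

```python
def scan_matrix(read_cell, rows, cols):
    """Scan a rows x cols sensor sheet and return one frame of readings.

    read_cell(r, c) stands in for the hardware measurement of current
    at the intersection of row wire r and column wire c.
    """
    frame = []
    for r in range(rows):
        # Row r is active; read every column intersection on it.
        frame.append([read_cell(r, c) for c in range(cols)])
    return frame

def find_touches(frame, threshold):
    """Report intersections whose reading exceeds a pressure threshold."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value > threshold]

# Simulated 4x4 sheet with pressure applied near one corner.
pressure = {(0, 0): 0.9, (0, 1): 0.6}
frame = scan_matrix(lambda r, c: pressure.get((r, c), 0.05), 4, 4)
print(find_touches(frame, 0.5))  # -> [(0, 0), (0, 1)]
```

The same scan loop works for the temperature transistors; only the per-cell measurement changes.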

Many issues that have cropped up in the wake of Internet search's explosive growth are a focus of academic research, and the University of California at Berkeley is launching an interdisciplinary facility for advanced search technologies. The center will involve the participation of approximately 20 faculty members from various departments to concentrate primarily on privacy, fraud, personalization, and multimedia search. "If you have 20 researchers interested in search, then getting them together where they are cross-fertilizing ideas, you make something bigger than its parts," said center director and UC Berkeley computer science professor Robert Wilensky. Trust and privacy will be a principal research area at the center, which will use Berkeley's Wireless Research Center as an operational template. The center's faculty will include UC Berkeley Department of Electrical Engineering Chair Jitendra Malik and computer science professor David Forsyth, both of whom are conducting research into computer vision. Wilensky said the center will be opened in early 2006, adding that he is courting Google and other major search companies as possible participants. Other centers of search technology research include Carnegie Mellon University and Stanford. Researchers at CMU's Language Technologies Institute have created an add-on application that enables users to maintain and modify personal data within a search profile so that the search engine can query the profile as well as the user's search term to generate personalized search results while shielding sensitive information; among Stanford's notable search technology projects is professor Andrew Ng's work on artificial-intelligence methods for mining text in a search index for knowledge.

University of Missouri-Rolla professors Ray Luechtefeld and Steve Watkins have developed a virtual facilitator that could improve team performance by encouraging the sharing of ideas and adherence to an agenda. Luechtefeld says a trial involving 100 student teams demonstrated significant performance improvements via exposure to interventions that expedited more effective information sharing and faster problem-solving. Luechtefeld sees a connection between facilitation and therapy, noting that "We help people to express ideas clearly, identify barriers, and understand the situation they're facing." He reasons that the virtual facilitator could currently be employed in chatrooms, e-teams, and educational scenarios, and also be used to train people to interact by assessing their reactions to a specific simulation against those of a master facilitator. Luechtefeld and Watkins think a grant of $450,000 would allow them to tailor the facilitator for particular applications within a year. Potential application areas Luechtefeld envisions include negotiation, team learning, conflict resolution, motivation, leadership skills, and change management. "As speech recognition capabilities mature and people become increasingly connected through cell phones and the Web, this technology will enable people to get help with vexing 'people problems' anytime, anywhere," Luechtefeld explains. The UMR professors also foresee their software finding use in far-flung venues such as Iraq, where flesh-and-blood facilitators fear to tread.

As an alternative to the cable and phone companies, the established network of electrical wires, which reaches virtually every home in the United States, may be the vehicle to achieve universal Internet connectivity. IBM, Google, and other corporate backers have been leading trial projects testing the viability of Broadband over Power Lines (BPL) in communities such as Manassas, Va., and Cincinnati. "Our hope is that in the next two years you'll see millions of homes" gaining access to the Internet through BPL, said Kevin Kushman of CURRENT Communications Group. Although power companies are typically slow to embrace modifications to their business model, the capabilities BPL offers for utilities to read meters remotely, quickly pinpoint the location of outages, and conduct preventive maintenance on power grids are very appealing. Rural communities will be the principal beneficiaries of BPL implementation, as in urban centers established phone and cable services already have a strong foothold, though extremely isolated individual residences may still not receive service, as Internet signals degrade if run over long distances without getting a boost along the way. BPL remains largely a trial balloon, as it faces many obstacles to universal adoption, among them alternative technologies such as fiber optics and WiMAX, as well as opposition from ham radio operators who maintain that the added signal in power lines disrupts their transmissions.

Email began innocently enough as a simple tool to send text messages between networked computers; now, like so much else about the Internet, its use has become corrupted by hackers and scammers who seek to use email for their own profit through schemes to steal unsuspecting users' identities. Though the industry is consistent in its view that email needs help, many believe that an email authentication system would be costly, time-consuming, and ultimately ineffective. Among proponents of authentication, two systems have emerged: Yahoo! is backing DomainKeys, a method that requires senders to go through a verification process before the ISP or email gateway service authenticates the sender's identity; like DomainKeys, the Sender ID Framework (SIDF) requires a two-step verification process that obliges senders to register for a list that confirms their IP address. DomainKeys holds the distinct advantage of employing encryption, though its implementation is a lengthier process, which has accounted for some of SIDF's popularity. Both systems rely on educated recipients making informed decisions about which messages to open. Detractors believe users have a poor understanding of the scope of authentication services, and fear that verified senders could still abuse the system, undermining the benefit to e-commerce. Authentication systems develop a reputation score, which helps filter out spam, though their effectiveness will hinge on a user's ability to interpret the score. Cloudmark scientist Vipul Ved Prakash is concerned about the threat zombie computers pose to authentication: a computer under the control of a remote attacker manipulating it to send out spam could damage the legitimate user's reputation score. Others, however, believe ISPs would detect a large volume of email originating from an individual subscriber and deal with it accordingly.
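The IP-registration idea behind SIDF-style checks can be sketched as follows. In real deployments the authorized-sender records are published in DNS; the `records` dict below is a self-contained stand-in for that lookup, and the domain and addresses are illustrative only.

```python
# Stand-in for DNS-published sender records: domain -> authorized IPs.
AUTHORIZED = {
    "example.com": {"192.0.2.10", "192.0.2.11"},
}

def check_sender(claimed_domain, connecting_ip, records=AUTHORIZED):
    """Verify that the connecting IP is registered for the claimed domain.

    Returns 'pass' if the IP is on the domain's list, 'fail' if it is
    not (a possible forgery), and 'none' if the domain publishes no
    record at all.
    """
    ips = records.get(claimed_domain)
    if ips is None:
        return "none"
    return "pass" if connecting_ip in ips else "fail"

print(check_sender("example.com", "192.0.2.10"))   # -> pass
print(check_sender("example.com", "203.0.113.5"))  # -> fail
```

DomainKeys differs in that the sender cryptographically signs each message and the receiver verifies the signature against a published public key, rather than matching IP addresses.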

The U.S. State Department recently announced that electronic passports will start being issued in December. Each passport will be outfitted with a chip in its cover: The chip will contain all the information about the bearer held in current paper passports, along with a digital signature and digital photo; the former will be used to shield the stored data from doctoring and reduce the likelihood of photo substitutions, while the latter will permit biometric comparisons via facial-recognition technology. The passport cover will also be equipped with technology designed to thwart unauthorized reading, while technology to prevent access to data until the document is opened and read electronically is also being considered. The State Department said the e-passports will augment border security as well as streamline and further secure Americans' identification for international travel. The passports will be issued exclusively through the State Department at first, while domestic passport agencies will be able to issue them by next October. Electronic Frontier Foundation staff attorney Lee Tien is concerned that e-passports may not be sufficiently protected from unauthorized data theft, an argument the EFF and other privacy groups raised in comments sent to the State Department earlier this year. "Given that they do seem to be going forward, they need to study and implement better privacy protection," Tien remarked.
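The tamper-detection role of the passport's digital signature can be shown with a toy example. Real e-passports use public-key signatures issued by the state's signing authority; to keep this sketch self-contained, a keyed hash (HMAC) stands in for that signature, and the key and record fields are invented for illustration.

```python
import hashlib
import hmac
import json

# Stand-in for the issuing authority's signing key (illustrative only).
ISSUER_KEY = b"issuing-authority-secret"

def sign_data(holder_data):
    """Produce a signature over the canonicalized holder data."""
    payload = json.dumps(holder_data, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()

def verify(holder_data, signature):
    """A border reader recomputes the signature; any edit to the
    stored data makes verification fail."""
    return hmac.compare_digest(sign_data(holder_data), signature)

record = {"name": "JANE DOE", "passport_no": "123456789"}
sig = sign_data(record)
print(verify(record, sig))       # -> True
record["name"] = "JOHN DOE"      # doctored data
print(verify(record, sig))       # -> False
```

With a true public-key signature, any reader holding the authority's public key can perform this check without possessing the signing secret.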

Don Becker, founder and chief scientist of Scyld Software, recently expressed his beliefs that Linux needs improvement at the operating system level, that grids are still only marginally accepted, and that small and large companies need to move toward an embrace of clustering software. Scyld is a Linux clustering vendor and a subsidiary of Penguin Computing, a vendor of Linux workstations and servers. Becker is quick to point out the progress Linux has made, though he maintains that there are still flaws in the system, citing storage and file systems as two important areas for improvement. Grid implementation is a difficult process that Becker likens to the adoption of a new network protocol, adding that the industry has been unable thus far to agree on coherent definitions of grid and clustering. Once clustering becomes pervasive, each machine will be able to initiate clustering and scale up, and utility computing will interconnect the clusters. Scyld has traditionally centered its efforts on high performance computing, though Becker is hopeful that the company will diversify its focus to improve its logging system, and develop lights-out diagnostics and a defined boot system to automatically reconfigure a more expansive array of hardware. Becker says there is room for a limited number of open source licenses, noting that the General Public License so far works well enough, though the line between commercial and non-commercial distribution is as yet far from clear.

A National Science Foundation report issued on Aug. 10 concludes that graduate enrollment in science and engineering at U.S. colleges rose 4 percent overall between fall 2002 and fall 2003, although growth in foreign student enrollments slowed markedly. Whereas annual foreign student enrollments increased by more than 10,000 in each of the previous three years, the increase in 2003 was fewer than 1,300. The study analyzed data from 12,000 academic departments at roughly 600 U.S. institutions, and found increased graduate enrollment in all major fields and subfields--except for computer science, which experienced a 3 percent decline in enrollment from 2002 to 2003. Foreign graduate science and engineering students accounted for 31 percent of all graduate students in those programs in fall 2003, down 1 percentage point from 2002. Meanwhile, the growth in full-time enrollment of U.S. citizens and permanent residents from 2002 to 2003 overtook that of foreign students for the first time in 9 years. There was an 8 percent fall-off in first-time, full-time enrollment of foreign students in 2003, mostly among men, which would seem to indicate lingering fallout from the Sept. 11 attacks and the subsequent institution of more restrictive U.S. visa regulations. The percentage of full-time students climbed from 67 percent of total enrollment in 1993 to 72 percent in 2003. The number of female science and engineering graduate students rose 5 percent from 2002 to 2003, while Asian and Pacific Islander students increased 11 percent, Native American and Hispanic students 8 percent, and African American students 6 percent.

As the perception of careers in IT has been tarnished by the dot-com collapse and widespread offshoring of jobs to India and China, many IT positions at home may go unfilled. The approaching retirement of baby boomers and declining enrollment in IT worry Phil Zweig of the Society for Information Management (SIM), which is studying the future of the IT industry. A Gartner study projects that the IT departments at midsize and large companies will be at least one-third smaller by 2010 than they were in 2000, and that by that same year, 10 percent to 15 percent of IT professionals will drop out of the industry, leaving an uncertain fate for that sector of the economy. Amid dubious forecasts of the future of IT, fewer students have been pursuing math and science; the number of undergraduates declaring computer science as a major dropped 39 percent in the four years following 2000, leading many industry watchers to speculate on the detrimental effect this trend may have on U.S. technological innovation. Gartner's Diane Berry says parents are another factor deterring kids from pursuing IT, but some are unfazed by the decline in IT enrollments; for example, Partners HealthCare System CIO John Glaser notes that many of his technical workers acquired their skills at community colleges or through on-the-job training. Additionally, previous attempts to forecast labor markets have been fraught with error. To proactively address a shortage of talent, companies are advised to inventory the skills they anticipate needing in the next five years and turn their focus in that direction. More globally, Zweig emphasizes the importance of reaching out to schools to overhaul the perception of IT careers, particularly at the secondary level and earlier, citing the outreach efforts of the more than 30 chapters of SIM in the country.

The National Science Foundation announced on Aug. 15 that it would pledge $7.5 million over five years to establish A Center for Correct, Usable, Reliable, Auditable, and Transparent Elections (ACCURATE), a facility for a multidisciplinary effort to make e-voting technology more trustworthy and reliable. The percentage of American voters casting ballots with e-voting systems, which stood at 13 percent in 2000, is expected to increase significantly in 2008, despite reliability and trustworthiness concerns regularly cited by experts, computer scientists, and legal scholars. Johns Hopkins University computer science professor Avi Rubin has been tapped as ACCURATE's director, and his university is expected to receive roughly $1.2 million from the NSF grant to fund his research into voting technology and for ACCURATE's administration. "The basic question is how can we employ computer systems as trustworthy election systems when we know computers are not totally reliable, totally secure, or bug-free," says ACCURATE associate director and Rice University computer science professor Dan Wallach. ACCURATE will also involve the participation of researchers from SRI International, the University of Iowa, Stanford University, and the University of California, Berkeley. Some ACCURATE participants will focus on e-voting hardware and programming, and study methods for safeguarding against election tampering. Others will concentrate on legal and policy issues, and questions about human behavior as it pertains to the e-voting switchover. The ACCURATE team's findings will be publicly disclosed and used to help create technical standards and proposals for simple-to-use e-voting systems where tampering can be easily detected.

Boryung Ju of Louisiana State University's School of Library and Information Science and Myke Gluck of the Virginia Military Institute's department of math and computer science propose a reorganized software menu interface based on users' direct input. Their study yields important insights into how user-process modeling can shape an information system. The authors employed the Augmented Seriation software package to obtain user input that led to the interface menu's reorganization: The proposed menu delivers a timeline of use for software features that more closely aligns with their order of use, resulting in an interface that reads left to right and then down, and with main menu items revamped to represent the user's view of process with a beginning and end. Submenu items have also been renamed and regrouped so that menu items are more reflective of users' work. Usability testing indicates no significant differences in user performance (task completion time and accuracy) or user satisfaction between the proposed interface and the traditional interface. The study also shows that usable and effective menu organization depends more on the types of tasks and on users' domain knowledge than on the menu arrangement itself. Furthermore, test subjects favored the proposed menu over the traditional menu as far as labels, submenu groupings, and presentation order of menu items were concerned. The failure of subjects who used the traditional menu interface to outperform those who used the proposed interface leads Ju and Gluck to conclude that the latter interface is a viable and effective alternative.

Commercial and open-source IDEs are emerging as the preferred programming environment for embedded programmers and enterprise codejockeys, two software development groups that are often at odds. Borland Software's Raj Seghal reports that "now [programmers] typically have more resources, in some cases substantially more [resources], and so we're seeing much more sophisticated [embedded solutions]." Industry observers say embedded software development occurring in enterprise environments has traditionally been isolated from conventional applications development, but the increasing pervasiveness of mobile and wireless devices is changing that perspective. Seghal reports that some enterprises are developing applications that combine conventional and embedded components. Organizations would like their developers to be able to access identical versions of source code and guarantee consistency between developers operating in the embedded and conventional domains, but some embedded programmers are concerned that such a model could lead to problems--resource leaks, memory fragmentation, and so on--with possible life-or-death implications. The majority of embedded experts would concur that the best conventional software development practices can be applied to embedded software development, yet Gibson Audio's Matthew Hamrick contends that codejockeys who operate in conventional applications environments frequently ignore many best practices. He reports that management is failing to assess tools based on their suitability for specific tasks, while developers' efficiency is not being measured across changes in methodology or the tool chain, and the need for realistic, measurable interim milestones is being disregarded.