Computer-science majors are finding that technical skills alone will not ensure steady, well-paying work as advances in the Internet and low-cost computing accelerate the offshoring of tech jobs to lower-wage countries. They are therefore boosting their chances of landing secure and challenging work by acquiring skills in other areas. The latest IT students on campus usually need to combine their computing knowledge with proficiency in one or more other fields, and are often more concerned with technology's applications in those fields than with the technology itself. Concurrent with this trend is an expanding job market for people who can customize IT for specific companies or industries. Students majoring in fields other than computing are likewise finding that a certain familiarity with computing or programming is necessary for their chosen careers. University of Washington professor Edward Lazowska says computer science's real benefit is giving students the ability to sift through and evaluate information, think analytically in terms of algorithms, handle complexity, and model and abstract. Several universities are devising multidisciplinary courses that blend research in computing, management, the social sciences, and engineering in order to equip students with the tools they need to be productive, innovative, and insightful members of the fast-growing services industry.
Click Here to View Full Article (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

Speaking at the Progress and Freedom Foundation Aspen Summit, Sun Microsystems President Jonathan Schwartz revealed the Open Media Commons, an open-source project to develop a royalty-free standard for digital rights management (DRM). He cited explosive growth in demand for new network services, warning that a monolithic DRM system would impede progress. Schwartz urged cross-industry cooperation to devise an open standard for the unrestricted creation, duplication, and distribution of digital content, arguing that Sun's effort aims to uphold both fair compensation and fair use. "The Open Media Commons is committed to creating an open network growth engine, all the while continuing to protect intellectual property in a manner that respects customer privacy, honors honest uses of media, and encourages participation and innovation," he declared. Sun will immediately give the open-source community full and unconstrained access to its Sun Labs program Project DReaM (DRM/everywhere available) under the Common Development and Distribution License sanctioned by the Open Source Initiative. Analyst Phil Leigh expects an open DRM standard to eventually prevail, though competition will not end until a clear winner emerges. He doubts that Apple or Microsoft will agree to share anything; Sun, for its part, is working toward a definitive standard that would relieve media companies, device manufacturers, and consumers of their frustration at the absence of one.

University of Iowa computer science professor Douglas Jones' five-year, $800,000 National Science Foundation grant to study the use of electronic voting systems is seen by his peers as an acknowledgement of his accomplishments in the area of secure elections and voting systems. Jones first noticed security vulnerabilities in computer-based systems as chairman of the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems in the early 1990s; in one case, a machine unintentionally disclosed the vote of the previous user. Jones says Iowa's decision to forgo punch cards was a positive step and highly rates the state's polling place administration procedures, but says voting fraud security and absentee ballot counting are areas in need of improvement. Jones' NSF grant is part of the foundation's $7.5 million A Center for Correct, Usable, Reliable, Auditable, and Transparent Elections (ACCURATE) project. The UI professor will collaborate with nine other top specialists experienced in e-voting, computer security, human-computer interaction, and the public policy issues associated with the use of computers in elections. Jones says developing an easy-to-use security application that does not complicate voting is a challenge.

In a recent interview on the current state and future of technology, Carnegie Mellon computer science professor Brad Myers immediately identified the problem of information overload, noting that the proliferation of informative Web sites and email, as well as the growing nuisance of spam, have made it difficult to digest all of the information available. With cell phones having emerged as the largest consumer electronics market in the world, Myers believes a central challenge facing technologists will be to effectively harness the capacity of cell phones without making them too complicated, which could be achieved with the aid of external devices in the environment, such as a television. The difficulty that many people have using computers is due partially to the relatively little attention computer makers have paid to human-computer interaction (HCI), which in turn can be attributed to the general preference among consumers for cheaper, rather than more usable, electronics. Myers is the principal investigator for CMU's Pebbles research project, exploring the capabilities of handhelds when interacting with PCs, other handhelds, and appliances such as radios, telephones, and automobiles. Meanwhile, CMU's RADAR project seeks to create a cognitive personal assistant out of software to help busy managers and military personnel spend less time tending to routine tasks; Myers hails the project as one of the first "to involve significant collaboration between AI researchers focusing on making the system learn about the user, and HCI researchers focusing on how to make intelligent assistance useful" in a practical setting. In the future, Myers foresees the widespread implementation of voice and intelligent interfaces. Management will also be transformed: future computers will be able to perform so many tasks that each person will effectively have to manage these applications as if they were human workers.

IBM has entered into a partnership with the Share user group to offer support for IT students eyeing mainframe computing as a career through the zNextGen project. The potential market for mainframe-savvy people is growing as mainframe experts approach retirement age, while companies in China, Eastern Europe, and elsewhere are in the market for skilled personnel as they increase their mainframe investments to boost their computing power. IBM and Share issued a joint statement on Aug. 22 in which they promised to give students access to contacts and resources to help them land mainframe computing jobs following graduation, as well as assist young people already employed in big iron with job networking. zNextGen participants will be able to take advantage of mentoring and internship opportunities from both Share and IBM, and access Share's global membership of Fortune 500, government, and academic employees. Meanwhile, IBM is planning to launch a mainframe contest for American and Canadian students, in which participants will receive accounts on a z/OS hub and be asked to solve a series of problems. For the past several years, IBM has used its Academic Initiative Program for the Mainframe in an attempt to address the underrepresentation of mainframe computing in the educational sector. The company has thus far set up zSeries mainframe courses in 150 universities worldwide, and its ultimate goal is to produce 20,000 mainframe-proficient IT workers within a five-year timeframe.

Just as Great Britain witnessed a chill in government interest in science after World War II, the United States appears to be in a similar state of decline, writes Harold Evans. Writing in the Wall Street Journal, Vinton Cerf, the former Defense Department scientist who is often credited as one of the fathers of the Internet, bemoans the current administration's lack of support for science, decrying the slashing of research funds in the 2006 budget as a formula for "irrelevance and decline." John Marburger, science advisor to the president, admits that next year's allotment is "pretty close to flat," though pointing to an additional $733 million in funding for research and development. However, for the first time in a decade inflation has outpaced federal funding, and in real terms, money for basic research and computer science has eroded; currently, federal funding for university research is roughly half of what it was in 2001. Comparatively, the United States ranks sixth in the world in the proportion of its budget spent on research and development. Although the declining U.S. share of the world's Nobel Laureates and research publications indicates the emergence of significant global competition, the Bush administration has held steady in its disdain for scientific inquiry, Evans argues. As more Americans pursue law and medicine, and visa restrictions curtail the enrollment of foreign students, the prospects for the future of U.S. scientific activity darken. Though the Bush administration claims its scientific decisions, on matters ranging from oil exploration to stem cells, are based on the best available evidence, 7,600 scientists, including 49 Nobel Laureates, have signed a petition supporting the Union of Concerned Scientists' campaign to reestablish science as a major policy consideration.

Thanks to an influx of new technologies and increased collaboration among biologists, computer scientists, and chemists, biomimetics, or the imitation of biological processes, is shrinking down to the micro and nano levels, ushering in a host of new applications. The biomimetics formula turns to nature to copy an idea and then seeks to replicate the process in a lab, but George Whitesides takes a different approach in his lab at the department of chemistry and chemical biology at Harvard: He uses biomimetics as a starting point, and searches for a practical application from there. His lab's work imitating the automatic processes of enzyme formation, the grouping of lipids, and the joining of two DNA strands together could see applications in electronics, computers, and other devices powered by wires or chips. The self-healing element of natural processes is most appealing, as "eventually, electrical components will be so small you can't manipulate them, so it would be nice if it could manipulate itself," said Derek Bruzewicz, a researcher in Whitesides' lab. Research imitating the shape of the spine could be adapted to make stronger wires in electronics, where they could turn corners and eventually even heal themselves. Michele Rucci, an assistant professor in the department of cognitive and neural systems at Boston University, is using a robot named Mr. T to simulate human cognition in order to gain insight into the functioning of the human brain, with the ultimate goal of building a more capable robot. Rucci believes his work will eventually inspire systems that more closely imitate human cognition to be able to recognize and analyze components of an image, which could lead to better treatments for eye problems. Mr. T goes beyond other robots in its imitation of the complex human coordination among the brain, eyes, and hands.

Hacker turf wars sparked by the increasing strategic and monetary value of compromised computers have usually simmered out of the public eye, but such skirmishes were in plain view last week when the Zotob worm infected computers at a major airport, media outlets, and industrial companies, and prompted an all-out battle between competing malware. Zotob appeared a mere six days after Microsoft announced a patch for the security flaw the worm was crafted to take advantage of, and Curtis Franklin Jr. of Secure Enterprise Magazine reports that the average time between the disclosure of a vulnerability and the release of an exploit has shrunk from 21 days to eight days in the last 24 months. Experts say this shorter timeframe can be partially explained by the apparent use of prewritten program "shells" by malware authors, while the patching process can be held up by negotiations between corporate network managers and other parts of the corporation. "Zero-day exploits," in which malware appears on the same day a flaw is announced, are generating the most concern, and Franklin says the Zotob turf war illustrates a convergence among the various forms of malware in terms of function. Intelguardians Network Intelligence security consultant Tom Liston says hacker turf wars have increased significantly over the last three years. University of California, Los Angeles professor Peter Reiher adds that such battles used to be primarily over bragging rights, whereas today they indicate a greater interest in controlling infected systems.

The content management system Mambo could be headed for a fissure, as developers of the open source application are dissenting from the Mambo Foundation and Miro International, which oversee the project. The split comes as the two sides vie for control of the project's direction, and the developers claim that the foundation is trying to shut them out. Both sides have vowed to continue work on Mambo, with the foundation expecting to hire new programmers, and the developers claiming they will perform the same work under a different name. The foundation and Miro level the same accusation the developers do--that the other side is interested only in controlling the project--though Miro has softened its position slightly, loosening its grip on Mambo's intellectual property. Miro's Justina Phoon is optimistic that a new development team will bring a fresh approach, noting that the idea for the foundation was initiated by developers Andrew Eddie and Brian Teeman, and that Miro elected to support the foundation even though it suspected that the developers were after exclusive control of Mambo. "We felt once the project was adequately protected, we could involve the core team and let them decide if they were truly interested in the Mambo project, or simply controlling the license," Phoon adds. For his part, Mambo developer Emir Sakic responds to the question of a fork in the code with the contention that the programmers will carry on the Mambo project in its pure form, with the name being the only change. The reunion of two competing strains of code after a fork is not without precedent, as was the case when rival factions of the GCC code reunited in 1999.

The FCC's recent proposal to expand the Communications Assistance for Law Enforcement Act (CALEA) to certain broadband and voice-over-IP (VoIP) providers has sparked protest from civil rights advocates as well as allied broadband suppliers and Internet associations concerned that such a maneuver would hurt industry innovation and jeopardize national security. Staff counsel for the Center for Democracy and Technology (CDT) John Morris believes the move will impede innovation on the Internet and drive innovation further overseas, because it may force entrepreneurs to retain legal counsel and negotiate a quagmire of FBI bureaucracy to obtain approval for new devices prior to commercialization. Sun Microsystems security specialist Susan Landau says CALEA's application to VoIP authorizes government officials to design wiretapping standards directly into the Internet protocols, creating a vulnerability that threatens national security because it puts wiretapping capabilities in the hands of abusers as well as federal law enforcement officials. Some critics view the CALEA expansion as illustrative of how law enforcement misunderstands broadband, pointing out that VoIP, unlike traditional telephone networks, is portable and disconnected from underlying networks. "I would hope that there's no expectation that you're going to be able to route all voice communications through central control points in order to accommodate the potential need for wiretapping," says ITAA counsel Mark Uncapher. He and other VoIP supporters believe the focus on CALEA draws attention away from VoIP's digital storage potential, which could be much more advantageous to law enforcement as a tool for enhancing surveillance.

The U.S. Government Accountability Office (GAO) released a study earlier this year concluding that the nation is ill-prepared to deal with threats to its critical Internet infrastructure, and increasing numbers of computer-security experts believe a catastrophic cyber-infrastructure failure is probable. The GAO report cited the Department of Homeland Security's failure to develop national threat and vulnerability assessments concentrating on federal and private-sector contingency recovery plans for cyber-security and plans for restoring core Internet functionality should it be crippled in a major cyber-attack. The study identified seven serious threats to Internet infrastructure and national security, among them: "Bot" networks controlled by hackers looking to enhance their reputation or challenge themselves; profit-driven criminal organizations using malware such as phishing and spam; foreign intelligence services for countries that want to enable themselves for information warfare; disgruntled corporate insiders with intimate knowledge of their companies' networks; individuals and small groups that steal identities via phishing; spammers that use phishing schemes; and terrorists who want to exploit, cripple, or destroy vital infrastructure and cause extensive economic and/or physical damage. Solutionary's Mark Rasch believes the first step to protecting Internet infrastructure is to fortify the computers connected to it and train users to avoid exposing weaknesses. Better-protected hardware and software means the infrastructure will have fewer security threats to contend with. Shoring up the infrastructure itself involves incorporating better redundancy and sharing threat information and best practices with ISPs. Rasch also recommends implementing an IP address-based prioritization framework to be used in the event of emergency.

The recently released National Plan for Research and Development in Support of Critical Infrastructure Protection creates nine sector-wide themes that include both physical and cybersecurity concerns. The nine themes, which will be integrated with other national security strategies and focused on 17 critical infrastructure sectors, include detection and sensor systems; protection and prevention; entry and access portals; insider threats; analysis and decision support systems; response, recovery and reconstitution; emerging threats; advanced architectures and system design; and human and social issues. The plan is a departure from past plans in that it combines the 17 sectors rather than focusing on security plans for each on an individual basis. According to the plan, "The use of a sector-based plan for examining operational issues is not appropriate for R&D, as it tends to create artificial repetition and loss of opportunity for integration." The plan focuses on assessing linkages between sectors and incorporating vulnerability reduction efforts into future system designs. Eight major R&D priorities are identified by the plan, including improvements to sensors, risk modeling and analysis systems, secure architectures, human-technology interfaces, physical security and cybersecurity systems, and large-scale situational awareness systems. The plan also outlines three long-term goals: A national operating architecture for critical infrastructure; computing and communications networks with security as part of the original design; and a physical and cyber infrastructure that is strong, self-diagnosing, and self-healing.

Synapse is an effort to create an open-source implementation of a Web services broker/enterprise service bus (ESB) as an alternative to existing commercial ESB/broker/gateways founded on proprietary protocols. The project is being incubated by the Apache Software Foundation with support from Blue Titan, Iona, WSO2, Sonic Software, and Infravio. WSO2 Chairman Sanjiva Weerawarana says the Synapse initiative is expected to yield components that will interoperate with other Apache and open-source projects. Sonic Software's Dave Chappell says Synapse is not an ESB in itself, but rather "a mediation framework that allows users to get in the middle between service requesters and providers and perform various tasks--including transformation and routing--and that helps to promote loose coupling between services." Weerawarana says an ESB, a mediation environment, or a means for policy-based management could be built from this framework. Chappell insists that ESB will not be commoditized through Synapse, which will supply a unified framework that both vendors and open-source developers can agree on. He says Synapse will aid the promotion of interoperability by helping move convergence for common elements forward.
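The mediation pattern Chappell describes--handlers sitting between service requesters and providers, performing transformation and routing--can be illustrated with a minimal sketch. All class and function names here are hypothetical, chosen for illustration only; they are not Synapse's actual API.

```python
# Toy mediation pipeline: each handler transforms the message in turn,
# and the final routing key decides which provider receives it.

class Mediator:
    def __init__(self):
        self.handlers = []

    def add(self, handler):
        self.handlers.append(handler)
        return self  # allow chained .add() calls

    def dispatch(self, message, providers):
        for handler in self.handlers:
            message = handler(message)
        # route on a key the handlers may have set or rewritten
        return providers[message["route"]](message)

def to_upper(msg):       # a transformation step
    return {**msg, "body": msg["body"].upper()}

def route_by_size(msg):  # a routing step
    return {**msg, "route": "bulk" if len(msg["body"]) > 10 else "fast"}

providers = {"fast": lambda m: ("fast", m["body"]),
             "bulk": lambda m: ("bulk", m["body"])}

bus = Mediator().add(to_upper).add(route_by_size)
```

Calling `bus.dispatch({"body": "hi"}, providers)` runs the message through both handlers and delivers it to the "fast" provider; the requester and providers remain unaware of each other, which is the loose coupling the article attributes to the framework.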

The IT industry's resurgence has been slow in coming partly due to offshoring and productivity trends, as well as the IT job market's precipitous decline following the dot-com implosion. But fewer college students enrolling in computer-science courses means that companies may face a shortage of IT talent as the job market recovers. The same outsourcing trends and dot-com bust that impeded the industry's comeback are partly responsible for the enrollment downturn, as the lack of jobs discouraged students from entering the field. Stony Brook University dean of engineering and applied sciences Yacov Shamash expects continued enrollment declines and domestic staff shortages to encourage even more IT outsourcing. Director of the City University of New York's computer science doctorate program Ted Brown stresses the need for adaptive IT workers, and he plans to introduce finance and business coursework into CUNY's tech and computer-related curriculums to meet employers' staffing requirements. Shamash, however, argues that training students in a specific application or industry limits their options. Regardless of the validity of either argument, more specialized skills are a must for IT professionals today. Edward Korth of Brentwood says job-hopping has become a viable option thanks to an improving IT job market in which specific skills, and workers who require little training, are highly prized.

A gesture interface is being developed by Georgia Institute of Technology researchers as a tool for communication between the hearing and the hearing-impaired. The Georgia Tech team is collaborating with cognitive scientists at the University of Rochester and engineers at George Washington University to create TeleSign, a one-way American Sign Language-to-English phrasebook that combines a video camera, wrist-mounted accelerometers, and machine learning. The camera watches the area in front of the signer's chest, where hand gestures are usually made. The user turns the system on by clicking a button on the wristband, and turns it off after signing by clicking the button again; TeleSign then searches its database for the closest probable English-language matches using hidden Markov models. The two or three most likely matching phrases are displayed on a portable device, and the user may either choose to re-sign or select a phrase from the list. Georgia Tech researcher Thad Starner says TeleSign "limits the vocabulary to a few phrases--currently about 20--that are sufficient for a variety of situations and are most likely to elicit responses like a nod or a point in a particular direction." Starner's team attempted to streamline the system by working with GWU's Jose Hernandez-Rebollar on a glove interface designed to identify signs by recognizing beginning and end hand positions as well as the movements in between.
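The matching step described above--scoring a gesture sequence against a hidden Markov model per phrase and displaying the top candidates--can be sketched as follows. This is a generic discrete-observation HMM ranking, not TeleSign's actual implementation; the toy model parameters and phrase names are invented for illustration.

```python
import math

def _logsumexp(xs):
    """Numerically stable log of a sum of exponentials."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under one HMM,
    via the forward algorithm in log space.
    start[s]: initial state probs; trans[p][s]: transition probs;
    emit[s][o]: probability state s emits symbol o."""
    states = range(len(start))
    alpha = [math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states]
    for o in obs[1:]:
        alpha = [math.log(emit[s][o]) +
                 _logsumexp([alpha[p] + math.log(trans[p][s]) for p in states])
                 for s in states]
    return _logsumexp(alpha)

def top_matches(obs, phrase_models, k=3):
    """Rank candidate phrases by HMM log-likelihood and return the k best,
    analogous to TeleSign offering its two or three most likely phrases."""
    scored = [(forward_log_likelihood(obs, *model), phrase)
              for phrase, model in phrase_models.items()]
    scored.sort(reverse=True)
    return [phrase for _, phrase in scored[:k]]
```

With a small dictionary of per-phrase models (about 20, in TeleSign's case), `top_matches` returns a short candidate list for the signer to confirm or reject, which is cheaper and more robust than attempting unconstrained sign recognition.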

Just as the Internet has had a major impact on everyday life over the last 10 years, the next 10 are likely to be equally transformative. Internet2, currently the exclusive province of research and educational institutions, offers unparalleled speed; though Internet2 CEO Doug Van Houweling believes that its speed will soon trickle down to everyday users, the network itself will remain strictly a research tool. A recent Pew Internet Project survey of technology experts and social analysts found that 66 percent predict a crippling attack on the Internet or the U.S. power grid in the next 10 years, while 59 percent gloomily envision greater government surveillance. On the other hand, Internet Protocol Version 6 (IPv6) will vastly expand the number of IP addresses available, from 2^32 under IPv4 to 2^128, providing the capability for everyone in the world to have an individual address. Survey respondents also predict that the media and publishing and health care industries will undergo significant transformations over the next 10 years, with initiatives such as the digitization of medical records already underway. Technology will also extend to new frontiers, as evidenced by MIT's creation of the $100 Linux-based laptop and AMD's project to bring Internet access to 50 percent of the world's population by 2015. To combat mounting security threats, Van Houweling believes new authentication measures will emerge, such as Internet2's Shibboleth software, which guards against intruders by creating closely knit, trusted communities where the presence of an intruder is easily detected. Regardless of the scope of the coming changes, the mode of thinking that casts the Internet as a virtual, non-physical adjunct to our world is fast becoming archaic.
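The scale of the IPv6 expansion is easy to check with back-of-the-envelope arithmetic: IPv4 uses 32-bit addresses and IPv6 uses 128-bit addresses, so the address counts are 2^32 and 2^128 respectively. The world-population figure below is a rough mid-2000s estimate used only for illustration.

```python
# IPv4 vs. IPv6 address space, per person on Earth.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128
population = 6_500_000_000  # rough mid-2000s estimate

print(ipv4_addresses)                  # 4294967296 -- fewer than one per person
print(ipv6_addresses // population)    # tens of octillions per person
```

The IPv4 total falls short of even one address per person, which is why techniques such as network address translation became widespread; IPv6 leaves an astronomically large surplus.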

The U.S. government's declaration that the H-1B visa cap for fiscal year 2006 has already been reached--the first time the application process has been closed before the start of a fiscal year--is likely to prompt high-tech trade organizations to lobby for additional visas beyond the current limit of 65,000. Immigration attorney Vic Goel said the rapid depletion of the fiscal 2006 H-1B allowance indicates that not enough visas are available, particularly in the face of positive economic growth. Rochester Institute of Technology professor Ron Hira disagrees with such an assessment, claiming the fiscal 2006 limit being reached before companies have even hired new employees "seems to indicate that companies are planning ahead for positions that don't exist right now, which highlights the fact that, contrary to conventional wisdom, they aren't searching for Americans first." John Palafoutas of the American Electronics Association said importing leading overseas talent to the United States is becoming harder and harder, and IT industry groups have been meeting with congressional leaders to determine the best course of action. He raised the possibility of a "market-based" solution that would automatically trigger an H-1B cap increase in response to strong demand. Congress authorized an additional 20,000 visas both for fiscal 2005 and fiscal 2006, although those visas are reserved for foreign workers with advanced degrees from U.S. schools.

Some software vendors and consultants foresee agile software development getting waylaid by a bottleneck in the build management process as sophisticated development organizations strive to make integration automated and continuous. "Teams that want to be more agile are headed for a train wreck if they have long build times; they'll need to find ways to build all or part of the software more frequently to get the kind of continuous feedback that helps agile teams move quickly," warns consultant and author Mike Clark, who sees people rather than tools and technology at the root of the problem. His view is echoed by software consultant and trainer Mary Poppendieck, who blames people-centric difficulties such as long interims between development and testing; detailed and incessant customer clarifications of their actual requirements; approval processes that bury a development organization in work or overwhelm the organization; and implementation, particularly when support organizations have been uninvolved until then. Inappropriate use of agile development tools can throw even the most well-planned strategies out of whack. A lazy employment of version control is one such trap. Many agile development boosters claim open-source continuous integration tools are more than adequate for their needs. However, Electric Cloud CEO John Ousterhout cautions that "open-source tools can accelerate builds from one to three times, but they fall over dead after that."
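One common remedy for the long build times Clark warns about is incremental building: rebuild only the units whose contents have changed since the last pass, so the team can integrate continuously without paying for a full build each time. The sketch below is a deliberately simplified illustration of that idea (real build tools also track dependency graphs); the function names are invented for this example.

```python
import hashlib

def fingerprint(text):
    """Content hash used to detect whether a source unit changed."""
    return hashlib.sha256(text.encode()).hexdigest()

def incremental_build(sources, cache, compile_unit):
    """Rebuild only changed units.
    sources: {name: source text}
    cache:   {name: fingerprint} carried over from the previous build
    compile_unit(name, text): callback that performs the actual build step."""
    rebuilt = []
    for name, text in sources.items():
        fp = fingerprint(text)
        if cache.get(name) != fp:   # new or modified since last build
            compile_unit(name, text)
            cache[name] = fp
            rebuilt.append(name)
    return rebuilt
```

On the first run everything is rebuilt; subsequent runs touch only edited files, which is what lets a continuous-integration loop run after every commit without the "train wreck" of a full rebuild each time.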

Electronic waste is a growing problem, and action is being taken--and not being taken--to address the issue. E-waste comprises electronic devices (TVs, PCs, etc.) rife with hazardous materials that are being discarded in vast quantities, often because they are obsolete rather than broken. The Silicon Valley Toxics Coalition estimates that 60 percent to 80 percent of e-waste collected for recycling is exported to countries such as China, India, and Pakistan, where about half of it is disposed of by laborers in unsafe conditions. Computer recycling is problematic because it is expensive, labor-intensive, and dangerous, while the markets for recycled materials can be small and unreliable; in addition, most recycling centers must cover the cost of recycling themselves. Increased public awareness of the e-waste problem has prompted some computer manufacturers to launch take-back programs, but consumer participation is limited given the cost and inconvenience such programs present. Still, nearly half the U.S. states have e-waste take-back legislation either active or pending: California, for instance, prohibits the dumping and overseas export of e-waste and requires sellers of hazardous electronic equipment to pay a $6 to $10 "advanced recovery fee" collected from consumers. In addition, U.S. advocacy groups and European efforts to address e-waste have driven computer manufacturers to reduce or eliminate certain toxins in their products as well as make their products easier to disassemble. Initiatives such as the Computer TakeBack Campaign are pressuring electronics producers to institute "extended producer responsibility" and pay for collection and recycling themselves, thus creating an incentive for manufacturers to design longer-lasting, more environmentally safe products.