On Tuesday, Intel will formally launch Haswell, the fourth-generation Core processor the company says will help pull the PC industry out of its downward spiral.

Unfortunately for PC makers, that won't happen. Consumers are going to continue choosing tablets and smartphones over PCs, despite Haswell's longer battery life and lower prices.

Intel Singing Same Old Song

As Intel has done for years with each new generation of processor, the company digs out the PowerPoint slides used to market the previous generation of chips, changes the codenames and touts the latest gains in battery life and performance. This time around, the mobile version of Haswell is expected to deliver 50 percent more battery life with no loss of performance compared with the previous generation.

In addition, spinmeister Intel has been touting the low prices PC buyers will find in stores during the industry's crucial back-to-school and holiday shopping seasons. Ultrabooks will sell for as low as $499, a whopping $500 less than the rival MacBook Air from Apple, and thin-and-light notebooks will be selling in the $300 to $400 range, Intel executives told financial analysts in mid-April.

Along with low prices, stores will be stocked with new designs, such as notebooks that convert to tablets or have detachable displays that become tablets. Many of the new mobile PCs will have touch-enabled screens, courtesy of Microsoft Windows 8.

"If you look at touch-enabled, Intel-based notebooks that are ultra-thin and light using non-Core processors those prices are going to be down to as low as $200, probably," Intel Chief Executive Paul Otellini, who stepped down last month, told analysts.

Nevertheless, a new chip, low prices and new PC designs were not enough for IDC to change its prediction that PC shipments would fall almost 8% this year, much steeper than the roughly 3% drop last year. Tablets and smartphones are responsible for the decline because they let people hold on to their PCs longer. Why buy a new PC if your more convenient mobile device can surf the Web, play video and access email?

"The vast majority of PC sales are replacement sales, and if I keep my PC longer, then replacement sales are going to go down," IDC analyst Bob O'Donnell says.

Chip Power Doesn't Sell

In the 1990s, people paid attention to Intel's latest and greatest processor, because PCs were so darn slow. Every new generation of chips from Intel held the promise of a peppier PC.

Today, people are more interested in convenience. No one cares that the ARM chips powering nearly all tablets and smartphones come nowhere close to the performance of Intel's Core processors. As long as those chips are good enough to handle what the devices are made to do and don't suck down battery power too fast, consumers are happy.

Because Intel has yet to crack the tablet and smartphone markets, it's stuck in a PC world of evolution, not revolution. There is nothing in PC manufacturers' product portfolios that's truly innovative.

Intel could get out of its rut next year if its next-generation Atom microarchitecture, dubbed Silvermont, succeeds in the tablet and smartphone markets. But even then, the chipmaker will be playing catch-up.

With each generation, Intel chips deliver higher performance and better power efficiency. But that alone isn't enough. They have to help create a mobile device that wows consumers, and that remains elusive.

Low-level cyberscuffles between nations may be about to escalate into more serious conflicts. U.S. government officials are reporting a new wave of attacks aimed at sabotage within the U.S., apparently originating from somewhere in the Middle East.

The New York Times reported over the weekend that saboteurs are using probes to look for ways to seize control of processing plants of mostly U.S. "energy companies" — presumably oil and gas producers. Senior officials with the Obama administration said the attacks are aimed at the administrative systems of 10 major American energy companies, which the sources have refused to name.

Tension, Apprehension And Dissension

To be sure, so far no one seems to have independently corroborated these alleged attacks. As such, there's no good way to know whether they are as potentially serious as these unnamed government officials — and, of course, the NYT — would have us believe.

If the warnings are sound, though, cyberwar escalation still wouldn't be a huge surprise. Security experts and government officials have long predicted that hackers bent on wreaking havoc will eventually become as commonplace as those looking to steal government and corporate secrets.

In February, then-Secretary of Defense Leon Panetta warned that the technology used in cyberattacks is able to "cripple a country, to take down our power grid system, to take down our government systems, take down our financial systems, and literally paralyze the country. That is a reality."

The U.S. and Israel gave their enemies motivation to pick up the pace with their cyberattack on Iran's nuclear facilities several years ago. The two allies used the Stuxnet worm to damage centrifuges used to enrich uranium to levels that could be used in nuclear weapons, according to the NYT. Experts believe Iran retaliated last year with the attack on Saudi Aramco, one of the world's largest oil producers.

A virus unleashed on Aramco administrative offices wiped out data on thousands of computers, replacing the deleted files with a burning American flag. The hackers targeted Aramco's production facilities, government officials said. The mission reportedly failed because Aramco's administrative offices were on a network separate from the one used for industrial control systems. Using separate networks in this way, known as segmentation, is a best practice recommended by security experts.

The Aramco attack was soon followed by a similar one launched against Qatari energy company RasGas, which also claimed the attack was stymied because its compromised office network wasn't connected to production systems. Israeli officials said Iran's "cybercorps" was behind the assault. Iran organized the group after the Stuxnet attack.

Tit For Tat

These tit-for-tat attacks could be morphing into a new phase of cyberwar where the consequences are much greater than the damage caused by pilfering a company's trade secrets. Any attack that could destroy critical infrastructure — from oil production and the electric grid to manufacturing facilities and water treatment plants — has the potential to affect the lives of hundreds of thousands of people.

Experts have warned for years that industrial control systems that run these facilities are filled with vulnerabilities that could be easily exploited. Fortunately, hackers haven't yet been able to infiltrate the networks these systems are on.

To shore up the nation's critical infrastructure, President Barack Obama issued an executive order this year requiring government agencies to share cyberattack information with private industry. Industry, however, is under no obligation to share information with the government, and changing that will require action by Congress, which is struggling with the privacy implications of requiring companies to share data with government agencies.

People want their television to work like a TV. Sending tweets on Twitter, posting photos on Facebook and browsing the Web are best left to smartphones and tablets. Indeed, more than 40% of U.S. households with Internet-enabled TVs haven't even bothered to hook them up to the Web, according to market researcher NPD Group.

This is not the future TV manufacturers expected.

RIP, Smart TV

In 2010, reimagining TVs as computer hybrids with big screens for the living room seemed to make lots of sense. Why not play games, run applications and surf the Web from the same box that shows movies and programming from a cable or satellite provider? Proponents quickly dubbed the new device the "smart TV."

Intel, sensing a new market for its microprocessors, was a huge supporter, saying the smart TV "could be the most significant change in television history." Yet by the end of 2011, Intel had abandoned the smart TV business to focus on smartphones and tablets.

The main problem was that what Samsung, LG and other big TV makers delivered was a mishmash of applications that had nothing to do with watching TV — the main reason people gather around the big box in the first place. Unsurprisingly, very few consumers wanted to spend more for supposed next-generation television sets loaded with features they never asked for.

Today, the TV is evolving much differently. Internet video now comes to the set via other devices such as the Apple TV, Roku and Boxee Box. Nearly six in 10 consumers who own an Internet-connected high-definition TV use such services to supplement pay TV subscriptions, NPD says.

As for other once-vaunted "smart TV" activities — reading or posting on Twitter or Facebook, reading digital books or magazines, video calling, shopping or gaming — well, they attract well below 10% of such people.

Second-Screen TV

Video is clearly what people want on their TVs, so pay TV providers have turned their attention to tablet apps. Instead of shipping expensive set-top boxes, service providers want people to use tablets to find movies, see what friends are watching and browse their favorite programming.

The apps will add to the enjoyment of watching TV by providing player stats during a baseball game or actor bios and behind-the-scenes clips from users' favorite shows. These apps could also be a goldmine of subscriber data that can be fed to advertisers, who could then use the information to target advertising.

Having an app that knows your viewing habits could be useful when you're traveling. Imagine connecting your tablet to the TV in a hotel room and immediately having the same viewing experience you have at home.

"The TV needs to be more like a docking station," Paul Gray, analyst for DisplaySearch, an NPD company, told me. "It needs to play nice with these mobile devices."

Panasonic is one of the first manufacturers to ship televisions capable of communicating wirelessly with a tablet. Rivals will surely follow suit, as manufacturers emphasize seamless integration with mobile devices.

Dumb Monitors Need Not Apply

To call these sets "dumb monitors" would oversimplify things. A lot of good engineering is needed to provide reliable interoperability with any tablet or smartphone, irrespective of whether it runs Android or Apple's iOS.

"I do contest people who say that TV ends up as sort of a big dumb monitor," Gray says. "You actually probably need quite a lot of intelligence, but it's kind of under the hood."

TV manufacturers, however, are still stuck in the same box they've long tried to escape: Their products are mostly all alike and thus hard to differentiate. Shifts in broadcast technology — such as NTSC to HD, and before long, HD to 4K — or screen technology (LCD vs. LED, for instance) enable some innovation, but once things shake out and picture quality is comparable across models, TV sets once again become commodities. That leaves Panasonic, Samsung, Sony and the rest with price cuts and not much more to lure buyers.

Commoditization is the curse of the consumer electronics industry. TV makers will look for ways to add value after second-screen apps become mainstream. The trick will be to avoid another failure like the smart TV.

Security researchers recently found gaping vulnerabilities in a wide variety of critical business and industrial equipment. It turns out that weak or absent passwords made it easy to break into more than 100,000 terminal servers used to connect such equipment to the Internet. Fixing the problem is simple: changing the credentials dramatically reduces the risk. But for many companies, actually solving the problem is nearly impossible.

Vulnerable, But Hidden

The threats discovered by security firm Rapid7 exemplify the difficulties organizations face in plugging even known holes in critical gear. In this case, the affected systems include industrial control equipment, traffic-signal monitors, fuel pumps, retail point-of-sale terminals and building automation equipment such as alarms and heating and ventilation (HVAC) systems.

Rapid7 found more than 114,000 unprotected terminal servers, mostly from Digi International or Lantronix, that a hacker could use to take control of the underlying systems. Finding the serial ports on a server requires a scanning tool, such as Nmap. Once an active port is found, a command-line program similar to those used on 1980s-vintage home computers is all that's needed to access a control panel or menu, or to capture data.
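To make the mechanics concrete, here is a rough, hypothetical sketch of the kind of TCP port probe such a scan relies on. The port numbers and the use of Python's standard `socket` module are illustrative assumptions, not Rapid7's actual tooling, which is far more sophisticated:

```python
import socket

# Ports here are purely illustrative stand-ins for the serial-over-network
# services terminal servers expose; real models vary by vendor and config.
CANDIDATE_PORTS = [23, 2001, 3001, 10001]

def find_open_ports(host, ports=CANDIDATE_PORTS, timeout=1.0):
    """Return the subset of `ports` on `host` that accept TCP connections."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Only ever probe hardware you own; unauthorized scanning is illegal
    # in most jurisdictions.
    print(find_open_ports("127.0.0.1"))
```

Anything this probe finds, an attacker's scanner can find too, which is why changing default credentials matters so much.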

Fortunately, while tech-savvy saboteurs or terrorists would have no difficulty gaining access to the equipment, they most likely would not know who owns it or where it is located. Without that information, the find would not be very useful. "There's no telling who they are going to hurt, if they don't know where the device is," explained HD Moore, chief research officer for Rapid7.

How Security Gets Missed

Nevertheless, any hole that can provide access to critical equipment is worth plugging, but in many of these cases that's not likely to happen. Often, companies do not even know the terminal server exists, much less that it needs security updates.

How is that possible? Well, picture a vendor working with the facilities crew installing an HVAC system that uses a terminal server so the equipment can be monitored from a remote location. No one knows the server exists, and no one cares, as long as everything works. "A lot of times IT is not even aware of these systems," said Matthew Neely, director of research at risk management company SecureState.

Vendor marketing can also exacerbate the problem. Equipment is often sold as being "secured," when in fact it is only "capable of being secured." That means the buyer still has to add the technology or turn on and configure the security features.

This can get missed if the installers assume the equipment is "plug and play," said Joe Weiss, a security consultant for Applied Control Solutions. "It's like getting a toy for Christmas and you pull it out of the box expecting it to run, because the box doesn't tell you it needs two AA batteries," Weiss added.

Terminal servers, also called serial port servers, often get missed by electric utility companies because they are not covered under federal cybersecurity requirements. So the devices never make it onto the utility's compliance checklist. "They don't even have to check these out to find out if they are or not secure," Weiss said.

This bizarre situation demonstrates that ensuring the security of critical equipment is never a matter of technology alone. True security requires people to pay attention, not just sweep everything under the rug.

Cyberespionage is usually considered a threat to government agencies and large corporations such as defense contractors and banks. But a new Verizon report on data breaches finds that cyberspies are going after small organizations with the same enthusiasm they once reserved for big outfits.

It's A Small Cyberworld

Not surprisingly, 95% of the state-affiliated attacks aimed at stealing intellectual property, which included classified information, trade secrets and technical resources, originated from China last year, according to the 2013 Data Breach Investigations Report. No organization, no matter how small, was safe.

"The big surprise for us was that there were a lot of small organizations being targeted for cyberespionage," Jay Jacobs, senior analyst with the Verizon RISK team, told ReadWrite. The targets included manufacturing companies, computer and engineering consultants and professional services firms that were "relatively small, even under 10 employees kind of small."

The attackers went after small outfits using the same tactics waged against big companies. In a way, the hacker strategy parallels the way investigators go after the small players in a criminal enterprise, hoping to flip them in order to implicate higher-ups. Only in this case, the hackers are frequently targeting small companies to lay hands on the trade secrets of their larger partners.

Roughly one in five cyberattacks in 2012 was carried out to steal intellectual property in order to further a country's national and economic interests. The most common mode of attack was spearphishing, which involves sending an email disguised as coming from a colleague of the recipient. The message typically contains a malicious link or attachment.

Chinese hacking of American computer networks has placed a damper on relations between China and the Obama administration, which has demanded the country curtail its hacker army. On Monday, Joint Chiefs of Staff chairman, Gen. Martin E. Dempsey, and Gen. Fang Fenghui of China met to discuss cybersecurity.

Other Attacks

Despite all the attention, cyberespionage was a distant second in terms of attacker motivation. Three-quarters of the data breaches committed last year were for financial gain, with another 5% the result of hacktivism, the report found. Verizon confirmed a total of 621 data breaches and more than 47,000 reported "security incidents," which included denial-of-service attacks.

Among the companies that suffered data breaches, 37% were financial services firms, 24% restaurants and retailers, 20% manufacturers, transportation organizations or utilities, and the remainder classified as "information and professional services firms." Malware was used in 40% of breaches. Three quarters of the compromises involved exploiting weak or stolen user names and passwords.

Discovering data breaches was not easy for most organizations. Verizon found that the time from compromise to discovery took months, and sometimes years.

Verizon worked with 18 organizations worldwide in gathering data for the report. The groups included national computer emergency response teams and law enforcement agencies.

No one found any cutting-edge methods used by attackers to break into networks, so organizations can go a long way toward protecting themselves by focusing on the basics, such as stronger passwords and educating employees about bogus email.

IBM has no stomach for low-margin businesses, which is why Big Blue may be ready to dump its commodity server business — i.e., servers that run on Intel-compatible "x86" processors. If the reported talks with Lenovo lead to a sale, the move would mark IBM's final break with the low-end computer business.

A Win-Win

The deal would be a win-win for both companies. Lenovo, which bought IBM's PC business in 2005 for $1.75 billion, would immediately become the third largest maker of x86 servers, behind market leader Hewlett-Packard and runner-up Dell. Thanks to its market clout in its homeland, the Chinese company has risen to become the second largest PC maker worldwide, according to the latest numbers from IDC.

Adding x86 servers to its portfolio makes perfect sense for Lenovo, which has shown in PCs that it can do well in a low-margin, commodity market. For IBM, the opposite is true. The company's strength in hardware is in selling expensive — and profitable — mainframes.

IBM's mainframe business is the reason the company leads the global server market, at least in revenue terms. To give you some sense of how expensive these systems are, IBM's "System z" mainframe represented more than 12% of all server revenue worldwide in the fourth quarter. Because of a refresh in the product line, along with the introduction of new products, such as the zEnterprise, revenue from IBM's mainframe business rose almost 56% year over year in the quarter, according to IDC.

"Although revenue results for System z are traditionally heavier in the fourth quarter, this accelerated acquisition shows the breadth and depth of the IBM mainframe installed base," Jean Bozman, an analyst for IDC, said in a statement.

Lenovo would be a good buyer for IBM, because it doesn't compete in any of the markets IBM cares about, namely software and IT services. That wouldn't be the case if HP or Oracle were the buyer.

Disruption In Server Market

IBM may also have decided it wants no part of the disruption heading for the server market like a freight train. The increasing number of companies adopting cloud computing will mean fewer server sales, Larry Dignan points out at ZDNet. In addition, Internet companies with large server farms, such as Facebook and Google, buy customized white-box servers, which can't be good in the long term for traditional sellers, like HP, Dell and IBM.

While no one outside of IBM or Lenovo knows how much the business would fetch, someone familiar with the talks told Bloomberg that the price would range from $2.5 billion to $4.5 billion, depending on the assets and liabilities included.

Lenovo Is Fired Up And Ready To Go

Not everyone agrees that IBM would be doing itself a favor by selling its x86 business. Gartner analyst Sergis Mushell says that without x86, IBM's only non-mainframe servers would be its lineup of machines running its Power processors — and demand for those products is shrinking.

In other words, IBM would miss out on the opportunities to build systems based on x86 "while [its Power] architecture's ecosystem is shrinking," Mushell said. "Do you see how it would not make a lot of sense?"

Lenovo, meanwhile, is hungry to move beyond the PC market. The company announced last year a partnership with EMC in which Lenovo planned to introduce x86 servers that would include EMC storage systems. As part of the deal, Lenovo agreed to sell EMC networked storage products in China.

Given the jumpstart it would get from owning IBM's x86 business, Lenovo may be willing to make an offer that's hard for IBM to refuse.

The federal government appears ready to take dramatic action against U.S. wireless carriers that fail to protect Android smartphone buyers against malware — specifically by not pushing out timely operating-system updates. And the catalyst most likely to kick the feds into gear is an American Civil Liberties Union complaint filed Tuesday with the Federal Trade Commission.

Let The Market Decide

What the ACLU is asking is not difficult. Rather than have the FTC order carriers to ship security updates to the Android operating system as soon as they are made available by Google, the ACLU wants customers to be told upfront that they won't be getting the updates needed to protect their personal data from hackers.

"We think the companies should be forthcoming about this," Christopher Soghoian, principal technologist and a senior policy analyst for the ACLU, said. "If consumers knew that certain phones weren't going to get updates, they might not buy those phones in the first place."

Rather than force carriers to spend a lot of money on automatic update services, the ACLU wants the market to fix the problem, a stand that many lawmakers in Congress should applaud.

"We want the market to work, but consumers are never going to get to vote with their wallets if they don't know which phones are secure and which phones are not secure," Soghoian said.

The ACLU complaint names AT&T, Verizon Wireless, Sprint Nextel and T-Mobile USA. AT&T declined comment, Sprint said it follows "industry-standard best practices," and Verizon said it works closely with manufacturers to provide "mandatory updates to devices as quickly as possible."

T-Mobile was the only carrier to say that it keeps Android customers up to date with the latest software. "T-Mobile takes security very seriously, and regularly provides security updates to our customers, including those using the Android operating system," a company spokesman said.

The FTC Plays The Heavy

If that is what T-Mobile does, then it is more in line with the FTC's thinking than its rivals. In a February settlement with smartphone manufacturer HTC, the agency pointedly emphasized the need to secure mobile devices.

Under FTC pressure, HTC agreed to a "comprehensive security program" that includes patching vulnerabilities that could be exploited by hackers and spammers. The agreement was significant because it outlined for all device manufacturers what the FTC considers best practices for security.

Keeping software up to date is a critical defense against hackers, who often target known vulnerabilities in software because so many users continue to run older, bug-ridden versions. In a blog post following the HTC settlement, FTC chief technologist Steve Bellovin made it clear that securing mobile devices was the responsibility of manufacturers and carriers, and they have to work together at getting updates out to customers.

"Bugs happen, ergo fixes have to happen," Bellovin said.

Android malware is a much larger problem outside the U.S., particularly in Asia and Eastern Europe. That's because people in those regions will download applications from third-party app stores, many of which distribute malware-infected software. In the U.S., most people get their apps from the Google Play store, which regularly checks for malicious software.

Nevertheless, 97% of new mobile malware is directed at Android devices, which comprise 72% of the smartphone market, according to security vendor Symantec's latest Internet Security Threat Report. While most infections today occur from downloading bad apps, experts say hackers are increasingly trying to compromise devices through spam that carries links to malicious Web sites.

Given the mood of the FTC, and trends in Android malware, it should be obvious to carriers that the status quo is unacceptable. If they aren't ready to make changes on their own, then they're likely to get an unfriendly shove from the feds.

You've installed antivirus software on your computers, configured your operating system to update its security automatically and password-protected your Wi-Fi. So your home network is safe against hackers, right?

Guess again. And then take a long look at your wireless router.

What Can Happen (Hint: It's Bad)

For years, manufacturers of home routers have all but ignored security issues, at least when it comes to making sure that consumers update their firmware to close exploitable vulnerabilities. Let's put it this way: Have you ever updated the firmware on your router? If not, odds are good that it's got one or more security holes through which a properly motivated hacker could slip.

Attacks on routers aren't common, partly for logistical reasons that make them uneconomical for hackers. But that could change as technology evolves, criminal incentives shift and security tightens up in other areas. One big potential trouble spot: the embedded Web servers that many routers use for managing their settings — including, of course, security.

Router manufacturers have done a lousy job of informing users about firmware updates that would patch security flaws, and an even worse one of making it easy for users to obtain and install those updates. Such patches are seldom delivered through automatic update services, forcing users to hunt for fixes on manufacturer websites.

"These are low-priced, low-power devices," Tod Beardsley, a researcher with application security vendor Rapid7, said. Manufacturers "may not have the margins on these devices to provide ongoing software support."

To see what can happen when a flaw remains unpatched, look no further than a major intrusion in Brazil in 2011, when hackers broke into 4.5 million home DSL modems over the Internet. The modems were reconfigured to send users to malware-carrying imposter websites, primarily so thieves could steal their online banking credentials.

From Brazil With Love

That exploit in Brazil was similar to one that application security tester Phil Purviance recently employed against a wireless Linksys EA2700, which was released about a year ago. Called a cross-site request forgery, the technique allowed Purviance to break into the router's embedded management Web site. Once in, Purviance found he could change the login information and remotely manage the hardware.

"What I found was so terrible, awful, and completely inexcusable!" Purviance wrote in his blog. "It only took 30 minutes to come to the conclusion that any network with an EA2700 router on it is an insecure network!"

Purviance found a total of five vulnerabilities in two Linksys routers, the EA2700 and WRT54GL. Separately, flaws recently found in Linux-based routers from D-Link and Netgear could enable a hacker on the network to gain access to the command prompt on the operating system, Rapid7 reported.

D-Link and Netgear didn't respond to requests for comment. Belkin, which bought Linksys from Cisco last month, said in an email sent to ReadWrite that the EA2700 was fixed in a firmware update released last June. Called Smart Wi-Fi, the firmware is available through an opt-in update service.

What Hackers Want

Manufacturers have gotten away with sloppy security practices because breaking into wireless routers usually requires physical proximity. That made it far harder for hackers to bust into multiple computers, because they'd have to move from network to network in order to target them. Thus hackers have tended to favor blasting out malware-carrying spam from a single location over attacking individual wireless routers.

But that could change. Industrial control systems that run manufacturing operations, power grids and other critical infrastructure are increasingly under pressure from cyberespionage campaigns. Vulnerabilities in these systems are as bad as those in home routers. You can see just how bad it is via the search engine Shodan, which collects information on 500 million connected devices, such as routers, printers, webcams and servers, each month.

In time, hackers will develop better tools and malware for breaking into hardware, and this technology will eventually find its way into the criminal underground.

How To Safeguard Your Router

In other words, it makes sense to safeguard your router now. Here are a few steps you can take to make your home network a less inviting target:

In your router security settings, make sure you've changed any default usernames and passwords. These will be the first things any hacker tries, much the way a burglar jiggles a doorknob to see if it's unlocked.

Disable wireless access to your router's management console, which allows you to manage its settings by pointing a Web browser to an address such as 192.168.1.1. Disabling wireless access means you'll have to be physically plugged into the router in order to manage it, making it far more difficult to hack.

If you're sufficiently technically minded, consider replacing your router's doubtless buggy internal software with an open-source alternative such as DD-WRT, Tomato or OpenWRT. While these options aren't particularly consumer friendly, their firmware is less likely to contain obvious vulnerabilities — and will probably offer you some cool new features, too.
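After disabling wireless access to the management console, you can sanity-check the change from a laptop on the Wi-Fi network. Below is a minimal, hypothetical sketch; it assumes the common default gateway address 192.168.1.1 and the standard web-admin port 80, both of which may differ on your router:

```python
import socket

def admin_console_reachable(router_ip="192.168.1.1", port=80, timeout=1.0):
    """Return True if something answers on the router's web-admin port."""
    try:
        # Attempt a plain TCP connection; success means the admin
        # interface (or something on that port) is still listening.
        with socket.create_connection((router_ip, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if admin_console_reachable():
        print("Admin console still answers over the network; recheck your settings.")
    else:
        print("Admin console not reachable from here.")
```

If the console still answers over Wi-Fi after you've disabled wireless management, double-check the setting took effect and that your router doesn't expose the interface on a second port such as 8080.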

Just looking at ads is bad enough, so who would want to talk to them? While many people would likely answer "no one," voice-recognition software maker Nuance says the opposite is true.

What Is A Voice Ad?

Wanting in on the booming mobile ad market, Nuance developed a way for people to chat with ads much as they do with Siri on the iPhone. Called Voice Ads, the technology works off the Internet connection of any iOS or Android mobile device.

Voice-recognition software has been around for years, but remains relatively immature as a form of communication between humans and computers. Founded in 1994, Nuance has been developing the technology longer than most other companies. Nuance's technology reportedly powers Apple's Siri, although neither company will confirm it.

Nuance's voice-ad technology is available today through the mobile ad frameworks of Jumptap, Millennial Media and Opera Mediaworks. An ad framework is what developers embed into their mobile apps so they can display advertising distributed by an ad network.

Advertisers using Nuance's software development kit could build two-way communications requiring only "Yes" and "No" answers - or ones with more complicated responses. An example of Voice Ads can be seen on YouTube.
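The simple yes/no interaction described above can be sketched as a tiny dialog handler. This is a hypothetical illustration of the pattern, not Nuance's SDK; the prompts and accepted utterances are invented.

```python
def handle_voice_response(transcript: str) -> str:
    """Map a recognized utterance to the ad's next prompt (hypothetical flow)."""
    answer = transcript.strip().lower()
    if answer in ("yes", "yeah", "sure"):
        return "Great! Would you like a coupon sent to your phone?"
    if answer in ("no", "nope"):
        return "No problem. Thanks for listening!"
    # Anything else triggers a re-prompt, since open-ended speech is the hard part.
    return "Sorry, I didn't catch that. Please say yes or no."
```

A real voice ad would layer speech recognition on top of branching logic like this; handling the open-ended answers is where Nuance's voice-recognition servers come in.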

The development process is not self-service, though. Ad developers have to work directly with Nuance to connect the advertising to the company's voice-recognition servers over the Internet. And because the technology is so new, it isn't supported in third-party rich-media ad creation tools, except Celtra.

Talking To Ads Could Make Sense

In many ways, Voice Ads make sense on a smartphone. Why fiddle with clicking on tiny links and trying to type on a 4-inch screen, when you can click once and start talking with a brand?

As people get comfortable talking to their smartphones through personal assistants like Siri, it's possible they could be enticed into starting a conversation through a product discount or promotional pricing. According to Nuance, advertisers see lots of potential.

"When you actually have a live conversation with an ad, it's sort of like you're creating more of a tight relationship with the brand itself, because you're having a discussion with it," said Peter Mahoney, chief marketing officer for Nuance. "The brand feels more responsive. It feels like something you can actually have a real live relationship with."

While the thought of having a "tight relationship" with an ad may sound absurd, there is big money at stake. Worldwide mobile advertising revenue is expected to hit $11.4 billion this year, reaching $24.5 billion by 2016, according to Gartner.

The key is the quality of the experience. Nuance's technology will have to convince people they are actually having a meaningful, two-way conversation. Advertisers will have to give potential customers something in return for having that conversation with a brand.

Speech is continuing to evolve as a means of communication with computers. As people get used to talking to the machines they use in their everyday lives, the jump to talking to an ad may not seem so extreme.

The money-go-round between app developers and ad networks is starting to blur the line between many free Android apps and malware. While these legitimate apps aren't stealing passwords, they're still riding roughshod over user privacy by gratuitously sucking up your contact and location information — or worse.

What These Bad Apps Glom Onto

Between last September and March, security vendor Bitdefender analyzed 130,000 popular Android apps on Google Play and found that roughly 13% collected your phone number without explicit notification, 12% stored your location data and 8% sucked up your email address. Included in those numbers are apps that siphoned off one or more of the three.

Many apps don't stop there. Other data they glom onto includes your browsing activity, your contact list, the unique identification number of your device and even your call registry.

These apps took all that information legally. Android apps display their privacy policies in seeking permission to gather personal data, and many developers bank on the fact that most people will just click through to the app.

All that data gathering typically starts when an app developer downloads an ad framework provided by one of the more than 400 companies listed on the Ad Network Directory. Such frameworks make it easy for developers to display ads in an app, and thus to get paid every time someone clicks on them.

Since free apps only make money for developers from such clicks (and, it turns out, the distribution of associated user data), very few pay attention to exactly what kind of information ad frameworks are gathering.

"Because they copy-paste the code, they don't really debug it; they don't really look through it and see what data it collects," Bitdefender researcher Liviu Arsene told me. "I bet they don't even care."

And It Doesn't Stop There

App privacy policies often stake out even more aggressive data-collection goals, presumably to pave the way for future updates to vacuum up more info and further erode user privacy.

Take, for instance, Airpush, the second-largest ad network for Android developers with 40,000 apps. Its privacy policy reads, in part:

[I]n accordance with the permissions you have granted, we may collect your device ID, device make and model, device IP address, mobile web browser type and version, mobile carrier, real-time location information, email address, phone number and a list of the mobile applications on your device.

The policy goes on to explain that Airpush might supply that information to third-party advertisers who are part of its ad platform and third-party vendors, consultants and other service providers. Because the data is available to so many organizations, it's virtually impossible to know who is using your personal data, and how, once it leaves the device.

Obviously, the possibilities for abuse here are legion. Suppose one of those third-party organizations is acquired by an outfit that is, shall we say, less reputable. Or that a third-party company's computers are hacked, spilling your data into the hands of cybercriminals.

The FTC wants the mobile industry to bolster privacy controls by allowing phone users to opt out of being tracked by ad networks. The commission also wants apps to prominently display the kind of data they're collecting, rather than burying it in fine print. Congress is also considering proposals to tighten privacy protections on mobile devices, though it's hard to say how such measures will fare given firm opposition from industry.

Amazon Web Services is on fire, and EMC and VMware are feeling the heat. So the established enterprise-computing duo is striking back — by launching Pivotal, a joint venture that aims specifically to dethrone the current king of cloud computing.

Pivotal is led by Paul Maritz, the ex-CEO of VMware and a former senior executive at Microsoft. In leading the charge against AWS, Maritz is diving into a cloud-computing mosh pit that will include other tech heavyweights, such as IBM, Microsoft and Oracle.

Pivotal heads for battle with parent-company assets — database technologies, data analytics and an application platform — that it is combining into services customers can lease to run their own software in the cloud. EMC owns 69% of Pivotal and VMware the rest. The two owners will have to invest a total of $800 million this year and next to kick-start Pivotal, which Maritz conservatively estimates will grow from $300 million in revenue this year to $1 billion in five years.
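A back-of-the-envelope check shows what Maritz's projection implies: growing from $300 million to $1 billion in five years requires compound annual growth of roughly 27%.

```python
def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate needed to go from start to end over `years`."""
    return (end / start) ** (1 / years) - 1

# $300M this year -> $1B in five years
rate = implied_cagr(300, 1_000, 5)
print(f"{rate:.1%}")  # roughly 27% per year
```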

Amazon's Lead

Those numbers show how long it will take Pivotal to catch up with AWS. While Amazon won't break out the numbers for its cloud-computing unit, analysts say it is lumped inside the revenue category the online retailer calls "other." In Amazon's fourth-quarter earnings released in January, "other" accounted for $769 million in revenue for the quarter and $2.52 billion for the year, growth of 68% and 64%, respectively, according to the International Business Times.

Nevertheless, the market is still young. Most AWS customers today are startups and small and medium-sized businesses. Amazon is expected to shift focus to large companies soon, heading right into EMC's and VMware's sweet spot. This is making both companies very nervous.

During a partner conference in February, VMware Chief Executive Pat Gelsinger warned that if "a workload goes to Amazon, you lose, and we have lost forever," CRN reported. To avoid that kind of customer drain, Pivotal will provide the public-cloud option for VMware customers using its infrastructure technology for private clouds. Supporting that migration is important to EMC, because it owns 80% of VMware.

Pivotal In The Cloud

On paper, Pivotal will provide an enterprise-class cloud-computing platform and infrastructure. The company includes Greenplum, EMC's Big Data analytics division, and Pivotal Labs, the storage company's application development environment. VMware is contributing cloud-computing platform CloudFoundry, and middleware and tools for building and running data-intensive Java applications.

Maritz will have to build a business on top of all this technology, but EMC's and VMware's commitment to Pivotal shows they believe customer migration to cloud-computing environments outside their own data centers is inevitable. The companies also know that failing to have what customers want would be suicidal.

In 2011, Gelsinger, then president and chief operating officer for EMC, said the company did not intend to become a casualty of any major change in the industry.

"The technology industry is ruthless and relentless," he said during an interview at the VMworld conference. "If you are not in front of those major waves of technological innovation, you will become one of the driftwood on the shores of the industry."

In cloud computing, stopping Amazon is how EMC and VMware plan to reach that shore alive.

Over the past dozen years or so, Intel has repeatedly demonstrated that it has a tin ear when it comes to consumer electronics. Despite a long trail of failure, the tenacious chipmaker keeps coming back with one bad idea after another.

Intel's reported plan, a pay-TV service delivered over the Internet through its own set-top box, isn't a new idea. Apple, Google and Microsoft have also wanted to reshape television in a similar way, but have yet to convince Time Warner, NBC Universal and Viacom to license their TV shows and movies in a way that would give Internet TV a fighting chance. Go figure. And so none of them have moved forward.

A Desperate Intel

Intel, however, is plowing ahead. Why? Because it's desperate to break into new markets as sales of PCs, the majority of which are powered by Intel microprocessors, continue to deteriorate. The meteoric rise of smartphones and tablets that eroded the PC business blindsided Intel, which has had little success in supplying chips to these new markets. In the fourth quarter, ended in December, Intel's net income fell 27% year over year and revenue was down 3%.

The speed with which the PC market is vaporizing has made Intel willing to take on a lot of risk. According to Bloomberg, Intel is making progress in talks with media companies. But what that means isn't clear.

As a smaller operator, Intel would likely pay more for TV channels and movies than incumbent cable, satellite and telecommunications companies, which spend almost $38 billion a year licensing TV channels, according to the Wall Street Journal. Media companies have no incentive to anger current licensees — much less cut into their potential profits — by agreeing to any terms that would give Intel an advantage.

So Intel will be paying more to go to market with a service that looks, well, a lot like what its larger rivals are already selling — only, maybe, less so. In addition, Intel would be dependent on broadband services to deliver its pay TV services. This could be a problem if cable and telecoms decided to ratchet down their data caps.

Intel's set-top box could offer unique whiz-bang features, such as a camera for video-conferencing and personalizing content based on facial recognition. But that won't be enough, since people watch TV for the programming, not what's inside the set-top box.

Intel Outside

Intel's history is a study in how not to combine technology with entertainment. Each of its attempts follows the same lame pattern: Intel hypes its plans at the Consumer Electronics Show, then programs its executives to continue slinging marketing BS in interviews with the press. Eventually, the whole venture falls apart, usually within a year or so.

To wit:

In 2010, Intel launched chips and big plans for partnering with manufacturers to build the "smart TV," which was really nothing more than a set that would let users run apps and tap into the Internet while watching programs. The problem: no one wanted to play with apps on their TV. So Intel pulled the plug in 2011.

In 2006, Intel embarked on another would-be game changer, Viiv. This chipset for Windows PCs running Microsoft's Media Center was going to turn PCs into entertainment hubs — along the way, moving the battle with viruses, software updates and computer crashes to the living room. The Intel hype machine churned into high gear, at least until the first Viiv PCs came out. As The Washington Post reported, the typical Viiv box offered little more than a "smattering of free Web video clips and discounts on online music, movie and game rentals — plus a nifty rainbow-hued Viiv sticker on the front of the computer." By 2008, Viiv was dead.

In 2001, Intel was pondering slower-than-expected demand for its Pentium 4 chips and a modest 10% growth in its core microprocessor business. So the company decided to jump into the market for digital music players with a $300 gadget to take on the leading models from Sony, Philips and the Rio division of Sonicblue. Unexpectedly, though, Apple released the iPod later that year and wiped out all its competitors. Including Intel.

Despite its money and army of smart people, Intel simply doesn't get the consumer electronics market, and likely never will. The company is very good at building the innards of PCs, including chips, memory and motherboards, but has shown little talent for doing much else.

Rather than launch pay-TV services that will take the company in a direction far beyond its expertise, Intel has to get much better at picking market winners. Missing out on the smartphone and tablet craze was a huge blunder. While making up for that miss, Intel needs to watch for what's next and move quickly. Launching a pay-TV service just makes the company seem desperate to try anything.

In June 2007, Apple launched the first iPhone, marking a new era in corporate mobility. Before the fashionable mini-computer, people used smartphones for voice, texting and email. With the iPhone and its remarkable touchscreen, users could also be entertained with music, video and games. Corporate executives became so attached to their hip devices that they wanted to use them for business, so they bullied IT departments into providing access to email and corporate data. Employees soon joined their bosses, and the bring-your-own-device trend began.

Six years later, what started out with one smartphone has grown into an army, far too large for the Wild West atmosphere of BYOD to continue as it has. Many companies that have allowed BYOD will soon be pulling back on such freedoms. While BYOD may not die altogether, it will come with stricter restrictions meant to finally get the trend under control.

The Fate Of BYOD

"BYOD is clearly an important trend, but we expect it to plateau in the coming one to two years as enterprises decide that the cost and security issues associated with unlimited BYOD do not warrant the anarchy and increased support costs it has often caused," a recent report from tech analyst firm J. Gold Associates said.

Where the iPhone used to be in a class by itself, the smartphone now competes with Android phones from Samsung, HTC, LG, Sony and 10 other vendors. In addition, there is the BlackBerry and multiple devices running Microsoft's Windows Phone.

In 2010, Apple added the iPad to the chaos, creating a whole new market for tablet computers that brought lots of competitors from manufacturers in the Android camp.

From the beginning, BYOD was a challenge for IT departments, which had to wrestle with data security, device manageability, support and app control. Nevertheless, enterprises went along with the trend and the majority allowed at least some workers to use their personal devices for business.

But configuration, workflow and security issues were always making things difficult for IT. For instance, cyber-criminals saw an easy target in Android - with so many devices running older versions of the OS, hackers could target known vulnerabilities that were left unpatched by manufacturers and wireless carriers.

BYOD Limits

A survey of enterprises that allow employees to use their own notebooks, smartphones and tablets found that nearly half had experienced a security breach. As a result, more than 40% of the companies either restricted mobile data access or installed security software, according to the poll of more than 400 IT professionals and chief executives conducted by Decisive Analytics and released in August 2012.

Despite the breaches, only 12% of companies outright cancelled BYOD programs, an indication that most remained committed to providing flexibility to employees, while moving toward imposing rules.

Indeed, Gold found that companies are realizing "the current mostly wide-open, laissez-faire approach to BYOD is not sustainable longer term, and that more controls and better strategy are needed."

As companies clamp down on BYOD, employees will likely find they will have to surrender their devices in order for IT departments to install technology to protect corporate data and communications. At the same time, manufacturers are providing more enterprise features in order to ensure their products get approved for work and play.

Samsung recently launched technology called SAFE that the vendor boasts brings enterprise-class security to selected devices. People who buy the Galaxy S III or S 4 smartphones, the Galaxy Note II smartphone/tablet hybrid or the Note 10.1 tablet have the option of including SAFE, which provides a container for corporate data and email in order to separate it from personal applications.

BlackBerry, which has always been considered the gold standard in device security, has added similar data-separating technology in the new Z10.

In time, enterprises are likely to give the nod to those devices that can meet the demands of consumers and businesses and shun those that don't. So instead of BYOD, the policy of the future will be BYODA, or bring-your-own-device-for-approval.

In 2010, Apple captivated PC users with the release of the iPad. The thin and light tablet with exceptional battery life, ease of use and attractive design became the must-have mobile device for many corporate executives and employees. With nothing comparable in the Windows PC world, Apple had the business market to itself.

But Apple is a consumer electronics company at heart; so future iPad models remained devoid of features that were needed to meet corporate requirements for security, deployment, manageability, up-time, support and training. In the meantime, Microsoft, Intel and PC manufacturers picked themselves up and plotted their comeback. After three hard years, PC makers have finally released Windows tablets that tech analyst firm Moor Insights & Strategy says will likely reverse Apple's gains in the corporate market.

Apple's Party Is Over

"Enterprise tablets now exist that provide the best of both worlds between end user and IT, which puts Apple in a precarious position of needing to add more robust enterprise features," Moor says in a white paper released Monday. "Until that point, Moor Insights & Strategy recommends enterprises re-evaluate their iPad pilot and deployments."

What's In the New Windows Tablets

Two crucial components are Microsoft's Windows 8 and Intel's Atom Z2760 processor. The former provides a touch-based interface that's a key element of any tablet's appeal, while the latter delivers the performance and battery life. In fact, a comparison review by AnandTech found that battery life with the Z2760 surpassed that of the iPad 4 when Web browsing.

Because Intel has built a competitive chip based on the x86 instruction set, the three tablets can run the latest touch-enabled apps for Windows 8, as well as Windows 7 apps. The most important of these is Microsoft Office, the enterprise standard for office productivity. Office doesn't run on the iPad, and Apple's productivity tools are not regarded as being on par with Microsoft's.

There's also more baseline expandability with the Windows tablets. Depending on the vendor, the devices can come with a dock, USB, miniHDMI and microSD. Add other optional manufacturer-supported accessories and the iPad is left in the dust.

Other pluses include playing nicely with Active Directory, Microsoft's directory service for authenticating and authorizing users and computers in a Windows network. The tablets, through the Atom processor, also offer Intel security, which includes Secure Boot and the firmware-based Platform Trust Technology.

Overall, the fourth-generation iPad provides roughly a half-dozen enterprise features, while the Windows tablets have more than a dozen. Most important, those features are already in use in corporations, so there's no need to evaluate them before deployment, train IT staff or purchase new tools.

What this ultimately means is that the Windows tablets will be less expensive when considering the total cost of owning and managing the devices. In addition, they are more durable than the new iPads and just as nicely designed, and they have larger displays. The resolutions are lower, but still more than adequate for businesses.

Some Disagreement

How much of a head start Apple has in the enterprise is tough to determine, since the company won't say how many iPads have been sold to businesses. However, a running tally of the top 100 iPad rollouts kept by SAP shows that nearly 70 are K-12 schools, where Apple has always done well. Nevertheless, there are some notable names on the list, including the U.S. Air Force, United Airlines, British Airways, General Electric and the Walt Disney Company.

Not everyone agrees with Moor. Jack Gold, principal analyst for J. Gold Associates, believes the market momentum is still behind the iPad. Units within an organization, not the IT department, will often choose the tablet they want to use and many want the iPad.

"The iPad, and Android (tablets), will have a place as long as users demand it," Gold said. "And the Win8 devices will find a niche, particularly in those organizations that have company-owned assets that IT fully controls."

While Gold has a point, the advantages the latest Windows tablets have are too numerous for corporations to ignore.

Since 2002, when Microsoft launched its Trustworthy Computing initiative, security in the company's products has improved each year. But while the company has increasingly battened down Windows, Office and its other programs, the number of vulnerabilities in harder-to-patch third-party applications has grown dramatically, making overall security on the PC worse than ever.

More Risk In Third-Party Apps

Rather than go through the expense of battling Microsoft directly, many hackers now focus on low-hanging fruit, such as the Java and Adobe Flash browser plug-ins, which are often left unpatched even by users who conscientiously update Windows and Office. This trend was highlighted in a new study by Secunia.

The security vendor found that Microsoft's highly effective automatic security updates now address only 8.5% of the vulnerabilities on a typical PC. The rest have to be patched through updates from various software developers, each with its own unique process. That complexity leads users who are not security savvy to forgo updates, vastly increasing their risk of infection.

"There is, to date, no one fix-it-all solution," warned Morten Stengaard, director of product management and quality assurance at Secunia, in the company's blog.

Theoretically, Microsoft could overhaul Windows to place each third-party application in its own container, making it more difficult for hackers to load malware into the operating system. However, such a massive change would require Windows software vendors to rebuild their own products, which would have a ripple effect on every corporate and consumer customer.

"Microsoft, to some extent, is hamstrung by legacy code and what they've done in the past," Jack Gold, analyst for J. Gold Associates, said. "They can't just rip everything up and start all over again very easily."

Fewer Flaws In Microsoft Apps

Ironically, the third-party threat is blossoming even as Microsoft continues to get its own house in order. In 2012, out of all the known vulnerabilities in the top-50 PC programs, Microsoft products accounted for only 14% of them, the study found. The rest were in other software. And the share of vulnerabilities on a Windows PC coming from third-party applications has been growing. In 2007, they accounted for 57% of the security flaws, compared to 86% last year, Secunia says.

"It's well known that they [Microsoft] have put great efforts into improving security of the operating system and the applications that they provide," Stengaard said in an interview. "What we're seeing is the long-term involvement and dedication is now paying off."

Windows, Office, Silverlight and other Microsoft products are not ironclad, of course. Given enough time, knowledgeable hackers can find their way in through these channels. But in the world of cybercrime, most hackers are not interested in a challenge. Instead, they look for the easiest way to break into as many PCs as possible, to enslave the machines into the many armies of remotely controlled botnets, or to steal credit-card numbers, social-security numbers and corporate intellectual property that will fetch a good price on the underground.

Including both Microsoft and third-party applications, the number of PC vulnerabilities has dropped by 5% since 2011, and by 10% among the top 50 applications. Since 2007, though, overall vulnerabilities are up 15%, Secunia found, and that jumps to a whopping 98% increase among the top 50 applications.

Where The Danger Lies

Applications most likely to provide an easy path into Windows machines include Java, Flash, Adobe Reader and Apple iTunes, according to Secunia. If these applications are not kept up to date, hackers can exploit known vulnerabilities that enable them to load their malware via the PC's system memory.

In addition, all these applications have very large user bases, which makes it easier for hackers to find targets.

Why PCs have so much outdated software varies. Sometimes the update process is too cumbersome, so users don't bother. Other times, the vendor is slow in fixing flaws that hackers are already targeting. Updating Java, an open platform for running software on any operating system, has been a pain for a long time. However, Java steward Oracle is working to improve the process and is getting updates out quicker, most experts agree.

In 2012, Adobe had the worst record for updating applications, according to Secunia. The software maker released patches at a rate 80% slower than in 2011, based on the time it took the vendor to release updates of vulnerabilities reported by Secunia.

In fact, in 2012, 84% of vulnerabilities had patches available on the day of disclosure. In 2011, the number was only 72%. The most likely explanation for this improvement in ‘time-to-patch’ is that more researchers coordinate their vulnerability reports with vendors.

Patching Is Critical

The vendor based its study on 6 million PCs, mostly in the U.S. and Europe, running its freeware called Personal Software Inspector, which checks for application vulnerabilities. Microsoft products accounted for 35% of the programs on the PCs.

If you take Secunia's study seriously, then the takeaway is clear. Even if patching all your software is getting more complicated, making sure everything is always up to date is more important than ever.

Entrepreneurs have often used technology to bring us services we didn't even know we needed. Who would have thought a billion people would be willing to share their lives on Facebook and hundreds of millions more would change the news industry by microblogging on Twitter? But oftentimes entrepreneurs get it wrong and throw technology at a problem that only exists in their dream-chasing heads.

Total Boox

Such is the case with Total Boox, a digital bookselling startup founded by Israeli entrepreneur Yoav Lorch. Total Boox is scheduled to open for business this month, selling e-books on a pay-as-you-read basis. If you read a quarter of a book and decide it's not worth any more of your time, you pay only a quarter of the retail price. The way Total Boox sees it, customers win by not having to pay full price for a book they may lose interest in. Publishers win by increasing revenue through "finding more readers, the right readers."
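The pay-as-you-read arithmetic is straightforward to sketch. The function below illustrates the model as described; it is not Total Boox's actual billing code.

```python
def prorated_price(retail_price: float, fraction_read: float) -> float:
    """Charge only for the fraction of the book actually read."""
    if not 0.0 <= fraction_read <= 1.0:
        raise ValueError("fraction_read must be between 0 and 1")
    return round(retail_price * fraction_read, 2)

# Reading a quarter of a $10 e-book costs $2.50 under this model.
print(prorated_price(10.00, 0.25))
```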

"When it comes to e-books, people talk about the technology a lot but they don’t spend much time looking at business models," Lorch, a trained economist, told online magazine Publishing Perspectives. "And so the old business model of pay first read later - which makes sense when applied to physical books - has been smart and sneaky enough to creep into the world of e-books. But it doesn’t belong there."

Lorch could not be more wrong. "There are very few things I can think of that strike me as having less of a chance of being commercially viable than this," said Mike Shatzkin, founder and chief executive of The Idea Logical Company, a consultancy firm focused on digital change in the book publishing industry.

Few Benefits

People deciding whether to buy a book online can usually read a whole chapter, and sometimes more, for free. There's no evidence that people are looking to pay to sample a book. Also, given that there are lots of e-books that sell for less than $10, the amount of money saved doesn't justify the complexity of pay-as-you-read, which requires having a credit card on file to continuously pay for every page.

For publishers and authors, the benefits are even smaller. That people may not finish the e-books they buy is no reason to give them the opportunity to pay less.

"I see him solving a problem that doesn't exist with a solution that the owners of the rights are not likely to be happy with," Shatzkin says. "I think he'll get stopped by not having any content that matters before he begins."

Indeed, Total Boox has no major publishers onboard, and messages seeking comment from several of them went unanswered.

Some of the most expensive books in publishing are college textbooks, which can cost a student several hundred dollars per semester. Students would likely jump at the chance of spending less, but publishers have no reason to give them that opportunity. After all, students have to buy the textbooks in order to pass their classes. Even if the books are shared, they still have to be bought.

Publishing Isn't Dead

While e-books and the Internet have certainly caused major changes in the publishing industry, booksellers overall are adjusting. The industry's stock prices rose almost 24% year over year in 2012, easily beating the roughly 7% gain of the Dow Jones Industrial Average, according to the Publishers Weekly Stock Index. Even with industry leader Amazon removed from the index, it was still up almost 11%.

This doesn't mean the industry is without challenges. Among the biggest are shrinking profit margins due to higher discounts and falling prices. Total Boox is offering a business model that would make that problem worse. If it wants industry support, it will have to go in the other direction.

The technology industry has been excluded from the government's definition of what constitutes the nation's critical infrastructure, giving it a free pass from regulations. While this may be good for IT businesses, telecom companies like AT&T and Verizon Communications are crying foul.

Information technology is crucial to business, and according to these telecom companies, IT is just as important in securing power plants, telecommunications and water filtration systems. Which is why they want IT companies to be listed as part of the nation's critical infrastructure, something IT vendors are resisting because they don't want to be saddled with more government regulation.

This highly political situation raises many questions and, so far, offers few answers.

Obama's Executive Order

Currently, IT - think companies like Microsoft, IBM, Apple, Oracle, Cisco and more - is excluded from the government's definition of critical infrastructure, as defined by President Obama in an executive order issued last month. In directing the Secretary of Homeland Security to identify critical infrastructure at the greatest risk of attack, the order says the Secretary "shall not identify any commercial information technology products or consumer information technology services under this section."

This exclusion, the result of heavy lobbying by the IT industry, is not sitting well with telecom companies, such as AT&T and Verizon. They believe technology vendors are as important as the network operator in building adequate security to fend off cyberattacks from terrorists.

"The Internet ecosystem is far more interconnected and dependent on a host of players than it was even five years ago," a Verizon spokesman said.

Fighting Regulations

While the government battles terrorism, telecom and IT companies are trying to fend off regulations. The executive order sets the groundwork for cybersecurity legislation from Congress. So far, the IT industry has been excused, and the telecom industry wants it to share whatever regulatory burden results from current negotiations between the White House and Congress.

"The telecom community is concerned the tech industry is going to get a free pass here," David Kaut, a Washington analyst with Stifel Nicolaus & Co. told Bloomberg. "You have an ecosystem and only the network guys are going to get submitted to government scrutiny."

Telecom companies have a point when it comes to critical infrastructure. Hackers who break into the Windows computer of a telecommunications company could wind their way into control systems and shut down wireless or landline service for hundreds of thousands of people. But is regulating IT security directly the best way to prevent such a breach? I don't believe so.

Instead of more regulations, the government should focus on requirements for companies directly involved with maintaining the nation's critical infrastructure. As IT customers, these companies, which include utilities, financial institutions, defense contractors and manufacturers, are in a much better position to get the security they need built into the products they agree to buy. If an IT company such as Microsoft, Oracle or IBM cannot meet the requirements, then another one will.

"Commercial products and services often are the weakest link, but regulating them directly means imposing costs that many users won’t be able to shoulder," Stewart Baker, a partner at law firm Steptoe & Johnson and a former assistant secretary for policy at DHS, said. "So you end up imposing costs on everyone to protect a portion of the economy."

Political Talks

This issue is sure to come up during negotiations underway between the White House and congressmen supporting a cybersecurity bill introduced in the U.S. House Intelligence Committee. The bill emphasizes sharing threat information between businesses and government, while the Obama administration also wants minimum security standards set for the most critical companies.

For telecom companies to get what they want, they will have to convince the Republican majority in the House, which adamantly opposes more government regulation, to broaden the cybersecurity bill to include the IT industry. That's unlikely, so telecom and other critical infrastructure companies should be prepared to take full responsibility for securing their systems.

"IBM is the big fish in the sea and for them to make the level of commitment that they did today is a big deal," said James Staten, analyst for Forrester Research. "That's the kind of heft OpenStack needs."

The announcement is likely to send OpenStack's two main competitors, VMware and CloudStack, another open source cloud computing platform, into a battle for second place.

“OpenStack has won the race to become the standard, and it has done it rapidly,” Ann Winblad, a venture capitalist and a managing director of Hummer Winblad Venture Partners, told AllThingsD.

IBM And Open Source

IBM has conducted a long love affair with open source software. In 2000, it backed Linux and a year later committed $1 billion to the development of the operating system. IBM's support helped drive Linux into large organizations and made it a viable competitor against Microsoft as a server platform.

"IBM could have the same impact on OpenStack as it did on the Linux world," Staten said.

IBM recognized years ago that open source code fit its business strategy a lot better than proprietary technology. The company draws most of its $100 billion in annual revenue from providing IT services. By basing a lot of its own technology on code from various open source projects, as well as industry standards, IBM is able to work its hardware and software into what enterprise types call "heterogeneous computing environments" — the patched-together combinations of technology from a variety of vendors typically found in large companies, the segment of the tech market where IBM is strongest.

"IBM has really great credibility in the open source community," Gary Chen, analyst for International Data Corp., said. "They really do understand open source."

IBM's First OpenStack Product

IBM followed its announcement with the introduction of its first OpenStack-based product, SmartCloud Orchestrator. SmartCloud is the brand name for IBM's platform for running cloud installations in customers' or IBM's data centers or in a combination of both. Orchestrator is a service customers use to configure the computing, storage and networking resources for cloud applications.

One unanswered question is how IBM will integrate its current SmartCloud code base with OpenStack. In an interview with NetworkWorld, Robert LeBlanc, a senior vice president of software for IBM, waxed mystical in describing how Big Blue will handle the transition.

"We're on a continual journey," LeBlanc said. "But we think this is a major step in that journey."

Cloud Standards

IBM clearly wants to influence OpenStack's technological direction and efforts to develop industry standards for cloud computing, which is still a relatively immature architecture. IBM has formed a 400-member Cloud Standards Customer Council to help push other tech vendors in a direction favorable to IBM. The company says it has more than 5,000 customers running private clouds on its platform.

While standards are key to making different technologies work together, they won't help companies make the cultural changes necessary to adopt cloud computing and make it work. Delivering applications as a Web service dramatically changes the role of IT departments and affects how employees interact with software, too.

Because of its success in professional services, IBM is in a strong position to help companies make those cultural changes, but it won't be easy. "A lot of enterprises are not ready to hear it," Staten said.

Nevertheless, the momentum in the tech industry is behind cloud computing. The public cloud service market alone is expected to grow 18.5% this year to $131 billion worldwide.

With that much money on the table, IBM plans to become a major player in the market and is betting that OpenStack can help it achieve that goal.

Cybercriminals and the mayhem they can cause have become the leading concern of security experts in cloud computing. That's the takeaway from the Cloud Security Alliance's latest poll on the top nine threats the industry faces.

Changes In Security Priorities

The nonprofit's latest survey found a reshuffling of security priorities pointing to the growing danger posed by cyberattacks aimed at stealing corporate data. Data breaches and account hijackings that were in the middle of CSA's 2010 list of top threats rose to the number one and three spots, respectively, this year. At the same time, denial of service attacks made their debut as the fifth most worrisome threat.

The CSA report is meant to give cloud service providers and their customers a snapshot of what experts see as the greatest dangers to storing data and conducting business with customers in the cloud. Fueling fears is a steady stream of break-ins at service providers and Web sites owned by businesses, government and educational institutions.

So far this year, 28 breaches attributed to hackers have been made public, resulting in the loss of 117,000 data records, according to the Privacy Rights Clearinghouse. Service providers hacked included Zendesk and Twitter. In 2012 there were 230 publicly disclosed breaches, for a loss of 9 million records. Service providers that suffered breaches included Yahoo, eHarmony and LinkedIn.

Experts agree that no organization doing business on the Internet is immune from a break-in, particularly as the software tools available to hackers through the underground development community grow ever more sophisticated.

"All the vulnerabilities and security issues that on-premise, non-virtualized and non-cloud deployments have still remain in the cloud," Lawrence Pingree, analyst for Gartner, said. "All that cloud and virtualization does is enhance the potential risks by introducing virtualization software and potentially mass data breach issues, if an entire cloud provider’s infrastructure is breached."

Hackers Not The Only Threat

Surprisingly, the second greatest threat in CSA's latest list is data loss not from cybercriminals, but from cloud service providers themselves. Accidental deletion happens more often than a lot of people may think.

In a survey of 3,200 organizations released in January, Symantec found that more than four in 10 had lost data in the cloud and had to recover it from backups. "It's really kind of astounding," Dave Elliott, a cloud-marketing manager at the storage and security company, told Investor's Business Daily.

Whether the cause is hackers or a service-provider snafu, losing data damages the reputation of all parties involved, customer and service provider alike, no matter who is to blame, Luciano "J.R." Santos, global research director for the CSA, said. The potential financial impact of losing customer trust is why data loss is so high on the threats list.

"It's your reputation," Santos said. "A lot of folks are saying these are the things that if it happened to me or if it happened to me as a provider, they would have the most impact to the business."

The fourth top threat, according to the CSA, marks an improvement in internal security. In 2010, insecure application programming interfaces were the second greatest threat listed by experts.

APIs are what customers use to connect on-premises applications with cloud services, as well as to manage the latter. While the technology is improving, the fact that it remains on the list indicates that cloud service providers still have a ways to go in locking down their APIs.

The Bottom Four

The remaining top threats, in order from number six, are malicious insiders, abuse of cloud services, insufficient planning for how to use cloud services and vulnerabilities that may arise from the way a cloud provider architects its infrastructure so it can be shared among many customers.

Abuse of cloud services refers to hackers who rent time on the servers of cloud computing providers to perform a variety of nefarious acts, such as launching denial-of-service attacks and distributing spam. This threat, along with the others in the bottom four, ranked higher in 2010.

Overall, I see this year's list as a mixed bag for cloud security. While some areas show improvement, data protection needs to get a lot better. Gartner predicts public cloud services will reach $206.6 billion in 2016 from $91.4 billion in 2011. That much growth won't happen unless businesses are comfortable with data security.

Free iPhone and iPad apps from Apple's App Store pose a greater privacy risk than free apps from Google Play. That's the finding of the latest study by Appthority, which is in the business of evaluating mobile apps for companies.

Why the App Store Loses

On the surface, the Appthority study — released Tuesday during the RSA security conference in San Francisco — appears to find iOS and Android apps equally culpable in privacy violations. Of the 10 top-selling apps the firm tested in each of five categories, 60% of the iOS apps shared data with advertising and analytics networks. So did 50% of Android apps.

A closer look, however, revealed that iOS apps were far leakier than their Android counterparts. A full 60% of iOS apps gathered your location data, 54% vacuumed up your contact lists and 14% siphoned information from your calendar. With Android apps, those percentages were 42%, 20% and zero, respectively — not exactly laudable, but certainly an improvement over the performance of Apple apps.

Encrypting user data was not a big priority for apps on either platform. All of the iOS apps sent unencrypted data to ad networks, while 92% of Android apps did the same.

Appthority says iOS apps fall short because ad networks are willing to pay more for user data from Apple devices, giving developers a greater incentive to gather and hand over as much information as possible. At the same time, there are more developers making iOS apps, so they have to work harder at making a buck — and that apparently tempts some to compromise on privacy.

"Developers are struggling to monetize, because it's hard to run a company giving apps for free or selling apps for 99 cents," says Domingo Guerra, president and co-founder of Appthority. "So, in turn, they use the ad networks to try and get money, and the ad networks will pay more money if the developers share more data on the users."

The Overall Numbers

Appthority tested business, education, entertainment and finance apps, as well as games. Entertainment apps were the worst when it came to user privacy. This category had the highest number of apps that tracked location and shared data with ad networks. Education and finance apps posed the smallest threat — relatively speaking, at least — to user privacy.

Individual developers built roughly 80% of the apps tested. Companies with iOS apps in the study included Apple, Intuit, Kids Games Club and PayPal. On the Android side, the companies included Imangi Studios, Intuit, PayPal and Intellijoy.

Appthority's last report was in July 2012, when the apps tested posed a slightly higher risk to user privacy. However, the study was done differently. It analyzed the top 50 free apps in each platform, regardless of category.

Last year's study also showed iOS apps gathering more user data than Android apps, though less than iOS apps this year.

The Trend

Guerra predicts the next Appthority study in three months will show a decline in risky app behavior, thanks to recent government crackdowns on online privacy abuse.

This month, the Federal Trade Commission announced an $800,000 settlement with social networking start-up Path, which was charged with uploading users' address book data without permission and gathering personal information on several thousand children without parental consent.

While prosecuting scofflaws can be a deterrent, sometimes the best way to protect privacy is to pay for an app, rather than hunt for something similar that's free. In general, paid apps gather less user data than free apps, Guerra says. "Your privacy is worth more than 99 cents, so just buy the app."

Mobile device manufacturers should pay close attention to a recent settlement between the Federal Trade Commission and HTC, which the commission claimed had failed to protect customers' privacy and personal data. Rather than affecting only HTC, the agreement is a warning that the commission is finally prepared to hold device makers responsible for securing their products.

How It Started

HTC drew the attention of the FTC by deploying customized software in 22.5 million Android devices that allowed third-party applications to bypass a security mechanism requiring user permission before installation. The HTC software was meant to gather data only to help the manufacturer troubleshoot problems, but its implementation showed HTC was clueless when it came to security.

In investigating HTC's sloppy work, the FTC found a number of poor security practices. For example, HTC had no effective program for assessing the security of products before shipping them to consumers. In addition, engineering staff was not properly trained in security and privacy, and there was no testing for security flaws. Also, there was no process for receiving and addressing vulnerabilities found by third-party researchers and academics.

The FTC's findings were listed in a complaint that HTC settled by agreeing to a "comprehensive security program" that includes patching vulnerabilities that could be exploited by hackers and spammers. The agreement is a big deal, because taken together with the original complaint, the FTC has outlined for all device manufacturers what it considers best practices for security.

"To settle the case - the FTC’s first against a device manufacturer - HTC has agreed to a far-reaching settlement that imposes a first-of-its-kind remedy: patching vulnerabilities on millions of mobile devices," FTC senior attorney Lesley Fair wrote in the commission's Bureau of Consumer Protection blog.

Dismal Android Security

Makers of Android smartphones and tablets have created a huge security problem by shipping devices with older versions of the operating system and then failing to quickly update the software with the latest security fixes from Google. This has left millions of customers with devices that contain known vulnerabilities that cybercriminals are working feverishly to exploit.

"It's reasonable to assume that the next thing the FTC will look at is the unpatched vulnerabilities in Android itself that Google has fixed, but where the fixes haven't reached end users either because of the handset vendors or the wireless carriers," Christopher Soghoian, principal technologist for the American Civil Liberties Union, said. "This is probably the most interesting FTC case to come out in the last couple of years."

Android malware is growing substantially faster than any other category of Internet-delivered malware, according to Cisco's 2013 Annual Security Report. At the same time, cybercriminals are developing better software tools for breaking into Android devices.

In October 2012, the FBI warned that cybercriminals had built a mobile version of FinFisher, commercial spyware sold to law enforcement and governments, to steal personal data from Android phones. Also last year, the first Android botnet was discovered on the Internet, according to Cisco. A botnet is a network of compromised devices used to distribute malware and spam.

The FTC Isn't Alone

The FTC won't be alone in demanding better consumer protection from device manufacturers. Rep. Ed Markey (D-Mass.), co-chair of the bipartisan Congressional Privacy Caucus, plans to reintroduce this year "The Mobile Device Privacy Act," which would require companies to get consumers' permission before using any monitoring software on mobile devices.

“With this important settlement, the FTC has sent a strong signal to the mobile marketplace that consumers’ sensitive information must be safeguarded,” Markey said.

With so much government attention on mobile device security, it's clear that manufacturers can no longer treat data protection and consumer privacy as an afterthought. Both will soon have to become a top priority.

Recent reports of Chinese cyberspying have revealed hacking operations with a shocking scale and level of sophistication. China's hackers appear to be stealing massive amounts of intellectual property and proprietary information from U.S. companies, including those connected to the nation's critical infrastructure, such as waterworks, the electrical power grid and oil and gas pipelines. A recent study by security company Mandiant has shown that, in all probability, some of the snooping has been done by an arm of the Chinese military.

The revelations of China's misbehavior have led some writers to rashly declare that the U.S. is at war with our Asian rival, at least in cyberspace. This could not be further from the truth, and here's why.

There's No War

First, something obviously needs to be done to punish China for its thievery. But describing the current state of affairs as war, or cyberwar, stirs emotions at the expense of rational thinking. We are not at war with China, either in or out of cyberspace.

A real cyberwar would start with an attack that destroys something valuable or vital, kills people, or both. If the recipient labels the strike an act of war, then the time for negotiations is over. "Reacting diplomatically and legally to an act of cyberwar is inadequate," says Stewart Baker, a partner at Steptoe & Johnson and a former assistant secretary for policy at the Department of Homeland Security. "It's an act of war, we need to treat it as such and respond with our own acts of war."

An example of a true cyberattack was the Stuxnet malware that destroyed centrifuges in Iran's nuclear facilities. Discovered in 2010, Stuxnet was designed by the U.S. and Israel, according to media reports.

We are not under attack by China. The country is not our enemy. It is our economic and political rival. There is no evidence China wants to destroy anything. What it wants is information that provides a trade advantage, and at the moment there's no better way to get data from U.S. competitors than to let your spies loose on the Internet.

Most experts assume the U.S. also hacks China's computers to gather intelligence. The Brookings Institution, a Washington think tank, has identified two growth areas in the U.S. defense industry: drone manufacturing and the development of malware capable of exploiting software vulnerabilities not yet known to the developer.

Governments have always spied on each other, so it's no surprise that China, the U.S. and many other countries are using the Internet to steal information. Where China goes too far is in hacking U.S. companies. By law, the U.S. government cannot break into the computers of private companies for the sole purpose of taking intellectual property. China has no such restrictions.

What We Can Do

So the U.S. is within its rights to use every diplomatic, political, legal and economic tool at its disposal to pressure China to stop hacking private companies – or to at least negotiate an informal agreement that sets limits. While it's true China holds $1.2 trillion in U.S. debt, the U.S. is also the biggest buyer of Chinese goods. The U.S. is not without leverage here.

The Obama administration has already put China on notice. On Wednesday, the White House released its strategy for preventing the theft of U.S. trade secrets. The plan includes ratcheting up diplomatic efforts and making prosecution of foreign companies a top priority.

Such pressure could eventually lead to informal agreements that start small and grow in scope as trust builds. A starting point for the U.S. and China could be a ban on the destruction or disruption of critical infrastructure or technology driving the global economic system.

In the past, nations have reached understandings governing maritime transportation, air transport, the behavior of navies and international trade well in advance of formal treaties on these subjects, according to a recent paper by Richard Clarke, a former White House adviser on cybersecurity and cyberterrorism, entitled "Securing Cyberspace Through International Norms." For example, the U.S. and Russia are in discussions to establish a cyber hotline in order to prevent cyberspace activity from escalating into a conflict.

In the meantime, the U.S. should move much faster to adopt regulations for securing critical infrastructure and corporate networks. A good start would be passage of the Cyber Intelligence Sharing and Protection Act (CISPA), which would establish rules for sharing cyberthreat information between private industry and government agencies. Such information is important in strengthening defenses.

Eventually, China and the U.S. will draw lines in cyberspace that neither will cross. To get there, we should avoid nonsensical discussions of war that paint China as the enemy, and look for areas of agreement from which we can move forward.

In 2010, Cisco Chief Executive John Chambers told reporters at the company's reseller conference in San Francisco, "We don't focus on other companies. We focus on market transitions."

The statement was a half-truth. Chambers should have said companies other than Microsoft.

On Monday, the eve of Microsoft's first Lync User Conference, Rowan Trollope, general manager of Cisco's Collaboration Technology Group, published a blog post explaining why Lync was inferior to Cisco's platform for unified communications and collaboration.

"I'm quite sure some of it will generate controversy but that's OK - it's a conversation worth having in our opinion," Trollope writes.

But as sometimes happens when brands or political campaigns "go negative," the whole thing is blowing up in Cisco's face, as analysts point out the weaknesses in Cisco's arguments.

The real takeaway, in fact, is that Cisco seems to be scared of what Microsoft is selling.

Cisco's Claims

Trollope's post isn't super nasty, at least not by Apple-v-Android standards. But he takes some shots at Microsoft Lync, calling it "a solution that's primarily been developed for a desktop PC user experience" and thus "less able to meet these wider post-PC requirements than one that has been designed and optimized for them from the outset."

An example of the latter, Trollope says, would be Cisco's UC&C, which is a set of integrated products, such as messaging, Internet telephony, video conferencing and data sharing. All the products are accessed through a single user interface.

Another of Trollope's criticisms is that with Microsoft, customers need to go out and buy all sorts of different devices instead of getting everything from a single vendor. "And, in our opinion, that could lead to increased complexity, cost and risk, not to mention the hours spent trying to figure out `who's on first' when troubleshooting an issue."

And finally this:

"There are other important topics that we think should also be discussed. Does your collaboration vendor have any conflict of interest with other BYOD device vendors? Can you move from an in-house deployment to a cloud-based service and get the same functionality? We would encourage you to explore these points with us and any other vendors you are considering."

This is all pretty garden-variety competitive marketing, and certainly far less aggressive than what Microsoft does with its anti-Google "Scroogled" campaigns.

Nevertheless, analysts were quick to cry foul and to point out flaws in Cisco's arguments.

Cisco's Hypocrisy

A large part of what Trollope called a "frank and direct conversation" was a "little far fetched and hypocritical," Gartner analyst Steve Blood says.

Cisco claims Microsoft's Surface tablet represented a conflict of interest, since Lync would also support competing tablets from Apple and Google. Cisco seems to have forgotten its own entry into the tablet market with Cius, which failed miserably and was pulled last year. "It wasn't worried about a conflict of interest then," Blood says.

Cisco also has other conflicts when it comes to hardware. While its UC&C products work on other vendors' systems, they run best on Cisco's Unified Computing System. And when it comes to partners offering Cisco UC&C in the cloud, its UCS server is the only hardware option, Blood says.

Trollope also claims Lync is more complex and expensive because customers need to buy phones, video equipment, voice and video gateways and networking gear from hardware partners, since Microsoft doesn't make those products, while Cisco sells its own integrated hardware and software.

Art Schoeller, analyst for Forrester Research, isn't buying Trollope's argument. "Each account is different in what they have, what they want, and what capabilities are important to them and what model appeals to them more," he says.

While Cisco arguably has a stronger hosted platform than Microsoft, Cisco's biggest resellers are also selling hosted Lync and Office 365, which is "a recognition by Cisco's partners that in some instances, the Microsoft solution is something they would want to propose in place of Cisco," Blood says.

The biggest problem Microsoft has in offering Lync in the cloud is with voice communications. In many countries, as soon as voice hits the cloud, it becomes a regulated service, much like that of a carrier. Microsoft and Cisco are solving the problem by partnering with carriers. "Currently, Microsoft promotes Lync on premise, if a customer wants deeper voice capabilities like conferencing," Schoeller says.

Cisco Feels The Competition

Cisco is going on the offensive because Microsoft is becoming a serious competitor, which is good for companies in the market for unified communications products. However, Cisco would do better to focus on customers, rather than spend time attacking the competition with "ill-prepared, and weak arguments such as this," Blood says.

Forrester certainly struck a nerve when it released a survey on Wednesday that found a majority of customers using Oracle's e-Business Suite, PeopleSoft and Siebel business applications had no interest in switching to the company's next-generation Fusion Applications. Those laggards are complicating Oracle's efforts to reverse a slowdown in application revenue, Forrester said.

Oracle's Response

In a three-page counterattack, Oracle tore into the market-research firm. "This is a speculative note based on misconceptions and wrong hypotheses," the company thundered.

Despite Oracle's ostensible outrage, its counterattack is unconvincing. The company claims Forrester did not talk to enough of its customers to back its claims, as if the firm was doing a random survey of all of Oracle's customers.

Forrester never said it was doing that kind of survey. Instead, the respondents came from 180 of the firm's contacts who were responsible for choosing IT products and had knowledge of Oracle applications. "While nonrandom, the survey is still a valuable tool for understanding where users are today and where the industry is headed," the report says.

Common sense would tell you that there are more reasons for Oracle customers to stay with the applications they have than to move to Fusion, which has a different code base. Such an undertaking is expensive, takes a long time and draws IT staff away from other pressing projects. With the older applications still being upgraded and working just fine, why would anyone want to make a major change?

(See the full text of its rebuttal below.)

The most damaging part of the survey was Forrester's finding that 65% of customers using the older business applications had no plans to switch to Fusion. Another 24% were on the fence.

Oracle complained that the survey only covered U.S. and European customers. Likewise, it noted that more than 40% of the respondents were in manufacturing, government, education and healthcare – industries it claims aren't representative of Oracle's overall customer base. For instance, Oracle cited an IDC report noting that Fusion doesn't yet fully support manufacturing operations, implying that manufacturers might reasonably be less than interested in making the switch to immature applications.

Ellison and company also moaned that many questions were phrased in a "negative way," as if that somehow disqualified the responses. Such questions included "What do you dislike most about your firm's most important Oracle applications?" and "Why doesn't your firm plan to use Oracle Fusion Applications?"

Who Do You Believe?

The report also claimed that Oracle has no clear strategy for migrating customers to Fusion. The company disagreed, saying it has always told customers they could adopt pieces of the product portfolio at their own pace and that everything – old and new – would work together.

Forrester also said that customers staying with the older applications were missing out on innovation. Again Oracle cried foul, saying that at Oracle OpenWorld last year, the company discussed future releases for E-Business Suite and PeopleSoft, as well as roadmaps for all its applications. Examples of innovation include iPad certification in PeopleSoft and new mobile capabilities in Siebel, Oracle said.

In 2006, Oracle promised customers it would always support and update its growing portfolio of business applications. By then it had swallowed PeopleSoft and Siebel Systems and it had to reassure customers that their investments were secure. Fast-forward seven years, and Oracle's promise, which it called Applications Unlimited, is coming back to haunt the software maker.

While trying to hold on to PeopleSoft and Siebel customers, Oracle was just getting started on its own Fusion Applications. The company marketed Fusion as its next-generation application suite that would bring together the best of its e-Business Suite and the software it had acquired for billions of dollars. Oracle believed Fusion would eventually spark a mass migration from the older applications to the latest and greatest from its own stable. But a new survey from Forrester Research has found Oracle was wrong, and customers are holding the company to its word and sticking with the older software with which they're familiar.

Oracle's Dilemma

By refusing to switch to the more expensive Fusion apps, Oracle customers are making it difficult for the company to grow application revenue, according to Forrester. "In recent years, Oracle's application revenue growth has underperformed both the overall software market and SAP, resulting from slowing growth in existing apps and too little revenue from its Oracle Fusion Applications," according to the report, entitled "Oracle's Dilemma: Applications Unlimited Versus Oracle Fusion Applications."

Forrester's numbers are sobering. The survey of Oracle clients found 65% had no plans to move to Fusion Applications and another 24% were on the fence. The biggest barriers were Oracle's muddled application strategy and the immaturity of Fusion, which became generally available in November 2011.

At the same time, recent acquisitions of software-as-a-service companies, such as Taleo and RightNow Technologies, are not bringing in enough revenue to take up the slack, according to Forrester. Oracle customers show little interest in trying the SaaS products, with only 11% of survey respondents interested in making the move.

Making matters worse, Oracle is in danger of losing business from some of its customers. Forrester found 29% of the companies it polled were planning to move to another vendor's SaaS product or packaged application. The main reasons for their unhappiness with Oracle were high licensing costs, high maintenance costs and difficulty in upgrading.

Impact on Oracle Customers

Oracle's lackluster revenue growth could eventually have an impact on customers. When revenue from a vendor's products is flat or declining, the company often treats the software as a "cash cow, milking maintenance revenues and cutting back in its investment in enhancing and supporting them," Forrester says.

While there are no signs that Oracle is heading down that path, the company is unlikely to let Applications Unlimited customers stay where they are forever. Oracle has sunk too much money and too many resources into Fusion Applications to let customers ignore them, Forrester says.

Oracle is expected to step up efforts to get customers to move to Fusion or its cloud infrastructure products. How the company will do that is not known, but it could decide to make life harder for companies that stay with the older technology. Forrester is advising companies to start preparing for the added pressure they will be feeling from Oracle.

For Oracle, Forrester believes it's "make-or-break time" for its applications business. The company's Fusion Applications are its main strategy for growing software revenue and defending against a number of fast-growing SaaS competitors.

Forrester believes Fusion alone won't be enough.

Oracle is likely to kick-start its growth strategy with more acquisitions, and the likeliest candidates will be fast-growing SaaS companies. In the meantime, Applications Unlimited customers should ponder their next move.

What if two-thirds of the people on the Web were invisible, secretly visiting websites and searching for products and services while leaving Internet giants like Google and Facebook in the dark about what they were doing?

With no information on those people's activities, there could be no targeted advertising for them, and much of the multibillion-dollar Web advertising market would collapse, forcing big changes in the business models of today's biggest and most powerful Internet companies.

While such a scenario won't come to pass overnight, a new study by market researcher Ovum indicates that millions of people could start "vanishing" from the Web within a few years, causing major disruptions to the Internet economy. The reason so many people may go data dark? Privacy concerns.

The Future Is Here

Signs exist today that the procession of media stories showing Internet companies compromising user privacy in favor of advertisers is having an impact on more and more people. Recent examples of the kinds of stories that erode trust over time include Facebook acknowledging to USA Today that it keeps a running log of the Web pages each of its 1 billion monthly active users visited over the previous 90 days.

Other examples include a half-dozen federal class-action suits filed against Google and Viacom, accusing the companies of collecting data on children visiting sites such as Nick.com and NickJr.com and then sharing the information. Canadian and Dutch authorities, meanwhile, have reported that WhatsApp, an instant-messaging smartphone application ranked as one of the world's top five best-selling apps, violates international privacy laws by forcing users to provide access to their address books.

The regularity of such bad behavior over the last several years has taken its toll on the reputation of the tech industry. In a survey of Internet users in 11 countries, including the United States, Ovum found that only 14% believed Internet businesses were honest about their use of people's personal data. Worse for the future of online advertising, 68% would choose to block all tracking if that option were easily available.

While Web browsers and some Internet companies, including Facebook and Google, provide tools for limiting the sharing of data, those tools are not easy enough for most people to use, and they are not marketed in a way that maximizes use, Ovum analyst Mark Little says. As a result, many people are open to alternatives, and entrepreneurs are moving in.

Examples include Personal Inc., Respect Network, Allow and Abine. This segment of the tech industry is growing, with a message that goes beyond just blocking online tracking: these companies also market personal data vaults and reputation-management services that search for and delete personal data on the Web.

These tools are not widely used today, but in time, as privacy concerns continue to grow and these products get easier to use, consumers will likely start taking control of their personal data. "It's not just about blocking because you are concerned about your data and how much people know about you," Little said. "People are also blocking because they are feeling like companies are getting rich on them. They're feeling exploited."

People's Changing Attitudes

An indication of changing attitudes toward guarding personal information can be seen in the ballooning popularity of Snapchat, which lets users send pictures and videos that self-destruct in 10 seconds. As of last weekend, the app was the third most popular photo and video app in the U.S., according to analytics company App Annie. Snapchat claims that 50 million "snaps" are sent every day.

Snapchat is popular mostly with teenagers and young adults, who often use the service to send naughty pictures. However, there is also a growing unease among young people about having a permanent social record on Facebook or Google+, Bloomberg Businessweek reported this week. It makes sense that this age group would be most sensitive to having personal information on the Web, since its members are under constant surveillance by people who can have a major impact on their lives, such as teachers, college admissions officers and parents.

The desire for impermanence and control of data is also growing among older Web users. A 2010 survey by researchers at the University of California at Berkeley found that roughly nine in 10 people aged 18 to 24 and 45 to 54 supported a proposed law that would require websites and advertising companies to delete all stored data about a person on request.

Zuckerberg Was Wrong

In 2010, Mark Zuckerberg, co-founder and chief executive of Facebook, said he believed people's attitudes toward sharing personal data had become much more open. "People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people," he said in an onstage interview with TechCrunch founder Michael Arrington. "That social norm is just something that has evolved over time."

Zuckerberg was wrong.

As people become more aware of how Internet companies are using their personal data and how it affects their lives, they will start sharing less and demand more control. When that happens, a lot of Internet companies will have to dramatically change their relationship with users or make do with a lot less data to sell to advertisers.

Hewlett-Packard is forcing its Chinese suppliers to limit their use of student and temporary labor. The new rules, reported by The New York Times Friday, indicate HP is joining Apple in taking a stand on labor abuse, as the tech industry grows increasingly concerned about being tainted by Chinese practices.

HP's Rules

HP wants to separate itself from the use of student labor in assembly factories when sudden spikes in orders lead to labor shortages. With the help of local governments, manufacturers round up high school students, vocational school students and temporary workers. Students complain that school administrators order them to do the work, which often involves long hours and low pay and has no relevance to the students' studies. As an incentive, factories will pay school administrators a bonus for sending them cheap labor on short notice.

Wisely, HP wants no part of this, and has told its suppliers that all work on its orders must be voluntary and students and temporary workers must be free to leave without repercussions. In addition, the work has to be related to a student's studies, a rule that likely will give most students a way out if factory work isn't to their liking.

By imposing the rules, HP hopes to avoid the kind of scandal that sullied Apple's reputation last year. Labor abuse at supplier Foxconn led Apple to join the workplace-monitoring group Fair Labor Association, which inspects Chinese factories making computers, iPhones and other devices for Apple.

Lessons Learned From Apple

Apple's troubles had an impact on HP, Intel and other electronics companies. Many started to look at overhauling their relationships with foreign factories and workers, according to The Times.

"The days of easy globalization are done," an Apple executive who requested anonymity told the newspaper. "We know that we have to get into the muck now."

How much impact the actions of HP or other tech companies will have on workplaces in China is unclear. Some manufacturers ignore Chinese laws on labor practices in order to meet customer demand amid labor shortages in the country.

Nevertheless, HP has decided to try to avoid having its reputation dragged through the mud of China's labor injustices.

Everyone can relate to having a boss whose expectations do not jibe with the experience of workers in the trenches. That disconnect is happening today between CIOs and IT departments struggling with Big Data.

What IT Workers Say

A survey of more than 300 IT department employees has found that the majority of those immersed in Big Data projects are failing to meet their objectives. Even worse, they are often the last people CIOs turn to for advice on Big Data and advanced analytics projects, according to the poll conducted by Infochimps and SSWUG.org, a community site for IT professionals.

Now there's certainly a self-serving element to the survey: Infochimps peddles Big Data in the cloud as a simpler way to get the job done than trying to manage it in-house. Nevertheless, the survey provides additional insight into what has been said before: that all the vendor hype around Big Data masks the fact that it's really hard, and the technology is difficult even for IT pros.

The greatest challenge with Big Data is getting at the data trapped in various business applications across an organization, the survey found. Pooling this huge amount of information is necessary in order to run the analytics that find ways to cut costs and run a more efficient business. But before that can happen, all the data has to be converted into a usable format.

"Predictive analytics and other Big Data novelties are downright sexy compared to the slog of gathering, normalizing, and cleansing data, but without clean data, your Big Data initiatives are likely to take longer, cost more, and deliver fewer benefits," Patrick Gray, president of IT consulting company Prevoyance Group, says in a blog on TechRepublic.

Another top reason for the failure of Big Data projects is overreaching. CIOs are looking for analytical platforms that meet an entire organization's needs. "Unless they understand specific use-cases first, many will find such an approach falls short," Infochimps CEO Jim Kaskade says.

Other headaches for IT workers are a lack of expertise and not being given enough time. "Big Data is complex, and projects often take longer than planned due to education demands and challenges related to new technologies, corporate culture and integration," the survey says.

Big Data Versus Reality

Infochimps is not the first to get the Big Data rundown from IT pros. Gartner research director Svetlana Sicular wrote a blog post last month saying companies' view of Big Data was headed from the "peak of inflated expectations" into the "trough of disillusionment."

The biggest disappointment was with Hadoop, an open-source framework for the heavy computational work needed with Big Data. While companies have ambitious plans for their data, they are struggling to build the analytics to deliver the results they want. "Formulating a right question is always hard, but with big data, it is an order of magnitude harder, because you are blazing the trail - not grazing on the green field," Sicular says.

Most companies taking on Big Data are not giving up. Infochimps found that 81% of the companies covered in the survey have placed Big Data and advanced analytics projects in their top five IT priorities for this year.

All the problems confronted so far are to be expected, given the immaturity of the technology. And as coaches like to say, "No pain, no gain."

Oracle's footprint in the tech industry just keeps getting bigger and bigger. Since 2005, the company has spent $50 billion on more than 80 acquisitions to extend its reach from business software to hardware to cloud computing and networking. On Monday, Oracle announced plans to acquire Bedford, Mass.-based Acme Packet for $2.1 billion. If the deal closes by July as planned, Oracle will be facing off against Cisco in providing gear for transmitting voice, video and data over the Internet.

IT And Telecom

The Acme Packet deal reflects how information technology and telecommunications are coming together in ways that are forcing big shifts in the competitive landscape. As the industry enters the post-PC era, mobile devices will increasingly dominate the enterprise and corporations will need to work with carriers and IT companies to establish better communications between employees' smartphones and tablets and business systems.

Oracle sees this blending of IT and telecom, and Acme will make the second-largest maker of business applications a player in the new market. More than 1,900 global service providers and enterprise customers use Acme's gear. The days when separate companies could provide IT and networking gear are over. Carriers and enterprises are now looking for full-service tech vendors that can provide products for building all-Internet communication systems.

While hardware still matters, software's role in driving network capabilities is growing. "Although hardware is still important in many applications to provide needed performance, software is more and more critical for both differentiating and monetizing network capabilities," Dana Cooperson, analyst for market researcher Ovum, said in a statement. "Performance without monetization is only half the equation."

Oracle is certainly in a position to provide a complete software and hardware package to carriers and other large communications and content providers. Because of its $7.4 billion acquisition of Sun Microsystems in 2010, Oracle has the hardware to carry Acme technology. As an added bonus, Acme is likely to boost Oracle hardware sales, which have been in a slump.

Oracle vs. Cisco

Oracle has been encroaching on Cisco's market for a while. Last July, the company bought Xsigo Systems, marking Oracle's move into network virtualization, a potentially game-changing technology for moving data more efficiently in and out of data centers. The deal was announced a week after VMware agreed to pay $1.26 billion for Nicira, another network virtualization startup. Oracle did not disclose terms of the Xsigo deal.

Within the area of Internet communications, Cisco, still the networking leader, has been moving aggressively. While Acme is seen as a pioneer in the technology, some analysts believe it has lost some of its mojo as Cisco and the rest of the industry turn to mobile communications. The Wall Street Journal reported that analysts for Mizuho Bank are among the skeptics.

“While we appreciate Acme’s pioneer status and historically dominant position in fixed wireline networks, we argue that the company’s current wireless strategy will achieve limited success over time,” Mizuho said in a recent note. If Mizuho's analysts are correct, then Oracle will have to correct Acme's weakness in wireless technology while also contending with increasing competition from Cisco and others.

Nevertheless, Oracle has the cash to get the technology it needs through acquisition. Cisco is in the same position, so expect to see more mergers and acquisitions throughout the year as competition heats up. "Oracle and Cisco can both afford to be aggressive with M&A whereas many of their peers cannot," Cooperson said. "Expect the buying spree to continue."

[Update: On Thursday, Uber announced that it had reached an agreement with the California Public Utilities Commission "confirming that Uber is legal in California and suspending the prior complaints and fines levied against the company." Uber also said its agreement "states that ride-sharing — or rides provided by drivers not specifically licensed to drive a limousine or taxi — is legal too."]

California regulators have shown that they are open to shaking up the taxicab industry. The state Public Utilities Commission has lifted a cease-and-desist order imposed on ride-sharing service Zimride and withdrawn a $20,000 fine. The action was taken after the CPUC reached an agreement with the company that allows it to operate while the commission considers permanent rules for services that use mobile apps to match motorists with people needing a ride.

Over the last year, e-hailing services have battled regulators, politicians and the taxi industry in New York, Washington, D.C., Chicago and San Francisco. While regulators have said rider safety is the priority, skeptics have claimed resistance has more to do with protecting the taxicab industry.

"The regulators are pushing change and they're pushing technology," says Matthew Daus, president of the International Association of Transportation Regulators (IATR), a professional association of municipal, county, state and federal transportation regulators.

Zimride operates the Lyft service in San Francisco. Its drivers are recognizable through the large, wooly pink moustache attached to the front of their cars.

The CPUC agreed to let Zimride operate for the time being, as long as the company continued with the safety measures it already had in place. Those included $1 million in excess liability insurance covering each of its drivers, who also have to undergo a criminal background check and have a clear driving record with the Department of Motor Vehicles. In addition, Lyft conducts in-person screening and vehicle inspection.

Unaffected by the agreement are Zimride rivals SideCar and Uber. (Update: Uber has now reached its own agreement with the CPUC.) Each of them still faces a $20,000 fine and a cease-and-desist notice the CPUC imposed on all three companies last November for running unlicensed taxi services.

Taxi Industry Gripes

The Zimride agreement is only temporary, and no one can predict the CPUC's final rules. However, the commission is likely to take the gripes of taxicab companies into consideration.

To the taxicab industry, whether a company owns a fleet of cars or not, if it dispatches vehicles to pick up passengers, then it’s a taxi service. As a result, the same rules and regulations that apply to a taxicab company should also apply to e-hailing services.

The IATR is sympathetic to that argument. "That's not American," Daus says of favoring one competitor over another. "Everybody should have a level playing field."

Who plays in that field will depend on how regulators eventually define a taxicab service, in light of companies like Zimride. E-hailing services do not want to be a part of that heavily regulated industry.

Don't Call Me A Taxi Service

Indeed, Zimride does everything it can to not appear like a taxi service. Instead of fares, the company collects voluntary "donations" from passengers to give to drivers after Zimride takes its cut. The company also likes to talk about the "community" of people helping it provide transportation alternatives.

Don't let all this happy talk fool you. Zimride is a business that has raised $7.5 million in funding since it was founded in 2007. Investors include Mayfield Fund, Floodgate Fund and K9 Ventures.

On Wednesday, AOL's TechCrunch reported that Zimride had closed a new, $15 million round of funding led by Founders Fund. The company has declined comment.

Investors spend money on businesses, not communities, and Zimride holds the promise of solid returns. On the same day as the CPUC deal, the company said Lyft would start operating in Los Angeles on Thursday.

If regulated as a taxi service, Zimride will find it much tougher to grow than if it is seen as only a mobile app maker that arranges rides. While the CPUC has given it a temporary green light, the potentially game-changing decisions for the taxi industry have yet to be made.

Facebook and Google deserve the criticism they have received over the years for fumbling user privacy. But the two Internet giants, along with Microsoft and Yahoo, also deserve kudos for defying police requests for users' online communications.

Defying Federal Law

The companies have chosen civil disobedience over following federal privacy standards set by the outdated Electronic Communications Privacy Act. Passed in 1986, long before the Internet became a force in American life, the ECPA does not give email and online photos, videos and documents the same protections as digital content stored on your home computer.

While police need a court-issued search warrant for the latter, a much-easier-to-obtain subpoena is all that is needed under the ECPA for content stored on the Internet. Subpoenas are often issued by prosecutors, who are much quicker than judges to give police the green light.

In criminal cases, Facebook, Google, Microsoft and Yahoo demand that police have a search warrant before handing over private online communications. The defiance stems from their belief that the Fourth Amendment trumps the ECPA.

"We believe a warrant is required by the Fourth Amendment to the U.S. Constitution, which prohibits unreasonable search and seizure and overrides conflicting provisions in ECPA," David Drummond, chief legal officer for Google, said in a recent blog post.

Google's argument has the support of the industry. "A change in technology should not mean a change in the principles behind our laws against unreasonable search and seizure," Ed Black, president and chief executive of the Computer & Communications Industry Association, said in a statement.

Legal Precedent

The legal precedent cited by the companies is in United States v. Warshak, a 2010 federal appeals court ruling that found police violated a man's constitutional rights by reading his email without a warrant, according to political newspaper The Hill. Legally, the tactic is risky because the ruling does not apply to police outside the court's jurisdiction in Tennessee, Ohio, Michigan and Kentucky.

Based on statistics provided by Google, the government is more often than not using subpoenas over warrants. In the last six months of last year, 68% of the requests Google received from U.S. government entities were through subpoenas. Only 22% were through search warrants.

How much of a fight these companies put up in resisting subpoenas is not known. Google, which is the most vocal, says it won't budge until there's a warrant ordering it to surrender a user's search query information, Gmail messages, documents, photos or YouTube videos.

Spotty Privacy Records

The bravado has a self-serving element. After all, people would be less apt to store their digital assets with Google or the other companies if those companies were not willing to fend off government abuse.

Within their own operations, Google and Facebook have been criticized often for practices that sacrifice user privacy for building a better platform for advertisers. For the next 20 years, Facebook has to undergo third-party privacy audits, as part of a settlement with the Federal Trade Commission. In Europe, Google has crossed swords with regulators over a uniform privacy policy adopted last year that allowed the sharing of user information across its services.

Because these companies act like Dr. Jekyll and Mr. Hyde when it comes to privacy, we would all be better off if Congress were to fix the ECPA and let Internet companies go back to their ad-driven businesses. Sen. Patrick Leahy, a Vermont Democrat and chairman of the Senate Judiciary Committee, has said that updating the law is one of his top priorities. Republican leaders are less gung-ho about adding a warrant requirement because of concerns raised by law enforcement.

Until these differences get settled, Google, Facebook, Yahoo and Microsoft will remain the first line of defense between users, government and law enforcement. While it's not the optimal way to protect privacy, we should be thankful that they are taking a stand.

Department store surveillance cameras are not just watching for thieves. Some are also tracking customer activity. Knowing the ebb and flow of the number of shoppers, the path they take through the store and the products they touch can provide valuable information for boosting sales. While customers may find this level of scrutiny creepy, retailers see it as survival in a low-margin, fiercely competitive business.

Customer Data For Marketing

Retailers and vendors say technology is not being used today to personally identify shoppers. Software companies such as Prism Skylabs and RetailNext blur faces or use heat maps in providing visualizations of customer goings-on. In 2010, The Global Association For Marketing At Retail warned marketers against recording or storing facial data without consent. "While technology imposes few restrictions on data collections in retail settings, marketers should safeguard consumer privacy," the group said in publishing a voluntary code of conduct for collecting in-store customer data.

The Federal Trade Commission has said it does not have a problem with gathering aggregate information on shoppers. "We would be very concerned about the use of cameras to identify previously anonymous people," Mark Eichorn of the FTC Division of Privacy and Identity Protection told Time magazine.

Not surprisingly, privacy advocates are taking a more hardline stance. For them, the use of cameras for anything but catching pilferers is wrong, because most people do not know they are being watched for reasons other than security. But do people really expect privacy when standing in an aisle and checking out a jacket? They certainly don't expect others to know who they are, but it's a reasonable assumption that others will see them handling the potential purchase.

Other Personal Data

Retailers are gathering lots of personally identifiable information today. Every time you have your loyalty or rewards card swiped, the store is recording your purchase in order to offer you future deals on the same product or something similar. Usually, the offers come via email or on the receipt, but they could also arrive by snail mail.

Loyalty cards help level the playing field between brick-and-mortar stores and online retailers, such as Amazon.com, one of the most advanced users of customer data. Amazon records every purchase for each customer, and is quick to email recommendations for products based on the customer's buying history.

In physical stores, tracking customers is another way of fine-tuning the business. Analyzing activity as a whole can lead to better decisions on staffing and on placement of product displays and ads. If online retailers track how people navigate their Web sites, why shouldn't retailers do the same in stores?

The use of video cameras to gather data for marketing and store performance is still new. A survey of 47 national and regional retailers found less than a third used surveillance to analyze shopping and buying behavior, according to the Loss Prevention Research Council. Only one in five used it to measure shelf and product placement effectiveness.

Tech Vendors And Privacy

Those numbers are expected to grow and tech vendors are starting to push the envelope in order to stand out from the pack. For example, Italian mannequin maker Almax SpA has started selling a dummy called EyeSee that has a camera embedded in one eye, Bloomberg reported. The mannequin comes with facial recognition software that can record the age, gender and race of passers-by.

EyeSee does not store any of the images. But you can see how Almax is trying to gather additional data through facial recognition, while still maintaining customer anonymity.

Tying facial recognition to a database of loyalty cardholders sounds like nirvana for retailers. Imagine identifying customers entering the store and then texting coupons or special deals on products they may be willing to try. This might even be OK, legally speaking, if people signed up for the service.

But retailers seem to be staying clear of facial recognition technology for marketing, according to Matthew Kovinsky, vice president of sales and marketing for San Francisco-based startup Prism Skylabs. "We built in privacy to our product and we haven't actually heard a ton of prospective customers pushing us to do more one-to-one identification and tracking."

While that appears to hold true today, it's hard to imagine that will remain the status quo forever.

Microsoft is reportedly weighing whether to help finance a buyout of Dell. Such a dramatic move would inject Microsoft much deeper into the hardware business, giving it the chance to help drive the kind of innovation that has recently eluded the PC industry. But it would also blow up its longstanding partnerships with computer makers.

Shaking Up The Industry

Microsoft is only in discussions to join an investor group for a Dell buyout, and has made no commitment, according to The Wall Street Journal. If Microsoft were to follow through, then its investment would likely be around $2 billion. And Microsoft refuses to even discuss the report. "We do not comment on rumors or speculation," a company spokesperson said. Computer makers are also staying mum: Asus, Lenovo and Hewlett-Packard either declined comment or did not respond.

However, there's no reason why the rest of us cannot have a little fun exploring the implications of a deep Microsoft/Dell partnership.

First of all, this would simply be another step in Microsoft's push to shake up the moribund PC industry. Microsoft has good reason to take charge of its own destiny, rather than leave it to a bunch of partners that are more Keystone Cops than PC innovators. Since the release of the first iPad in 2010, tablets have joined smartphones in eating away at PC sales. During the last quarter of 2012, shipments fell more than 6%, marking the first time in over five years that the PC market has recorded a decline during the holiday season, according to International Data Corp.

So far, PC makers' most promising response has been the ultrabook, a thin and light laptop that is the Windows version of Apple's MacBook Air. Ultrabook specs, including size, weight and battery life, are dictated by Intel, which owns the name. To date, manufacturers have been unable to produce ultrabooks at a price low enough to attract consumers. As a result, PC makers are confusing customers with cheap alternatives bearing names like Sleekbook, an HP product.

That's clearly not good enough for Microsoft, which has made a radical overhaul of its operating system with Windows 8 and launched its own tablet after computer makers failed to slow sales of Apple's iPad, which dominates the tablet market today.

Microsoft + Dell

Microsoft has apparently grown tired of this bonehead behavior, and Dell offers a way out. For starters, the computer maker has deep expertise in PC manufacturing and support, neither of which Microsoft knows how to do well.

Besides the consumer market, Microsoft could steer Dell toward being more focused on the Windows platform for the data center, rather than "trying to do everything for everybody," said David Johnson, analyst for Forrester Research. In addition, Microsoft could get access to Dell's enterprise sales force.

Finally, a Dell/Microsoft partnership could lead to "converged infrastructure" that unifies the silicon, the hardware, the operating system platform and the management and operation tools. Such systems, which are a relatively new trend in the enterprise market, enable companies to use the same infrastructure for multiple purposes, such as running applications and storage. "It would be a potential game changer," Johnson said.

Of course, other computer makers would be unlikely to welcome Microsoft working so closely with Dell - and perhaps giving it preferential treatment. However, Microsoft has already become a competitor with the Surface tablet, and largely gotten away with it.

Microsoft's hardware partners would likely have even bigger issues with a Microsoft-Dell hookup, but there's not really much they can do about it. "Once you decide you're competing with the OEMs (original equipment manufacturers), you're competing with the OEMs," said Michael Cherry at Directions on Microsoft. "You're already there. The investment just makes [Microsoft] a bigger OEM."

No one knows whether Microsoft will actually structure a deal to make Dell a premier partner. But if it does, then Microsoft will effectively redefine its relationship with PC makers, creating a far different industry than exists now. And that could be a very good thing.

Best of all from Microsoft's perspective, investing in taking Dell private could give it most of these benefits for less than $3 billion, not the $24 billion and major complications a complete buyout might cost, if Dell were even for sale.

Ruthless competitiveness is what Oracle Chief Executive Larry Ellison uses to win in business. So no one should be surprised that how he defines the cloud depends on what's needed at the time. Inevitably, this sometimes shows the emperor has no clothes, or at least is down to his Armani skivvies.

While something in Oracle's massive portfolio may fit the industry definition of a cloud service, it is not the company's new integrated hardware and software bundle that's meant to provide the infrastructure for private clouds, according to David Linthicum, chief technology officer and co-founder of cloud consultancy Blue Mountain Labs. What Oracle is really selling, or in this case renting, is preconfigured application servers for the data center.

"Now we know how Oracle is addressing this shift in the market: by renting its stuff and calling it a cloud," Linthicum says.

Oracle's Cloud Strategy

Oracle's cloud strategy has been challenged before. Charles Babcock, who has covered the cloud for years for IT trade magazine InformationWeek, named Oracle the number one "cloud washer" of 2011. That term refers to companies whose cloud products are mostly old technology with the word cloud added to the name.

At the time, Babcock took aim at the Oracle Exalogic Elastic Cloud, "a name that contains so many contradictions of the definition of cloud computing that it threatens to render the term meaningless."

"It's an old-fashioned appliance that's been renamed 'a cloud in a box,'" he wrote.

Oracle Can't Move Fast Enough

Multi-billion-dollar companies like Oracle can't simply turn a switch and re-architect their technology to meet customer demand for something as dramatically different as cloud computing. This is why Ellison refused to acknowledge the cloud even existed until Oracle OpenWorld in September 2010. With startups eating away at his software business, Ellison needed to do something fast. Confusing customers with verbiage was the quickest way to buy time.

Fast-forward two years to this year's Oracle OpenWorld conference in San Francisco, and cloud became Ellison's favorite word to describe, essentially, renting Oracle data center technology and letting the company handle the maintenance. Instead of real cloud computing, Ellison introduced what was mostly a new way of buying last-generation technology.

What's In Store

This year should get really interesting. Oracle is expected to start feeling even more pressure from true public cloud providers, such as Amazon Web Services, Google, Microsoft and Rackspace, as well as private cloud providers like Eucalyptus. If history is an indicator, then Ellison is likely to release the attack dogs in his marketing department to try to discredit rivals and confuse customers, while the company plays catch up.

For example, Oracle responded last year to tough competition from IBM in the hardware business by releasing ads with unsubstantiated claims that Oracle servers were much better. Three times the National Advertising Division of the Better Business Bureau pressured Oracle into removing the ads that appeared as full-page spreads in The Wall Street Journal and other publications.

The last ad, pulled in November, claimed Oracle's Exadata server would run five times faster than IBM's Power Server "or you win $10,000,000." The NAD found that the ad did not provide "any speed performance tests, examples of comparative system speed superiority or any other data to substantiate the message."

Ellison is sure to use this kind of bare-knuckles competitiveness in battling cloud rivals. It's in his DNA. Let's not forget that in January, Oracle Team USA, the America's Cup champion team owned by Ellison, was fined more than $15,000 for spying on the Italian team.

So cloud watchers should sit back and get comfortable. Ellison is likely to provide quite a show and a lot more smoke and mirrors.

Venture capitalists are betting that technology will play a major role in fixing a broken U.S. educational system. Accel Partners, Spectrum Equity and Meritech Capital Partners put their chips down last week with a $103 million investment in Lynda.com, which sells monthly subscriptions for training videos in business, technology and creative skills.

The investment was so important for Accel that the global firm was willing to woo the online education company for four years. For Lynda.com, which topped $100 million in revenue last year, the outside investment was the first in its 18 years of operation.

VC Excitement

What has VCs so excited is an educational system ready for technology's transformative powers. The soaring cost of higher education has made college only a dream for many high school grads, while those who borrow their way through risk landing in the poorhouse if they can't land a job. Student loan debt nationally exceeds auto loans and credit-card debt, according to Grading Student Loans, a scholarly blog published by the Federal Reserve Bank of New York.

VCs are gambling that online education startups will eventually remake the system through cheaper, more efficient alternatives to traditional options.

Last April, Kleiner Perkins and New Enterprise Associates helped Coursera make its debut with a $16 million investment. Coursera now has more than 2.3 million people using its free classes in science, business, economics, the arts and law. The coursework is provided by almost three dozen universities, including Princeton, Stanford and Duke. For now, Coursera is focused on building as large a user base as possible and plans to figure out how to make money later.

“Elite education is too expensive, and it’s available for too few,” Kleiner Perkins partner John Doerr told The Wall Street Journal. “I’m not saying accredited institutions will go away, but having great content available for free in the U.S. can transform community college education… and in the developing world as well.”

Other online schooling startups that have received millions of dollars from VCs over the last few years include Pluralsight, 2tor, Craftsy and Lore. In 2011, investment firms spent $171.8 million on education technology companies, which includes online learning, compared to $88.5 million in 2009, according to the latest figures from GSV Advisors, which tracks such investments. In the first half of 2012, the amount invested had already reached $105.3 million.

Direction Uncertain

How dramatically VC-funded startups will change higher education remains unknown. A lot of hurdles remain - with the biggest being accreditation.

To be truly disruptive, online education companies will have to deliver courses that are accepted by employers and recognized by state universities and colleges. California, often a trendsetter for the rest of the nation, believes online education can help the state lower college tuition, which has risen by double-digit percentages for the last several years.

Last Tuesday, San Jose State University signed a deal with Udacity to provide online versions of three math classes this spring for $150 each. While online courses have been part of college curricula for years, the California State University system is now trying to see if the online courses can help ease overcrowded remedial classes for local students. In other words, online education is being used to solve a real problem in California colleges: too many applicants for too few classes.

"Today our aim is to focus like a laser on entry-level classes that can be so difficult for our students," Mohammad Qayoumi, president of San Jose State, said during a public signing of the deal that was attended by California Governor Jerry Brown. "Every year nationally, 1.7 million students need remedial education. That's roughly the population of San Jose and San Francisco combined. We must innovate."

That innovation is exactly what the investments in online education technology companies are trying to create.

Having slogged through so much bad news of late, last week Hewlett-Packard marketers were quick to run to their laptops to make hay out of a closely watched market report showing that HP remained the world's top-selling PC maker. But in their rush to shine a positive light on their struggling employer, the PR folks left out the most important point: HP is fighting to stay king of an eroding hill.

For HP, Flat Is The New Up

International Data Corp. (IDC) found that HP's fourth quarter PC shipments last year remained roughly flat from the year before. But that was enough to keep it at the top with almost 17% of the market. Soon after the scrap of good news hit the Web, HP public relations went to work. "We believe HP's position as the market share leader demonstrates our ongoing commitment to deliver superior PC products and experiences across customer segments," the press release said.

Woo-hoo!

Ironically, in tooting its own horn, HP highlighted its biggest problem, which is its need to cling to dwindling markets. The IDC report found that global PC shipments fell more than 6% in the quarter and more than 3% for the year. It was the first time in more than five years the PC industry had recorded a year-on-year drop during the holiday season, according to IDC.

The reasons behind the decline are well known. People increasingly favor smartphones and tablets, both fast-growing markets where HP remains a non-player. Heck, even Microsoft, which helped to usher in the PC era, sees the PC's demise coming and is pushing tablets and smartphones as the future of computing.

But for HP, staying flat in PCs was so exciting it had to churn out a press release. That's not a good sign. But given what else is going on at the company, the temptation is understandable.

Due to management bungling over the last few years, HP has fallen ever farther behind its rivals in taking advantage of game-changing trends in the consumer and enterprise markets. The company paid a total of $24 billion for Autonomy and EDS to become a player in big data software and IT services, respectively, only to see both deals go down in flames through huge write-offs.

HP Battles Workers

Meanwhile, HP is chasing distractions when its focus should be on innovation. In Texas, HP is in a tussle with customer General Motors, which is in the process of giving HP services the boot. Eighteen employees quit HP at the same time and without notice to join GM's efforts to take its IT work in-house.

HP is asking the state court for permission to depose two of the workers, a move GM has called "retaliatory" and a "fishing expedition," according to Bloomberg. It seems HP can't understand why anyone would want to flee a company that has promised Wall Street that it will fire 29,000 employees this year and next.

Bright Spots

HP's current state is not all dark. Last week the company launched a services center for in-memory computing, an emerging technology that significantly boosts application performance by keeping all data in system memory rather than on disks. The announcement came the same day SAP said it was making all its business applications available on its in-memory database called HANA. HP plans to throw its support behind HANA and is also working on its own in-memory platform, codenamed "Project Kraken," according to InfoWorld.

Kraken-like initiatives are what HP's PR team should be crowing about, rather than the company's managing not to shrink in the cratering PC market. Chasing hot new markets - not scrambling to be the last PC vendor to avoid extinction - is the only way to change HP's image as a dinosaur.

SAP has taken a big step ahead of rivals IBM, Microsoft and Oracle with the announcement on Thursday that its in-memory database called HANA is now ready to power the German software maker's business applications. The pronouncement is sure to darken the mood of competitors, who one analyst says will need several years to match what SAP has accomplished.

What SAP has done is to provide one database that can perform both business analysis and transactions, something its rivals are able to provide only by using two databases, according to Gartner analyst Donald Feinberg. "It's the only in-memory DBMS (database management system) that can do data warehousing and transactions in the same database. That's where it's unique."

For SAP customers, HANA-powered applications can speed up the sales process dramatically. For example, today when salespeople for a large manufacturer take a large order from a customer, they may not be able to say on the spot exactly when the order will be fulfilled. That information often comes hours later, after the numbers are run separately through forecasting applications.

HANA To Power SAP's Business Suite

With HANA running SAP's enterprise resource planning applications - called Business Suite - salespeople will be able to take the order and get forecasting information in seconds. "This changes the way they do business. It really does," Feinberg said. "And that's the kind of value proposition that HANA brings to the table because of the fact that it's an in-memory database."

During a multi-site news conference in New York, Palo Alto, Calif., and Frankfurt, Germany, SAP demonstrated HANA's speed using its manufacturing resource planning software. In Palo Alto, Hasso Plattner, SAP co-founder and board chairman, said the database will eventually be available in all products, whether on-premise or in the cloud. "All SAP products will go HANA," he claimed.

Competitors Racing To Catch Up

Oracle and IBM are expected to match SAP in time. Microsoft has announced that it is working on the same technology for the next version of MS SQL. "I do believe that every other vendor is going to go in that direction, but it's going to take them two to five years to do it, which gives SAP a huge head start," Feinberg said.

SAP is expected to make HANA generally available for Business Suite over the next six months to a year. In the meantime, the technology will be available through what SAP calls "Ramp Up." That means customers can get the new product if they agree to put it into production with SAP's help. This gives the vendor the opportunity to work out any kinks and to establish a number of customer references.

HANA was first made generally available as a standalone database in mid-2011, and SAP now claims to have almost 1,000 customers. The company believes those numbers can rise even faster once HANA powers all of its products.

At the same time, SAP promises to support customers that stick with the traditional databases currently running SAP applications, whether the databases are from competitors or the company's own Adaptive Server Enterprise (ASE).

In-Memory Databases

Many companies today offer in-memory databases for a variety of tasks. The databases are much faster than traditional technology because all data is stored in system memory where it can be accessed quickly. Standard relational databases write and read to disks, which is a much slower process. (http://www.mcobject.com/in_memory_database)
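The speed difference the paragraph describes can be sketched with SQLite from Python's standard library, since it happens to support both storage modes. This is only an illustrative stand-in - HANA and other in-memory DBMSs are far more sophisticated - and the gap here comes from committing every write so each one must reach the backing store:

```python
import os
import sqlite3
import tempfile
import time

def time_inserts(conn, rows=500):
    """Insert rows one commit at a time and return the elapsed seconds."""
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
    start = time.perf_counter()
    for i in range(rows):
        conn.execute("INSERT INTO orders (qty) VALUES (?)", (i,))
        conn.commit()  # force the write through to RAM or disk before continuing
    return time.perf_counter() - start

mem_conn = sqlite3.connect(":memory:")           # entire database lives in RAM
disk_path = os.path.join(tempfile.mkdtemp(), "orders.db")
disk_conn = sqlite3.connect(disk_path)           # pages written to a file on disk

t_mem = time_inserts(mem_conn)
t_disk = time_inserts(disk_conn)
print(f"in-memory: {t_mem:.4f}s, on-disk: {t_disk:.4f}s")
```

On a typical machine the on-disk run is orders of magnitude slower, because each commit must be flushed to the file, while the in-memory commits never leave RAM.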

In unveiling HANA for Business Suite, Plattner took a dig at Oracle Chief Executive Larry Ellison, who predicted SAP's six-year effort to power its business applications with HANA would fail.

"I have to admit. I enjoy that he is not smiling," Plattner crowed. "And I know that there is a weekly meeting [at Oracle] which has the word HANA in it."

Those meetings are likely to go on for a while as Oracle and other vendors race to catch up.

Cisco grabbed some of the limelight at the Consumer Electronics Show by unveiling a cloud-based video platform for service providers like cable TV companies. A lot of hubbub was made over the technology, which would deliver movies and TV shows on any device at any time. But in all the oohing and aahing over the new product, Cisco and its partners left out one important detail: TV lovers will be paying a lot more for these services.

Videoscape Unity

Cisco's Videoscape Unity is a content-delivery platform for the living room. Software embedded in a set-top box would enable subscribers to watch content from a TV, tablet or smartphone. Cable operators would be able, for example, to deliver a Major League Baseball game on the TV while simultaneously providing stats on the players on an iPad.

In addition, the technology would be able to recommend movies and programming based on a person's TV habits, and a cloud-based digital video recorder (DVR) would let subscribers record content for viewing later from any device. The platform would also have a social media element: viewers would be able to chat with friends on Facebook and Twitter.

For the pay TV industry, Cisco's platform provides a much better business model for squeezing more money from subscribers. In general, cable operators today charge more by adding channels to packages. However, the cost of content is high and good programming is scarce. Just think of how many times you've searched dozens of channels and found nothing you really want to watch.

Changing Business Models

With technology like Cisco's, cable operators will have to worry less about the number of channels they offer and focus instead on charging more for the services they provide. For example, European cable operator Liberty Global last year rolled out its Horizon platform, which provides services similar to Cisco's product. Horizon represents a "whole brand new revenue stream," said Balan Nair, Liberty's chief technology officer.

"With our Horizon product when we launched it, we didn't have any more new channels, but we charged quite a bit more for that product," Nair said during a panel discussion following Cisco's announcement. "And it was just based on the fact that you got a whole bunch of new features and some ancillary services."

Using technology to sell more products to subscribers is behind much of the excitement over products like Videoscape Unity. But whatever money comes in won't go only to the cable operators. Deals will have to be made with movie studios and TV networks, and possibly Apple, which currently dominates the tablet market.

With Apple, Nair made it clear that Liberty Global prefers not to offer its service through an app sold in Apple's App Store. "There's a whole bunch of other ramifications associated with that, especially in the economics of delivering that content," he said. Liberty would rather use a browser plug-in to deliver programming via the web.

New Content Deals

Content providers have already put cable operators on notice that they will need to sign licensing deals to let people watch on multiple devices. New contracts will also be needed if cable operators plan to let people view programming on a tablet outside of the home.

Another issue: whose customer is the viewer - the cable company's, the content provider's or the TV network's? Also, who gets access to valuable information, such as TV habits, and how is revenue from services and advertising shared?

"Part of the challenge is how does this stuff get glued together so that it's intuitive and seamless to the user, understanding the fact that there are business models that are very important that need to exist or change or evolve," panelist Joe Inzerillo, senior vice president for multimedia and distribution for MLB.com, said. "This content, this professional content, is not created for free and that's sort of the elephant in the room. How do you get there?"

Until new revenue sharing and licensing agreements are made, it's unlikely cable operators and content providers will sign up for Cisco's or any other new platform for TV. For example, Cox is committed to Videoscape Unity, but it hasn't said whether it would include a cloud-based DVR, according to CNET.

Indeed, the use of such a service has been challenged before. In 2007, the TV and movie industry sued Cablevision for launching what it called a "networked DVR." Instead of having the video recorder functionality within the set-top box, the cable operator stored recorded programming on a remote server, reducing Cablevision's hardware cost by taking the DVR out of the box.

Cablevision won the suit on appeal in 2008, but the fight has made movie studios and TV networks even more cautious about letting cable operators use the cloud to provide content to subscribers.

In time, deals will get made as the home entertainment center evolves from static viewing to a more interactive experience. But as business models are built around services, as well as programming, consumers will have to look at their current cable TV bill and decide how much more they are willing to pay.

While Hewlett-Packard says it "continues to evaluate" the sale of underperforming businesses, the company's cash flow problem will make the shedding of assets unavoidable. So what's likely to head to the auction block? Everything from notebooks and desktop PCs to Itanium servers and tape drives - businesses that have been draining cash - could be on the market.

A Breakup Alternative

For Chief Executive Meg Whitman, selling off pieces of the crippled tech giant would be a much better alternative to breaking up the company. Whitman has opposed the latter option, starting with her decision in 2011 to nix a proposal by her predecessor, Leo Apotheker, to spin off the company's personal computer unit.

Since then, Whitman has ignored Wall Street analysts who say shareholders would be better off if the company spun off the division that sells PCs and printers from the one that sells software, hardware and services to companies.

As a less dramatic alternative, shedding the businesses that drain HP's resources would help the company make better use of its limited cash. In fiscal 2012, HP's free cash flow dropped to $6.9 billion from $8.1 billion the previous fiscal year, according to The Wall Street Journal. That's a trend that could spell trouble if not stopped. Without cash, a company will find it difficult to develop new products, make acquisitions, pay dividends and reduce debt.

Getting rid of underperforming businesses is one way to improve cash flow and avoid splitting the company. "Everybody zeroes in on printers and PCs as the things they should potentially sell, and quite frankly, there's not really a logical buyer for either of those businesses," Crawford Del Prete, analyst for International Data Corp., said. "And, those businesses generate a significant amount of cash, which Hewlett-Packard needs right now."

HP-UX And More Must Go

A more logical sale would be the Itanium server business. HP has spent a lot of money trying to drive sales of its HP-UX Unix servers, which run on that chip architecture, while the business continues to shrink. In 2010, Microsoft said it would drop support for Itanium, and Oracle said a year later that it wanted to do the same.

Another candidate for jettisoning is HP's low-end IT outsourcing business, which was included in the 2008 acquisition of Electronic Data Systems. Earnings from the services business have been falling, and last August, HP said it would write off $8 billion in goodwill from the EDS purchase.

Last year, General Motors, a major HP customer, said it would move away from outsourcing IT and take some work in-house. The announcement made industry observers wonder whether HP can handle those large-scale deals, Del Prete said.

Within HP's Personal Systems Group that makes PCs, workstations, tablets and printers, the company could sell the low-performing notebook and desktop PC businesses, which have been trumped in the market by tablets.

The low-end printer business that primarily serves consumers and small businesses could also be sold. However, printers are still used in emerging markets, so HP would be just as likely to hold off to see how profitable those markets become. "HP has a plan to drive those businesses, so I'd be surprised to see them get out," Del Prete said.

Finally, the tape drive business, which serves long-term data storage, is a candidate within the company's enterprise servers, storage and networking division. Such a low-margin business would be best left to IBM and others with larger stakes in the market.

HP likely has other losers within its product lines that it would be better off without. Whitman should act quickly to get rid of the chaff and focus resources only on the profit generators.

Hewlett-Packard has made it official. The Justice Department is indeed investigating HP's allegations that Autonomy execs tricked the troubled technology giant into paying way too much for the British software maker. In disclosing the probe in its annual regulatory filing with the Securities and Exchange Commission, HP has started the next chapter in its ongoing feud with Autonomy founder Mike Lynch - who denies duping HP.

Probe Expected

The probe was expected, given that HP announced last month it had proof that it had been conned in last year's $10.3 billion acquisition-turned-fiasco. At the time, HP said it had turned over the evidence to the Justice Department, the SEC and the U.K. Serious Fraud Office. "On November 21, 2012, representatives of the U.S. Department of Justice advised HP that they had opened an investigation relating to Autonomy," the company reported to the SEC Thursday.

HP claims Autonomy executives inflated the company's value by reporting some revenue prematurely or improperly. The alleged bogus reporting accounts for almost 60% of the $8.8 billion write-down HP booked last month on the Autonomy deal.

Ex-Autonomy Chief Executive Lynch responded to the investigation Friday by continuing to deny any wrongdoing. On a website Lynch set up to counter HP's allegations, he reiterated his complaint that HP has yet to release any details of the alleged scam. "Simply put, these allegations are false, and in the absence of further detail we cannot understand what HP believes to be the basis for them," Lynch wrote.

Details Still Hidden

HP is still keeping the details of the allegations confidential among itself, prosecutors and regulators. Thursday's filing did not provide any new details. Nevertheless, Lynch is ready to tell his side of the story. "We will co-operate with any investigation and look forward to the opportunity to explain our position," he wrote.

Throughout the claims and counterclaims, HP stock continues to get hammered. From the beginning of 2012 to Thursday, the price has fallen 45%.

Officially, the Federal Bureau of Investigation won't discuss whether or not it is involved in the case. However, an unidentified source told Bloomberg that the agency is assisting the SEC in its investigation.

While Autonomy execs are under the investigatory microscope, shareholders are blaming HP for the deal that ended up wasting billions of dollars. In the SEC filing, HP lists 10 lawsuits, including four class-action suits.

Apotheker Still Blamed

Former HP CEO Leo Apotheker, who was fired in September 2011, led the Autonomy deal as part of a plan to get HP deeper into the high-margin enterprise software business while reducing its dependence on selling low-margin PCs. Autonomy software searches, organizes and manages data within large companies.

Apotheker sealed the end of his short career with HP when he announced he was considering the sale of its PC business. Because he had no buyer, Apotheker's disclosure sent Wall Street analysts into a tizzy. To them, Apotheker appeared to lack a clear vision or roadmap for saving HP from its years of bad deals, management turmoil and strategic blunders.

Current HP CEO Meg Whitman was on the company's board when it signed off on the Autonomy deal. Nevertheless, she has distanced herself and other board members from the debacle by laying the blame on Apotheker and then mergers and acquisitions head Shane Robinson, who also left the company in 2011.

History aside, now that federal prosecutors are officially involved, the repetitive claims and counterclaims being tossed back and forth between HP and Lynch won't matter much. The companies, their customers and shareholders now have to hope for clarity in the courts, especially if charges are filed.

Apple's online retail store fell short of customer expectations over the holiday shopping season. Satisfaction with the company's website fell to a four-year low, according to ForeSee's annual Holiday E-Retail Satisfaction Index. One big problem: no customer reviews to help shoppers decide which product is right for them.

Apple's Wake Up Call

Apple's score of 80 is still considered very good, but the fact that the company did not make the list of top five online retailers should be a wake-up call. "It is a little bit of a yellow caution flag," said Larry Freed, president and chief executive of ForeSee.

Apple's score fell three points from last year because it did a poorer job helping customers wade through its growing list of products, Freed said. For example, outside of price, the difference between the iPad mini, iPad 2 and iPad with Retina display is not readily apparent to many consumers.

By comparison, Amazon topped the list for the eighth year in a row, despite having a vastly wider variety of products from multiple manufacturers. While Amazon is the equivalent of an online department store, Apple is more of a boutique shop.

Apple Avoids Social Reviews

"There are no product reviews on Apple's site," he said. "In trying to decide if the (iPad) mini is right for me or not, you're really forced to go somewhere else to get somebody's opinion."

ForeSee's index is based on more than 24,000 customer surveys collected between Thanksgiving and Christmas. A drop in the index is significant because each point, on average, represents a 14% decline in a site's growth rate in sales, according to Freed. For example, if a site had a 20% year-over-year growth rate in 2011, then dropping a point this year would cut that rate by roughly three percentage points.
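Freed's arithmetic can be checked with a quick sketch. The 20% growth rate is the article's own example; the function name is just for illustration:

```python
# Each one-point drop in the ForeSee index corresponds, on average,
# to a 14% relative decline in a site's year-over-year sales growth rate.
def adjusted_growth_rate(growth_rate_pct, index_points_lost, decline_per_point=0.14):
    """Apply the average relative decline once for each index point lost."""
    return growth_rate_pct * (1 - decline_per_point) ** index_points_lost

# The article's example: a 20% growth rate, one index point lost.
new_rate = adjusted_growth_rate(20.0, 1)
print(round(20.0 - new_rate, 1))  # ~2.8, i.e. roughly a three-point cut
```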

Apple's decline was not the worst among the 100 companies listed. The biggest drop in customer satisfaction was on J.C. Penney's site, which fell five points to 78. Among PC makers, Dell's site also dropped three points, but to a lower overall score of 77.

Like Apple, Dell's problem stems from providing too little assistance in figuring out which PC is the best fit among a wide variety of choices. "When you have too many choices, consumers tend to freeze and don't know what to do," Freed said. "Dell has that [problem]. They've got to simplify how you find what you are looking for."

Turmoil In Apple Retail Operations

Apple experienced significant turmoil within its management ranks in 2012, although it's difficult to say whether that contributed to its weaker performance in online customer satisfaction. John Browett, head of Apple's retail operations, was ousted this year after only six months at the helm. He had replaced Ron Johnson, who was responsible for Apple's highly successful retail stores. Johnson left Apple to become CEO of J.C. Penney.

Browett joined Apple from Dixons, a British consumer electronics retailer, where he had been CEO. Browett's troubles at Apple were mostly operational. His plans for cutting costs included reducing employees' hours and freezing hiring, decisions Apple later reversed.

While it would be unfair to pin Apple's latest ForeSee score on Browett, there's no doubt the consumer electronics maker needs to double down on improving its online customer experience to avoid turning the decline into a trend. Apple did not immediately return requests for comment.

Software-defined networks (SDNs) are poised to move from market hype to real technology in 2013. International Data Corp. predicts the market size of this disruptive technology will soar from barely existent this year to $3.7 billion by 2016. When a market is expected to grow that much in so little time, big technology players take notice and start shopping for what they won't have time to build for themselves. IDC has flagged five startups that will be high on shopping lists: Big Switch Networks, Embrane, Plexxi, Vello Systems and Midokura.

To get an idea of why this market is so hot, let's review SDNs and why they are so potentially disruptive. Software-defined networking places control of network resources within a software layer that sits on top of routers, switches and other physical and virtual network devices. This solves the problem of having to use a control panel for each individual device in order to configure, program or perform other management tasks for the network. Making significant changes to a network today can take one to two weeks using the standard tweak-each-router method, but an SDN holds the promise of reducing the time to a few hours.
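The core idea above, one controller programming many switches instead of an operator configuring each device by hand, can be sketched in a few lines. The `Controller` and `Switch` classes here are purely illustrative, not OpenFlow's or any vendor's actual API:

```python
# Illustrative sketch of the SDN control model: a central software layer
# pushes a policy change to every switch it manages, instead of an
# operator logging in to each device's control panel one by one.
class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_rules = []

    def install_rule(self, rule):
        self.flow_rules.append(rule)

class Controller:
    """Central software layer that sits above the physical devices."""
    def __init__(self, switches):
        self.switches = switches

    def apply_policy(self, rule):
        # One call reprograms the whole network.
        for switch in self.switches:
            switch.install_rule(rule)

network = Controller([Switch(f"sw{i}") for i in range(1000)])
network.apply_policy({"match": "tcp/80", "action": "forward:lb-pool"})
print(all(len(s.flow_rules) == 1 for s in network.switches))  # True
```

The week-long "tweak-each-router" process collapses into the single `apply_policy` call, which is the promise driving the market numbers above.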

For an enterprise market where server elasticity is becoming a hot commodity, imagine what an enterprise infrastructure could do if the network were elastic as well.

A key enabler of SDNs is the OpenFlow protocol created at Stanford University and the University of California at Berkeley, a standard now under the control of the Open Networking Foundation. Board members of the foundation include Cisco, Juniper Networks, Brocade, Citrix, Hewlett-Packard, Dell and IBM. Not coincidentally, these are the same companies that IDC believes could go on an SDN shopping spree next year.

Here's a look at each of the companies that IDC believes could be on the short lists of these vendors:

Big Switch Networks

In November, Mountain View, Calif.-based Big Switch Networks released its product suite, called Open SDN. The suite includes a controller that can sit on top of roughly 1,000 switches to handle programming and set policy-based functions. Other suite components include a network monitoring application and data center network virtualization software for automated network provisioning.

Since March 2011, the company has raised $39 million from investors, including Redpoint Ventures, Index Ventures and Khosla Ventures. Its founders are Guido Appenzeller, who worked on the OpenFlow standard at Stanford, and Kyle Forster, who was the vice president of product management at Joost before starting Big Switch. They founded Big Switch in 2010.

Embrane

Founded in 2009, Santa Clara, Calif.-based Embrane released its "heleos" software platform a year ago. The product can be used to control a variety of network services, including load balancers, firewalls and virtual private networks. Heleos, which also provides wide-area network optimization, is targeted at cloud environments, whether public, private or hybrid.

Former Cisco executives Dante Malagrino and Marco Di Benedetto founded the company, which has raised $27 million in funding. Investors include venture capital firms Lightspeed Venture Partners, New Enterprise Associates and North Bridge Venture Partners.

Plexxi

Cambridge, Mass.-based Plexxi introduced its product strategy this month. The company has built switches that can communicate directly with each other over high-speed fiber optic interconnections. This is meant to replace traditional network architectures that have an access switch communicating with other switches in order to connect two computers, according to MIT Technology Review. The use of access switches can create bottlenecks and unnecessary overhead, something Plexxi is seeking to work around.

Plexxi Chief Executive David Husak founded the company in 2010. Before Plexxi, Husak co-founded Reva Systems and founded C-Port Corp. and Synernetics. Plexxi investors include North Bridge Venture Partners, Matrix Partners and Lightspeed Venture Partners. As of June, the company had raised nearly $48.5 million.

Vello Systems

Founded in 2009, Menlo Park, Calif.-based Vello Systems has been profitable from the start. In each of its first two years, it had roughly $10 million in sales, according to Dow Jones & Company. Vello's first and only round of funding was for $25 million in 2011. The money came from financial institutions that were also Vello customers.

Vello focuses on using SDN technology for internetworking between cloud data centers. Essentially, the company provides cloud-switching software that optimizes latency sensitive applications, such as content delivery, storage replication, big data connections and cloud services. The company first focused on carrier networking and later expanded into corporate data centers and cloud service providers.

Midokura

Midokura is a Japanese startup that entered the U.S. market this year. Company Chief Executive Tatsuya Kato and Chief Technology Officer Dan Mihai founded the company in 2010.

Midokura's flagship product is called MidoNet, which is an intelligent software abstraction layer that manages the internetworking between the hardware infrastructure in an enterprise and the OpenStack cloud-computing platform used in public and private clouds. The company claims its technology reduces complexity, improves fault tolerance, optimizes network traffic and delivers higher availability of servers and services.

Settling the latest shareholder suit from Hewlett-Packard's Autonomy debacle should be easy. All HP has to do is show that it has actually built some marketable software out of the $10.3 billion acquisition. The question is, where is that software?

Stanley Morrical isn't convinced such software exists, so last week he sued HP in federal court in San Jose, Calif., accusing the company of fraud. Morrical is not buying HP's claims that Autonomy executives duped it into buying the British software maker last year through "serious accounting improprieties, misrepresentation and disclosure failures." HP has asked US and British regulators to investigate for criminality.

HP's Alleged Cover Up

HP says it will take an $8.8 billion write-off from the purchase of Autonomy, most of it because the company allegedly overpaid as a result of shenanigans in Autonomy's accounting before the acquisition. But Morrical says all of these allegations cover up HP's incompetence in failing to upgrade Autonomy's software and release it as a sellable product.

“In an effort to conceal their own gross mismanagement, fraudulent conduct and potential exposure to securities claims, HP’s officers and directors have blamed the entirety of the $8.8 billion write-down on accounting issues," Morrical's lawsuit says.

HP did not respond to a request for comment.

The suit's allegations stem from HP's announcement in November 2011 that IDOL 10, a major upgrade of Autonomy's IDOL 7, was ready. In general, IDOL software searches, organizes and manages all data within an enterprise. The upgrade included integration with HP's data analytics application, acquired that same year with the purchase of Vertica.

Where's IDOL 10?

While claiming to have IDOL 10 ready, HP actually had nothing to sell, Morrical alleges. Essentially, he claims, IDOL 10 was vaporware.

"You go out in the market and say it's available and it's not," Aron Liang, an associate at the San Francisco law firm Cotchett Pitre & McCarthy, which is representing Morrical, said. "So either they knew it and they're lying or they don't even know what they're selling, which in some ways may even be worse."

David Schubmehl, a tech analyst for International Data Corp., said he was briefed on IDOL 10 in June. However, Schubmehl says he hasn't talked to any companies using the software.

"I can't confirm that anyone is actually using IDOL 10," Schubmehl said. "However, I have had briefings about that back in June and it certainly seemed to be part of their big data offerings."

In an interview with the San Jose Mercury News, an HP spokesman declined to comment on the status of IDOL 10.

The suit also accuses HP's leadership of corporate waste and failing to meet their legal obligation to act in the best interest of shareholders.

Buying Autonomy

HP made the accounting allegations against Autonomy in November of this year, roughly a year after agreeing to buy the software maker. Other tech companies and industry experts have said Autonomy was overpriced.

The deal was brokered by HP's then-CEO Leo Apotheker, who was ousted months later and replaced with ex-eBay Chief Executive Meg Whitman.

Whitman, who was on the HP board when the Autonomy deal was approved, takes no responsibility for the purchase and believes HP shouldn't either. But shareholders aren't buying it. Others who have filed suits over Autonomy claim they are the real victims and they want their day in court.

Morrical, for his part, also wants to see some real software come out of the deal. If HP has it, the company shouldn't have any trouble showing him.

In 2012, IT growth and innovation centered on mobile devices, cloud services, social networking and Big Data. 2013 is likely to see accelerated adoption in all those areas, as many companies move from experimenting and testing to deployment.

What follows are 2013 predictions for some of the fastest growing next-generation technologies in enterprise IT. If 2012 seemed like a tumultuous year, then hold on to your hats. Next year is going to be another bumpy ride.

1. Big Data

First up is Big Data. 2013 will see companies continue to spend much more on databases and business intelligence tools to drive innovation and boost operational efficiency. Big Data technologies will have the most impact in the financial industry as well as medical and scientific research. Corporations wanting to deploy business analytics will look to those industries for guidance.

International Data Corp. defines Big Data as new generation "technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery and/or analysis." In 2010, companies spent $3.2 billion worldwide on Big Data technology.

By 2015, Big Data spending will reach $16.9 billion, representing a compound annual growth rate of 40%, or about seven times the growth rate of the overall information and communications technology market, IDC says. Because growth will outpace the supply of talent, companies are expected to look to vendors for cloud-based services that can offload much of the work from in-house IT staff.
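IDC's 40% figure checks out against its own dollar projections:

```python
# CAGR implied by IDC's figures: $3.2 billion in 2010 to $16.9 billion
# in 2015, a span of five years.
start, end, years = 3.2, 16.9, 5
cagr = (end / start) ** (1 / years) - 1
print(round(cagr * 100, 1))  # ~39.5, i.e. roughly the 40% IDC cites
```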

While software and services are expected to make up the majority of Big Data spending, companies will be spending on infrastructure at a faster rate. Spending on storage will grow the fastest through 2015 with a CAGR of more than 61%, IDC says.

2. Software-Defined Networking

On the networking side, software-defined networking (SDN) will enter the refinement process needed before products are ready for production use, according to Forrester. The maturation process will take roughly five years, as SDN components are tied together and technology is added for integration with management systems, orchestration software, hypervisor management products and networking protocols. Forrester recommends that companies prepare for industry adoption of SDN by starting to train IT staff in 2013.

3. In-Memory Computing

While carefully watching developments in SDN, many companies are expected to take in-memory computing to the mainstream with the help of vendors such as SAP and Oracle, Gartner says. "Numerous vendors will deliver in-memory-based solutions over the next two years driving this approach into mainstream use." As the name implies, in-memory computing brings data sets closer to computational engines, replacing the much slower architecture that involves pulling information from a database in a separate server. This opens up the possibility of real-time or near-real-time results from transactional and analytical applications running against the same in-memory dataset. A mouthful to be sure, but the process could mean big advancements in how fast companies can analyze and act upon the data they gather.
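The architectural point, transactions and analytics sharing one in-memory dataset rather than making round trips to a separate database server, can be illustrated with a toy sketch (the function names are made up for this example, not any vendor's API):

```python
# Illustrative contrast: in-memory computing keeps the working dataset
# in local RAM, so transactional updates and analytical queries hit the
# same structure immediately, with no trip to a separate database server.
sales = {}  # the shared in-memory dataset

def record_sale(region, amount):
    """Transactional write: update the dataset in place."""
    sales[region] = sales.get(region, 0) + amount

def total_sales():
    """Analytical read: runs against the very same dataset."""
    return sum(sales.values())

record_sale("east", 100)
record_sale("west", 250)
print(total_sales())  # 350 - visible immediately, no export/load step
```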

4. Social Technologies Drive Enterprise Collaboration

In the front office, employees' use of social networks, such as Facebook and Twitter, is driving companies to build their own enterprise social networks to give workers secure areas for collaboration and sharing data. In 2013, IDC predicts these networks will move beyond the pilot stage and into production.

Gartner sees a similar trend with enterprise app stores for smartphones and tablets. Faced with vendors limiting stores to specific devices, companies will deliver private application stores to workers by 2014. This will avoid the multiple payment processes and licensing terms that would come from using public stores from vendors.

5. Windows 8 Doesn't Get Traction

On the desktop, Microsoft is not expected to win big in the enterprise with Windows 8 until well after 2013 - if ever. Gartner says 90% of corporations will skip large-scale deployment of the latest version of the operating system through 2015. Most enterprises and their PC management vendors are not ready to deal with the touch interface Microsoft has added to its flagship product. As a result, companies will wait until support for the dramatic OS change becomes widespread in the business technology market.

6. Gamification Wins

Finally, techniques used to make online games addictive will be adapted to boost worker productivity. Measurement of performance, feedback and incentives will be used to engage employees and tie their actions more closely to business outcomes, Gartner says. The worldwide market for gamification technology and services will rise from $242 million this year to $2.8 billion in 2016. Within three years, 40% of the Global 1000 companies will use gaming techniques, a process called gamification, to improve the performance and efficiency of their business operations.

Now that Oracle plans to gobble up yet another company for its growing portfolio of cloud-based software, the question is whether the company can chew everything it puts in its mouth.

The business applications maker said Thursday that it would buy Eloqua for $871 million. Eloqua's apps measure the effectiveness of marketing and sales projects. Oracle plans to make the company's software the centerpiece of its "Marketing Cloud" product line.

The acquisition will fill out the cloud-based services offering Oracle hopes will grab business from rival Salesforce.com. Earlier this year, Oracle closed a $1.9 billion deal for Taleo, a maker of online human-resource management software, and a $1.5 billion purchase of RightNow Technologies, which makes software for managing customer service.

Oracle's Challenge

Oracle suddenly has a lot of cloud software to make work well together if it wants to give customers a reason to lease multiple applications. The task won't be easy, given that the multi-vendor software was never designed "to fit together like Legos," said Andrew Frank, an analyst for Gartner.

Taleo is a customer of Eloqua, so that portion of the integration challenge may be easier. But the extent of Oracle's success will be determined when its services go up against products from Salesforce.com, IBM and SAP. "Whether it fits together well is a question for the market to determine," Frank said.

On paper, the acquisition gives Oracle an apples-to-apples comparison with Salesforce.com, which provides integrated marketing, sales and service management applications. Oracle needs the same all-in-one portfolio, because its rival "continues to have a strong win rate against Oracle in CRM (customer relationship management)," said Richard Sherlund, a managing director of equity research for investment bank Nomura, in The Wall Street Journal.

Oracle is certainly willing to pay for what it wants. The company offered Eloqua shareholders a 31% premium over the Wednesday closing price. The transaction is expected to close in the first half of 2013.

Based in Vienna, Va., Eloqua was founded in 2000 and went public in August. Customers include Sony, Johnson & Johnson, VMware and Siemens.

Cloud-Based Marketing Apps Are Hot

Marketing applications are favorites of vendors building out cloud-based services, because the data-intensive software is a good fit for cloud deployments, Frank said. Data centers that run cloud services are built to handle lots of data and to provide the computational power needed to analyze that data.

In addition, marketing software needs to be integrated with other cloud services, such as social media. An added value is that the applications are accessible from the mobile devices now proliferating in the enterprise.

As a result, researchers like Gartner see an increase in spending on cloud-based marketing services, and Oracle doesn't plan to be left behind in the race to grab those dollars.

IBM believes technology's future lies in cognitive computing, which essentially means making computers think more like humans do. To IBM, that includes giving computers sensors that enable them to touch, see, hear, taste and smell - sensory input as one more piece of the puzzle to help solve problems.

IBM's Five in Five

IBM's progress toward cognitive computing is seen in the company's annual end-of-year predictions. Rather than its usual practice of prognosticating on where five technologies will be in five years, this year's Five in Five focuses on innovations that make it possible for computers to experience each of the five senses.

The projections mark the very beginning of what will be a long journey toward cognitive computing. The first step in building machines able to behave, think and interact like humans is to give them the same sensory abilities. That way computers can understand their environment, learn from it and act upon it. For example, if a robot could hear a train's whistle and feel the vibration on the tracks, it might be able to figure out that a locomotive is coming and get out of the way.

The Five Senses

Touch: IBM expects smartphones and tablets to communicate using haptics, nonverbal communication that enables people to experience how an object feels. Haptic feedback is already used for many things, including providing tactile sensation when typing on a glass keyboard, but that's only the beginning. Eventually, devices pointed to an ecommerce site could vibrate to simulate the feel of a fabric's weave, for example.

Sight: Vision will get an upgrade, too. IBM believes computers will be able to identify images and understand what they mean without the use of tags. This will lead to systems that can help doctors analyze X-ray, magnetic resonance imaging (MRI), ultrasound and computerized tomography scans.

Hearing: There will also be improvements in computers' ability to hear and understand sound. Greater sensitivity to sound pressure, vibrations and waves could lead to more-accurate landslide warnings, for example.

Taste: Computers with virtual taste buds will be able to calculate flavor, according to IBM, helping chefs improve recipes or create new ones. The systems will break down ingredients to their respective chemicals and calculate their interactions with neural sensors in a person's tongue and nose.

Smell: And, finally, according to IBM, computers will have an acute sense of smell, able to diagnose an oncoming cold, liver and kidney disorders, diabetes or tuberculosis from a person's breath. Similar to how a Breathalyzer detects alcohol, the computer will be able to check for molecular biomarkers pointing to diseases.

IBM believes it can blend the sensory innovations in computing with mobile devices, cloud computing and social media to create "an unbounded set of possibilities in terms of what we can do," Kerrie Holley, an IBM research fellow, told ReadWrite.

How We'll Use Cognitive Computing

In time, cognitive computing will be able to do what people don't do well, such as understand the interactions of changing elements in huge systems. Examples include the global economy or weather patterns. With the help of a thinking, sensory-aware machine, we'll be able to cut through the complexity of these systems, helping us make more-accurate predictions and anticipate the consequences of particular actions.

In addition, cognitive systems can help us separate our personal prejudices and egos from the facts in trying to solve a problem.

"The machines will be more rational and analytic," Bernard Meyerson, chief innovation officer at IBM, said in a blog post. "We’ll provide the judgment, empathy, moral compass and creativity."

Think of it this way: The human-digital relationship will mirror the extraordinarily effective partnership enjoyed by Captain Kirk and Mr. Spock on Star Trek.

In 2020, the annual amount of digital data created, replicated and consumed will total more than 5,200 gigabytes for every man, woman and child on the planet, according to a new International Data Corp. report. That’s 50 times the amount of per-person data than in 2010.
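The per-person figure implies a staggering total. A back-of-envelope check (the 2020 world population of roughly 7.6 billion is an assumption here, not a number from the report):

```python
# Back-of-envelope total from IDC's per-person figure. The 2020 world
# population of ~7.6 billion is an assumption, not from the report.
gb_per_person = 5200
population = 7.6e9
total_gb = gb_per_person * population
zettabytes = total_gb / 1e12  # 1 ZB = one trillion GB (decimal units)
print(round(zettabytes))  # ~40 zettabytes of data created per year
```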

Once it’s consumed, almost all of today’s raw information effectively vanishes into the overall ocean of data. Yet within the data are tidbits of facts on customers, suppliers and business operations that, if linked, could prove useful or even profitable. Seeing the potential, some businesses are sizing up the trove – the data they control and others’.

We are on the cusp of the data wars.

Developing high-value, competitive information from one’s own data will be expensive enough, but much of the data that today is shared freely as barely interesting dross will command a price. Formidable data walls will separate foe and ally alike.

Consumers will try to wall off their data too. As the value of their personal data rises, they will demand more compensation in the form of services if nothing else. This will open up new avenues of competition between companies.

Early Skirmishes Among Online Firms

Internet companies are becoming ever more protective of (some would say paranoid about) their customers’ data, keeping it from competitors. This month, Instagram, which is owned by Facebook, took steps to prevent photo-sharing on Twitter. Facebook and Google have been scuffling over data sharing for two years.

Because member information is the lifeblood of these ad-serving companies, it is at the forefront of this battle. Where once data was hoarded to appease a firm’s chief counsel or government regulators, it now has intrinsic value in predicting market trends and finding hidden inefficiencies.

As a result, virtually all industries see data-sharing policies tightening and pay walls becoming more common.

Data As Currency

Dave Reinsel, an IDC analyst and co-author of the EMC-sponsored “Big Data” report, said, "Anytime you have a form of currency [in this case, data], you start to find behavior like this. People begin to collect it. They begin to associate value with it."

Consumers, for example, may ignore the pact they sign with an online business that gives it the right to trade their information, but the business doesn’t. When you join Facebook, for free, the social network sells your personal information. You get to use Facebook, and your use generates more data for Facebook to sell. The same deal exists between account holders and Twitter, Google and uncounted others.

These companies have lots of leeway in the use of this data, but that is changing as people stake claims on their data. The European Union has adopted a "right to be forgotten" law that allows Europeans, at least, to permanently delete personal information from any site, assuming there are no legitimate grounds to keep it.

Companies using Facebook today to interact with customers are expected to start building their own social networks in order to keep all the accumulated data for themselves. Businesses in the same industry could form consortiums to combine data in order to define industry-wide trends. Innovative models of collaboration based on data are expected to arise between companies to create new assets.

What's Needed?

Before many of these changes can take place, a lot needs to happen.

Today, 25% of the data floating about the digital universe might be valuable if analyzed, IDC says. Yet, only 0.5% of the universe is actually analyzed.

By 2020, IDC predicts, a third of the data in the world will have value, but only if it's tagged and then analyzed. Tagging data streaming from surveillance cameras, medical devices, social media and everyday transactions in a company's computers would be hard work, to grandly understate the idea, and would require a big investment (another sizeable understatement) in technology. Without it, that data gold will not be mined.

Companies are building the infrastructure necessary for data collection, analysis and the presentation of analytical results. IDC predicts spending on the necessary hardware, software, services, telecommunications and staff will grow by 40% between this year and 2020.

Who knows? That might seem like a bargain when CEOs in 2020 sit down to count coup.

Not surprisingly, IBM execs believe they will soon be selling servers and supercomputers that process data much faster than today's technology allows. But they’re talking about much, much faster.

Rather than use electricity to move data between processors, IBM is preparing to use light, giving high-end customers enough power to keep pace with the growing information flowing through data centers.

IBM's Tiny Milestone

After more than a decade of research, IBM scientists have put nanophotonics technology onto silicon using a 90-nanometer process and integrated it with ordinary microprocessors.

That last part is critical. If IBM can integrate the new optical technology with ordinary processors, it avoids a layer of manufacturing expense and complexity, and it can make the new systems in conventional factories.

IBM is attacking a bottleneck in data centers and supercomputers. Moving data as electrical impulses - even over the comparatively small distances between chips - chokes off performance.

Solving the problem now means adding more processors, which raises costs.

Practical Nanophotonics

A proof-of-concept just two years ago, IBM’s nanophotonics system is a transceiver that receives electrical impulses from one chip, converts them to staccato light signals, and transmits those to another transceiver sitting alongside another processor. The receiving transceiver reverses the process.

Even with the conversions, the process creates a much higher chip-to-chip data flow - 25 gigabits per second per channel - and can also be used for chip-to-memory communications, another common bottleneck.

Just as with the far bigger fiber-optic systems used for global telecommunications, each beam of light can carry multiple data streams, each on a different wavelength of light. While not a feature of the current prototype, this would make it possible to send terabytes around otherwise conventional electronics with almost literally blinding speed.
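The aggregate-bandwidth arithmetic behind wavelength multiplexing is straightforward. In this sketch, only the 25 Gb/s per-channel figure comes from the article; the channel count is a hypothetical chosen for illustration:

```python
# Rough wavelength-multiplexing arithmetic. Only the 25 Gb/s per-channel
# figure is from the article; the number of wavelengths is hypothetical.
gbps_per_channel = 25
wavelengths = 40          # hypothetical wavelengths carried per beam
total_gbps = gbps_per_channel * wavelengths
terabits = total_gbps / 1000
print(terabits)  # 1.0 terabit per second per beam, in this scenario
```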

"The technology is very universal, very flexible and versatile," Solomon Assefa, a nanophotonics scientist for IBM Research, said.

Short, Sharp Signals In Data Centers

IBM is preparing to put nanophotonics in servers, data centers and supercomputers, Assefa said. The technology is also expected to play an important role in exa-scale computing, a benchmark expected to be reached before the end of the decade. Today's supercomputers are measured in petaflops, equal to 1 quadrillion calculations per second. An exaflop is 1,000 times faster.
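The scale jump the petaflop-to-exaflop comparison describes, in plain numbers:

```python
# A petaflop is 1 quadrillion floating-point operations per second;
# an exaflop is a thousand times that.
petaflop = 10**15
exaflop = 10**18
print(exaflop // petaflop)  # 1000
```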

There will be millions of transceivers needed for an exa-scale system, he said.

Martin Reynolds, an analyst for Gartner, said IBM's technology appeared flexible, but noted that it's destined to remain in high-end systems for a while.

"The device has to have optical power [i.e. light] fed to it, as there is currently no practical way to generate light on the silicon," Reynolds said. "This demand requires sophisticated and expensive packaging techniques that, for now, constrain these devices to high-end systems."

So it will be some time before nanophotonics makes its way into laptops or tablets. In the meantime, IBM hopes to beat competitors by getting it into data centers and supercomputers first.

IBM has launched a cloud-based office-productivity suite called SmartCloud Docs that some industry observers say is a competitor to Google Drive (nee Google Docs).

But the comparison is apples to oranges. OK, well maybe Fuji to Granny Smith.

Different Ends of the Market

While Google Drive started with consumers and gradually became a business product, IBM’s focus here is the enterprise.

SmartCloud Docs is only a small piece of the company's corporate collaboration services, called SmartCloud for Social Business. SmartCloud Docs adds the ability to create documents, spreadsheets and presentations.

"If customers, the press, or analysts position this as going toe-to-toe with Google Docs, or frankly, for that matter, parts of [Microsoft] Office 365, they're missing the fundamental differences between the vendors," Tom Austin, an analyst for Gartner, says. "IBM is into solution-selling."

That means IBM’s quarry isn't a department head or branch office; it wants to sell Social Business to an entire organization to tackle a core business operation. Picture, for instance, a medical research facility looking to collaborate with hospitals in other countries.

"What it's really doing is trying to penetrate the boardroom and the [C-suite]," Austin said.

Social Business also includes features for creating and managing blogs and wikis, a cloud version of the Lotus Notes email client, calendaring, archiving and mobile access to services.

IBM is far behind Google in releasing a cloud-based productivity suite. Google rolled out its services in stages, starting in 2006. But the late start isn't seen as a fatal misstep for IBM: corporations adopt new technology slowly, giving the company time to catch up and adapt.

What About Microsoft?

Within the enterprise, Microsoft, which sells Office 365 as a cloud-based collaboration-and-productivity suite, is a much stronger rival to IBM.

Microsoft is easing companies heavily invested in Windows and its on-premises Office suite into the cloud by giving them the option of leaving some workloads behind the firewall, Forrester Research analyst TJ Keitt wrote in a recent report.

"Google's vision, on the other hand, is to leave the desktop behind for a cloud ecosystem based in the browser, a vision for which many enterprise IT leaders aren't ready," Keitt wrote.

A Different Take

Unlike research rival Gartner, International Data Corp. sees Google and IBM as competitors, but says IBM has taken its product a step further when it comes to helping people work together.

For example, SmartCloud Docs can analyze versions of a document and help authors reconcile the differences, said IDC analyst Melissa Webster.

"That's a new level of functionality" in that feature segment, said Webster.

Before SmartCloud Docs, such features were primarily used in code management, helping developers reconcile versions of source code. "It's a sophisticated problem to solve," she said.

Enterprise features like these, along with its focus on comprehensive technology packages, set IBM apart.

Google has its own appeal to large companies, though. It’s integrating Drive with Salesforce.com, Workday and other online apps.

"This resonates with IT shops that seek to move as many commodity application workloads into the cloud as possible," Keitt wrote.

So while SmartCloud Docs and Google Drive are similar in function, they are entering the market from opposite directions. And while they haven't met yet, the growing use of the cloud will change that.

While cloud computing giants Amazon, Google and Microsoft scramble to cut prices to lure customers to their cloud-computing infrastructures, their smaller rival Rackspace Hosting is heading in the opposite direction.

"We're not going the route of a race to the bottom," says John Engates, Rackspace's chief technology officer. Instead, Rackspace is betting that corporate customers will pay more to make the San Antonio-based company their IT department in the cloud, a strategy that's sure to face challenges as the number of competitors rises.

Rackspace's Success

Over the last several years, Rackspace has competed mostly with Amazon in the business of renting data center servers for pennies an hour to run websites and business applications. Since its initial public offering in 2008, Rackspace has grown its cloud infrastructure business to nearly 24% of revenue. The company has more than 180,000 business customers and topped $1 billion in revenue last year for the first time.

But the competitive landscape is growing more crowded. Besides Google and Microsoft, heavyweights Dell, Hewlett-Packard and IBM have also checked in for battle. Rackspace is banking that analysts are right when they say price alone isn't going to win business. Besides pitching better services at a higher price, Rackspace is touting OpenStack, an open source, cloud-computing platform it launched with NASA in 2010. Roughly 150 companies have joined the initiative, including Intel, Dell, HP, IBM, Cisco and AT&T. In fact, on Tuesday, HP released HP Cloud Compute, making it the first non-Rackspace vendor to build an offering with OpenStack (see HP Switches On Public Cloud, Thanks To OpenStack).

Rackspace and OpenStack supporters are hoping businesses will buy into their claims that the platform provides more flexibility than the various proprietary systems offered by Amazon and others. Theoretically, whatever you are running on one OpenStack implementation can be moved to any other OpenStack implementation, making it easier for customers to switch vendors.
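In practice, that portability claim rests on OpenStack clouds sharing common APIs, so the same client configuration can point at different providers. A minimal sketch of how this looks with the openstacksdk library's `clouds.yaml` file; the cloud names, region, and URLs here are hypothetical placeholders, not real endpoints:

```yaml
# clouds.yaml - one client configuration, multiple OpenStack providers.
# Switching providers is a matter of selecting a different cloud entry
# (e.g. openstack.connect(cloud="provider-a")), not rewriting the workload.
clouds:
  provider-a:
    auth:
      auth_url: https://identity.provider-a.example.com/v3  # hypothetical
      username: demo
      password: secret
      project_name: demo-project
    region_name: RegionOne
  provider-b:
    auth:
      auth_url: https://identity.provider-b.example.com/v3  # hypothetical
      username: demo
      password: secret
      project_name: demo-project
    region_name: RegionOne
```

In theory, code written against one entry runs unchanged against the other; in practice, as the critics quoted below note, differences between OpenStack deployments can complicate that picture.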

Fighting Vendor Lock-In

"The challenge with proprietary (cloud technology) is that customers feel like they're going down a path that's sort of a one way street," Engates tells ReadWrite. "They're locking themselves in and there's no way out. If they choose to go somewhere else, they have to re-architect, rebuild or retool, and that's a challenge."

Over the last several weeks, Rackspace has rolled out the complete suite of OpenStack-based products, including servers, databases, infrastructure monitoring, backup, storage and networking. "We're in full production with all these services," Engates says.

Is OpenStack Immature?

OpenStack has its critics. Research firm Gartner says Rackspace and other supporters are not really interested in building an open cloud platform. Rather, they have pooled resources in order to battle Amazon's dominance in providing cloud-based Infrastructure-as-a-Service (IaaS). They also do not want to pay license fees for commercial cloud technology, such as VMware's vCloud stack.

Rather than being ready for prime time, OpenStack is an "early-stage project whose future, though promising, is still uncertain," Gartner says.

"Some people have been led to believe that because OpenStack is open source, it is an open and widely adopted standard, with broad interoperability and freedom from commercial interests," Gartner analyst Lydia Leong wrote in a recent analysis. "In reality, OpenStack is dominated by commercial interests, as it is a business strategy for the vendors involved, not the effort of a community of altruistic individual contributors."

Big Data

Nevertheless, Rackspace is plowing ahead with plans to use OpenStack early next year in helping companies manage the huge amount of data created each day. Rackspace has partnered with Hortonworks, which is using the Apache Hadoop data platform to power products for storing, managing, processing and analyzing large amounts of data. "Big data is an area that we think is still in the early days, but there's a lot of interest and a lot of excitement around it," Engates says.

Amazon also sees money in big data. Last week, the company introduced its first big data product at the first Amazon Web Services (AWS) re:Invent customer conference. Amazon bills the platform, called Redshift, as a data warehouse as a service, and business intelligence vendor Jaspersoft has announced support for the technology.

Like Amazon, Rackspace is taking its cloud services outside the U.S. Roughly a quarter of its revenue comes from overseas, mostly from Europe. It has data centers in London and Hong Kong and plans to open one in Australia "very soon," says Engates, who didn't know the exact date. The company is also considering a data center in Latin America.

With expansion overseas, a major platform redesign and new products on the way, Rackspace will need higher prices to fund its ambitions. The question is whether potential customers will still see Rackspace as a good deal in a highly competitive market.