New York City passengers won’t see much of a difference in service. The bases are locations for administrative work, so shutting them down inconvenienced drivers who needed to visit for licensing tests and training, not riders. Furthermore, the TLC had decided to block Uber’s request to open a base in Brooklyn until it handed over trip information.

But the TLC required Uber to include vehicle license plate numbers if it wanted its bases reinstated. That means Uber will be handing that specific, non-anonymized driver information to at least the New York City government. The company told the New York Business Journal it’s doing so “under protest.”

Uber initially cited trade secrets for not wanting to give up its data. It may not have wanted local governments to be able to send their taxis to popular Uber areas at peak times, for example. But as evidenced by the blog post on giving data to Boston and other cities, the company had a change of heart, realizing that it could offer data to cities as an olive branch and potentially ease its local regulatory conflicts as a result.

If you’re a lawyer going through the discovery process, or you’re just into playing NSA with your employees’ communications, this one’s for you: Recommind, the e-discovery and enterprise search firm, just launched a new tool called Hypergraph for discovering and visualizing patterns in unstructured data.

Hypergraph, which is now part of Recommind’s Axcelerate On-Demand SaaS package, is a tool for finding and highlighting hidden connections between people, documents, messages and other elements. As with so many data visualization tools, the idea is to make it easier for even non-technical people to look at a mass of unstructured data and spot the links they need to find – such as a naughty employee’s telltale communications with the wrong people.
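Recommind hasn’t published Hypergraph’s internals, but the core idea it sells, surfacing the chain of contacts that links two entities buried in a pile of messages, can be sketched as a shortest-path search over a communication graph. The graph, the names and the `connection_path` helper below are hypothetical, purely for illustration:

```python
from collections import deque

def connection_path(graph, start, target):
    """Breadth-first search for the shortest chain of contacts
    linking two entities in a communication graph."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no connection found

# Hypothetical e-discovery scenario: who links the employee
# to the outside firm?
emails = {
    "employee": ["assistant", "manager"],
    "assistant": ["employee", "broker"],
    "manager": ["employee"],
    "broker": ["assistant", "outside_firm"],
    "outside_firm": ["broker"],
}
print(connection_path(emails, "employee", "outside_firm"))
# → ['employee', 'assistant', 'broker', 'outside_firm']
```

Real e-discovery products layer machine learning and entity extraction on top of this kind of traversal; the sketch shows only the graph step that makes the “hidden link” visible.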

Here’s the sort of visualization we’re talking about:

Of course, Recommind does have competition in this space, notably from ZyLab, EMC and IBM. However, the company is pushing the ability of Hypergraph to parse and provide insights into big data, something it does through machine learning and graph analysis techniques. That in turn makes for more informative visualizations.

As Recommind CEO Bob Tennant pitches it:

“Unstructured data represents the great majority of the information universe, and it’s by far the hardest to analyse. Data visualization in combination with data mining can produce insight that would be hard to achieve any other way. To make a significant impact, visualisation has to be massively scalable, highly automated and simple to use. Hypergraph is the first solution that has it all.”

Salesforce.com will set up its first European data center in the UK next year, the enterprise software-as-a-service firm said on Thursday.

The company has come under criticism for not having a European data center in the past, largely due to compliance issues – Salesforce is part of the EU-U.S. Safe Harbor framework, which means it’s allowed to handle European citizens’ personal data, but many customers would prefer the certainty that a locally sited data center allows. (We will be discussing such issues at our Structure:Europe conference in London on 18-19 September, by the way.)

Salesforce said last year that it hoped to open a data center in the UK in 2013, but that plan appears to have been pushed back slightly. According to a statement today, the new data center – the firm’s sixth – will be completed in 2014 in partnership with NTT Communications’ local arm, NTT Europe.

In a statement, Salesforce CEO Marc Benioff said Europe had provided the greatest revenue growth – 38 percent — for the company in the 2013 fiscal year:

“We are doubling down on Europe with the announcement of our new data centre in the UK, which will support continued customer success in EMEA.”

Robin Balen, NTT Europe’s wholesale data center business chief, added that the new facility would be “powered 100 percent by renewable energy sources.”

Innovation Challenge

Meanwhile, Salesforce has also teamed up with a group of European venture capital firms – Notion Capital, Octopus Investments and MMC Ventures – to launch a €5 million ($6.6 million) Innovation Challenge for startups.

Startups are invited to pitch their enterprise cloud apps that could run (surprise!) on Salesforce’s platform. There will be pitching events throughout Europe between September and November, and the winners will get seed funding. Apps will need to be at least in the beta stage, with demonstrable “traction, customer success and user adoption.”

“This is a unique opportunity for innovative start-ups in the enterprise app market here in Europe to receive commercial support to allow them to compete on a global stage,” Octopus principal Luke Hakes said in a statement.

Nasdaq OMX is offering a new cloud computing service for storing and analyzing financial trading data, and it’s built atop the Amazon Web Services cloud. The service, named FinQloud, comprises a regulatory data retention product called Regulatory Records Retention, or R3, and an on-demand analysis tool for trade data called Self-Service Reporting, or SSR. Given the seemingly close partnership between AWS and Nasdaq, FinQloud looks like another step in AWS’s quest to prove itself ready for enterprise workloads and might suggest more such partnerships to come.

For its part, FinQloud is about what it sounds like. According to a Nasdaq press release announcing the service:

“The platform combines AWS’s secure, flexible, and cost-effective cloud infrastructure with NASDAQ OMX’s experience in providing technology and advisory services to exchanges, regulators and broker-dealers that operate within a complex global regulatory framework. FinQloud may be used by a variety of financial services firms managing data not only from NASDAQ OMX but also from any number of sources.”

In order to meet stringent regulatory compliance requirements, all data connections to FinQloud will pass through an encryption system housed in a Nasdaq data center before making their way to AWS’s infrastructure, and use of R3 will be contingent on clients showing appropriate documentation to regulators.
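Nasdaq hasn’t detailed the gateway’s design, but the pattern described, with data encrypted inside a Nasdaq-controlled facility so that only ciphertext ever reaches AWS, resembles client-side encryption. A toy sketch, with a deliberately simplistic XOR cipher standing in for whatever vetted algorithm the real system uses:

```python
import secrets

def encrypt_in_house(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher for illustration only; a real gateway would use
    # vetted encryption such as AES. XOR is its own inverse, so the
    # same function also decrypts.
    return bytes(b ^ k for b, k in zip(data, key))

def send_to_cloud(ciphertext: bytes) -> None:
    # Placeholder for the upload to AWS storage; only ciphertext
    # leaves the Nasdaq-controlled data center.
    pass

record = b"TRADE:AAPL,100,601.10"         # hypothetical trade record
key = secrets.token_bytes(len(record))    # key never leaves the facility
ciphertext = encrypt_in_house(record, key)
send_to_cloud(ciphertext)
assert encrypt_in_house(ciphertext, key) == record  # round-trips locally
```

The compliance-relevant property is where the key lives: plaintext and keys stay inside the regulated perimeter, while the public cloud only ever holds ciphertext.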

However, perhaps a bigger deal for AWS than FinQloud itself is what it says about how AWS — and other cloud providers — might go about getting into the business of targeting individual industries, especially heavily regulated ones. Close partnerships like AWS appears to have with Nasdaq, which technically is the one selling FinQloud, could help cloud providers establish toeholds in lucrative markets such as financial services without requiring them to stray too far from their general-purpose nature. Cloud providers supply the infrastructure, a market player provides the industry expertise, compliance features and a trusted face, and, in theory, everybody wins.

In 2012, cloud computing will be old enough to do some great things — and to get into trouble.

By next year, more businesses will know a good bit about the notion of cloud computing — even if they themselves haven’t put corporate applications or data onto public or private cloud infrastructure. C-level executives and their operational troops have at least read up on the subject and faced enough sales pitches that they can differentiate the real from the bogus. And I would bet they can all distinguish the potential benefits of cloud computing (moving budget from capital expenditure to operational expenditure, the pay-as-you-go model, etc.) from its pitfalls (IT cedes control of infrastructure to an outsider).

In his top 10 cloud predictions for 2012, Forrester Research analyst James Staten said the upcoming year marks the beginning of cloud computing’s “awkward teenage years.” That means there will be some real maturation — and some embarrassing slip-ups.

On the plus side, he predicts cloud-savvy customers will mean an end to annoying “cloud washing.” That’s the practice in which a marketer cloaks his or her company in the mantle of cloud computing whether it’s applicable or not.

Also good news is that cloud IT posers will be discovered and shunted aside as more people get trained and gain real experience on cloud deployments. IBM, Hewlett-Packard, EMC and others are all stepping up here. There are more openings for cloud experts than cloud experts to fill them, but new training and certification will address that imbalance: all good things.

Not so good is the prospect of cloud failures. The message here is: Things break; get used to it. “Your company will survive a major cloud outage,” Staten states. “The sooner you learn to deal with [that] the better off you will be.”

Also scary: There are some ugly regulatory issues on the horizon. Much as parents try to rein in unruly teenagers, governments will try to control the spread of cloud computing. Just as the U.S. Patriot Act gave international companies pause before deploying cloud assets in the U.S., other attempts to regulate, ban or spy on cloud computing assets will constrain growth. A few months ago, for example, a Deutsche Telekom exec proposed a German cloud that would be safe from U.S. snooping. This attempted imposition of national or other borders on cloud computing could kill off this adolescent before it’s old enough to vote. Bottom line, says Staten: “The Internet knows no bounds and neither should the cloud.”

And as often happens even with well-intentioned “good” teenagers, some executive in the cloud universe will overstep the rules while testing compliance regulations in the cloud, and will end up in court (or worse), or at least unemployed.

Finally, a topic close to my heart: Staten reiterates the mantra that IT channel players – the companies that sell software, hardware and services to business customers – had better evolve to embrace the cloud now. Or else. It’s not a new message, but it still rings true: “For the channel to survive it must add value around cloud services and there’s plenty of opportunity to go around.”

Just as human beings are made stronger by the trials and tribulations they faced as pimply-faced teens, the hope here is that cloud computing will face and survive these challenges, and come out on the other side stronger and more capable.

The four-year-old company focuses on helping large companies use private and public cloud computing services within existing compliance and regulatory frameworks, said Servicemesh CEO and chairman Eric Pulier.

As more large companies want to use cloud services, they’re looking for ways to make sure the new IT scenario falls within existing compliance and regulatory guidelines for their business. They want the same kind of service level agreements (SLAs) and assurances they can get in the traditional on-site data center world.

“If you have a business unit that used to take 15 to 20 weeks to fire up a workload and build an app on it, that took massive amounts of steps. With cloud services, the unit can now automatically fire it up, but if that workload has a regulatory mandate where the data can’t leave Vietnam or it can’t leave an internal cloud, Servicemesh can enforce that,” Pulier said. Similarly, if an organization has a cost cap assigned to it, Servicemesh will make sure an alert is generated if it spends more than, say, $5,000 in a day.
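Servicemesh hasn’t published its policy engine, but the two checks Pulier describes, a data-residency constraint and a daily cost cap, amount to simple rules evaluated against each workload. A minimal sketch, with hypothetical function and field names:

```python
def check_placement(workload, target_region):
    """Reject a deployment if the workload's data-residency
    policy forbids the target region."""
    allowed = workload.get("allowed_regions")
    if allowed is not None and target_region not in allowed:
        raise ValueError(
            f"{workload['name']}: data may not leave {allowed}, "
            f"requested {target_region}"
        )
    return True

def cost_alerts(daily_spend, cap=5000):
    """Yield an alert for any day whose spend exceeds the cap."""
    for day, spend in daily_spend.items():
        if spend > cap:
            yield f"ALERT: {day} spend ${spend} exceeds ${cap} cap"

# Hypothetical workload pinned to an internal, in-country cloud.
workload = {"name": "ledger-app", "allowed_regions": ["vietnam-internal"]}
print(check_placement(workload, "vietnam-internal"))  # True
print(list(cost_alerts({"2012-05-01": 5200, "2012-05-02": 4800})))
# one alert, for the day that exceeds the $5,000 cap
```

Note that the placement check raises before deployment, so the residency rule is enforced up front rather than audited after the fact.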

Servicemesh’s most direct competitors include legacy systems management companies like BMC and Hewlett-Packard.

This is the company’s first dip into the venture funding pool. “We spent the last four years building this company through some extremely intense times to 90 people and profitability, but we’ve been pretty much under the radar. Now we’re ready to scale with this first round of VC,” Pulier said.

Ignition Partners has built up an impressive array of cloud-oriented investments including Cloudera, Heroku (now owned by Salesforce.com) and Splunk.

Amazon Web Services has rolled out a new offering, called GovCloud, designed specifically to run federal government workloads. The AWS region is designed to meet the myriad regulations that government agencies must meet when deploying new infrastructure, which have proven something of a hindrance in terms of letting the government adopt cloud computing services. And the timing couldn’t be better.

The GovCloud site explains that government agencies have had trouble processing and storing data in the cloud, which the government says can be accessible only by U.S. persons. But because AWS GovCloud “is physically and logically accessible by U.S. persons only, government agencies can now manage more heavily regulated data in AWS while remaining compliant with strict federal requirements.” Of course, GovCloud will feature high-level security for all this data.

Additionally, AWS is providing a variety of on-demand and reserved pricing options for government agencies, something that’s likely necessary given their sometimes strict budgetary requirements. And as Amazon CTO Werner Vogels notes in his blog post announcing the new region:

“[G]iven the restrictive nature of this new AWS Region, customers will need to sign an AWS GovCloud Enterprise agreement that requires a manual step beyond the usual self-service signup process. To make use of the services in this region, customers will use the Amazon Virtual Private Cloud (VPC) to organize their AWS resources.”

The timing of GovCloud is no doubt aligned with the upcoming October deadline by which federal agencies will have to offer initial reports about which of their data centers they’ll be closing down. The Office of Management and Budget has mandated that the government reduce its data center footprint by 38 percent by 2015, or 800 data centers, and offloading workloads to the cloud will certainly be among the tactics for pulling off that lofty goal.

The government is a massive source of IT spending, and AWS knows it has to play by the government’s rules if it wants to get a piece of that pie. Microsoft and Google are currently involved in a heated legal dispute over government contracts for their cloud-based collaboration services.

Interestingly, AWS won’t necessarily be limiting the GovCloud region to U.S. agencies. Writes Vogels on his blog, “We are certainly interested in understanding whether there are opportunities in other governments with respect to their specific regulatory requirements that could be solved by a specialized region.”

One thing that’s unclear is how, if at all, GovCloud is related to AWS’s previous work with government contractor Apptis on a project called FedCloud. I reached out to AWS on this question; according to a spokesperson, Apptis will still be using AWS to provide cloud infrastructure to the U.S. government through its FedCloud portal, and will utilize the GovCloud region.