Microsoft brings a “data culture” to the Internet of Things

Azure Intelligent Systems Service designed to manage data from any device.

Microsoft CEO Satya Nadella was in San Francisco today to talk about data and Microsoft's data platform. Nadella repeatedly spoke of Microsoft's "data culture"—using data and analytics to enable employees to get the information they need to understand their work, answer questions, and make decisions. At the event, he celebrated the recent launch of SQL Server 2014 and announced a pair of other products: a preview of Azure Intelligent Systems Service and general availability of Analytics Platform System.

SQL Server 2014 has been available to developers and others for a few weeks. Its headline feature is broad support for in-memory databases with an engine previously codenamed "Hekaton." As one would expect, in-memory databases are substantially faster than ones stored on disk. The in-memory database engine is limited in terms of the programmatic features it offers, but when it can be used, it can make operations 10 to 30 times faster.
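SQL Server's Hekaton engine gets its speedup from latch-free, memory-optimized data structures and natively compiled procedures, none of which this sketch reproduces. But the basic in-memory versus on-disk trade-off can be illustrated with any database; the following uses Python's built-in SQLite bindings (not SQL Server), timing the same batch of inserts against a `:memory:` database and a file-backed one:

```python
import os
import sqlite3
import tempfile
import time

def time_inserts(conn, n=10_000):
    """Insert n rows inside one transaction and return elapsed seconds."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    cur.executemany("INSERT INTO t (val) VALUES (?)",
                    (("row %d" % i,) for i in range(n)))
    conn.commit()  # on-disk, this must reach durable storage
    return time.perf_counter() - start

# In-memory database: all pages live in RAM for the connection's lifetime.
mem_time = time_inserts(sqlite3.connect(":memory:"))

# On-disk database: commits involve the filesystem.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
disk_time = time_inserts(sqlite3.connect(path))

print(f"in-memory: {mem_time:.4f}s  on-disk: {disk_time:.4f}s")
```

The measured gap varies a lot with the OS page cache and journaling mode, so treat this as a demonstration of the mechanism rather than a benchmark of Hekaton's claimed 10-30x figure.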

Microsoft said that SQL Server 2014 has been developed in a different way from prior versions of the database server. It was described as "born in the cloud," developed for Azure and the cloud first. It includes a range of Azure-related features, too, such as backups to Azure.

Azure Intelligent Systems Service is a cloud service that's designed to slurp in data generated from the Internet of Things—the increasingly ubiquitous devices and sensors—and capture it to let it be processed and used in meaningful ways with tools such as HDInsight and Power BI. ISS is available as a limited preview that interested companies can request access to.

47 Reader Comments

Microsoft said that SQL Server 2014 has been developed in a different way from prior versions of the database server. It was described as "born in the cloud," developed for Azure and the cloud first. It includes a range of Azure-related features, too, such as backups to Azure.

So does that mean the things that a database server does have been split between your site and the cloud?

Quote:

Azure Intelligent Systems Service is a cloud service that's designed to slurp in data generated from the Internet of Things—the increasingly ubiquitous devices and sensors—and capture it to let it be processed and used in meaningful ways with tools such as HDInsight and Power BI. ISS is available as a limited preview that interested companies can request access to.

You don't realize how well Microsoft did with Cortana until you look at the naming schemes for other services. Seriously, they need an easy name that rolls off the tongue. Right now it's just AISS, with the 'i' being easy to miss (and forget to type).

Might I suggest Azure's Big Computer? Send it to the ABC. Give the ABC a go. Sign up for ABC. Passes the smell test lots better than ASS. I mean AISS.

"The in-memory database engine is limited in terms of the programmatic features it offers"

Question from a dummy: why would there be any difference in the programmatic features, based on whether the data is on disk or in memory?

I imagine that there is no current way to prioritise, either through code, or some sort of rule, what data or data subsets are cached completely in-memory. I'm kind of talking out my lower end on that, based on some of what I've seen recently around the in-memory caching systems Microsoft have out there.

it's such a broad topic that it's difficult to condense into an article this brief. (where the DBAs at?)

the vast majority of the announcement was over my head, but the bottom line is this:

the data culture is about making big data meaningful and accessible to people who aren't data experts. MS is looking to differentiate by creating user-friendly tools, and has added capabilities to Excel and PowerPoint that make big data and analytics useful to virtually everyone. Satya spoke about how their own HR, real-estate, and facilities people can make use of this data to save hundreds of millions of dollars in expenses.

via SQL Server's "in-memory" feature, processing time for huge data queries can be sped up significantly. NASDAQ claims they can now run a query of a data set containing a quintillion rows in just minutes whereas that same query used to take *days.*

a lot of this efficiency can be achieved without adding hardware or writing new code. it just requires making some SQL Server configuration changes.

"The in-memory database engine is limited in terms of the programmatic features it offers"

Question from a dummy: why would there be any difference in the programmatic features, based on whether the data is on disk or in memory?

Several reasons. For one, a great many man-years of development have gone into SQL Server's on-disk data structures, and there simply may not have been time to reproduce all those features on the new data structures used by in-memory tables. Other features may just be difficult to optimize to the point where they perform as well as their on-disk equivalents, so they never made it into the release; there is little point in adding new in-memory functions that don't beat the existing code from a performance standpoint.

Can someone explain to me what "The Internet of Things" is supposed to be?

That's when pretty much everything that runs on electricity is going to be hooked up to the internet, just because.

Pretty much this. The cheesy hacking movies from the 80s were ahead of their time.

Internet tech of today reminds me of the early experiments with radiation. Lots of enthusiasm and effort, but no one knew WTF they were doing in regards to safety.

Kind of, but we already know how much damage can be done by hackers, and yet they want to give them even more to hack for no good reason. There is no need to have everything networked. It will bring more headaches than benefits.

It's when your devices can communicate with minimal intervention from you as a user. So stuff like having a bathroom scale that posts data to an internet server. Or when you turn on your new TV and it can automagically pull data from your NAS.

Can someone explain to me what "The Internet of Things" is supposed to be?

Computers have become so incredibly cheap that we can now have rudimentary microprocessors (think Apple II or Commodore 64) in a light bulb. Or gumstix-sized computers in your toaster. Basically computers become ubiquitous, disposable, and are everywhere. Just like electrical wiring is.

With IoT (internet of things) the NSA can check with your toaster to see how well done you like your toast, for example.
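The bathroom-scale example above can be made concrete. This Python sketch shows the kind of JSON telemetry a networked device might POST to a cloud ingestion service; the endpoint URL, device ID, and field names are all invented for illustration, and the actual network send is shown only as a comment so the snippet runs standalone:

```python
import json
from datetime import datetime, timezone

# Placeholder endpoint, not a real service.
ENDPOINT = "https://example.com/api/telemetry"

def build_payload(device_id, kind, value, unit):
    """Package one sensor reading as the JSON a device might POST."""
    return json.dumps({
        "device_id": device_id,
        "kind": kind,
        "value": value,
        "unit": unit,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

payload = build_payload("scale-42", "weight", 72.5, "kg")
# A real device would now send it over HTTP or MQTT, e.g.:
#   urllib.request.urlopen(urllib.request.Request(
#       ENDPOINT, payload.encode(), {"Content-Type": "application/json"}))
print(payload)
```

A service like Azure Intelligent Systems Service would sit on the receiving end of messages like this, storing them for downstream tools such as HDInsight and Power BI.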

Can someone explain to me what "The Internet of Things" is supposed to be?

That's when pretty much everything that runs on electricity is going to be hooked up to the internet, just because.

Remember those internet fridges with built-in browsers on the door that could automatically order eggs for you when you ran out? Those sure changed the way we eat, didn't they?

Oh wait, no they didn't.

Straw man. Nobody has those fridges. They can't automatically order eggs until individual eggs are NFC tagged. It may well change the way we eat, but we won't know until it becomes generally available.

Can someone explain to me what "The Internet of Things" is supposed to be?

That's when pretty much everything that runs on electricity is going to be hooked up to the internet, just because.

Remember those internet fridges with built-in browsers on the door that could automatically order eggs for you when you ran out? Those sure changed the way we eat, didn't they?

Oh wait, no they didn't.

Straw man. Nobody has those fridges. They can't automatically order eggs until individual eggs are NFC tagged. It may well change the way we eat, but we won't know until it becomes generally available.

That was my whole point. Back around the year 2000, there were a lot of articles about those new revolutionary internet fridges that were going to change everything, and in the end nobody bought them.

Can someone explain to me what "The Internet of Things" is supposed to be?

That's when pretty much everything that runs on electricity is going to be hooked up to the internet, just because.

Remember those internet fridges with built-in browsers on the door that could automatically order eggs for you when you ran out? Those sure changed the way we eat, didn't they?

Oh wait, no they didn't.

Straw man. Nobody has those fridges. They can't automatically order eggs until individual eggs are NFC tagged. It may well change the way we eat, but we won't know until it becomes generally available.

That was my whole point. Back around the year 2000, there were a lot of articles about those new revolutionary internet fridges that were going to change everything, and in the end nobody bought them.

Because they never existed as you describe them.

The way I kept hearing about them on the internet and in other media back then, they made it sound like they were about ready to go on sale at any minute. My point was that it's not a new concept, all sorts of attempts were made to capitalize on it, and they did not work.

it's such a broad topic that it's difficult to condense into an article this brief. (where the DBAs at?)

the vast majority of the announcement was over my head, but the bottom line is this:

the data culture is about making big data meaningful and accessible to people who aren't data experts. MS is looking to differentiate by creating user-friendly tools, and has added capabilities to Excel and PowerPoint that make big data and analytics useful to virtually everyone. Satya spoke about how their own HR, real-estate, and facilities people can make use of this data to save hundreds of millions of dollars in expenses.

via SQL Server's "in-memory" feature, processing time for huge data queries can be sped up significantly. NASDAQ claims they can now run a query of a data set containing a quintillion rows in just minutes whereas that same query used to take *days.*

a lot of this efficiency can be achieved without adding hardware or writing new code. it just requires making some SQL Server configuration changes.

Well, there were three products announced; that was only the first one. The next was basically a Microsoft tech-based Hadoop integrator/wrapper, and the third is basically (to avoid buzzwords) an embedded-devices data store in beta testing that integrates with the other products but is hosted on Azure.

"The in-memory database engine is limited in terms of the programmatic features it offers"

Question from a dummy: why would there be any difference in the programmatic features, based on whether the data is on disk or in memory?

I imagine that there is no current way to prioritise, either through code, or some sort of rule, what data or data subsets are cached completely in-memory. I'm kind of talking out my lower end on that, based on some of what I've seen recently around the in-memory caching systems Microsoft have out there.

In-memory DBs tend to be pure storage, whereas SQL Server live systems offer routines, scheduled tasks, and such. I too am talking out of my lower end, based on general information about in-memory cache systems.

"The in-memory database engine is limited in terms of the programmatic features it offers"

Question from a dummy: why would there be any difference in the programmatic features, based on whether the data is on disk or in memory?

Given the kinds of features they've omitted (subqueries, cursors, to name a few) I suspect the limitations are driven by two factors.

First, this is a new engine where everything has to be developed from scratch. As such, they've probably prioritized some features above others, and some of the gaps will be filled in subsequent releases. I would be a little surprised if subqueries and common table expressions, for example, were never added.

Second, this is an engine designed for high performance and parallel computation. Features such as cursors, which tend to serialize database operations (by making them iterative, row-at-a-time operations), run directly counter to that high-performance, parallel-processing goal. So I would not be too surprised if the in-memory engine never supported cursors.
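The cursor point can be illustrated outside SQL entirely. This Python sketch (an analogy, not T-SQL) contrasts a cursor-style row-at-a-time loop, where each step depends on mutable state from the previous one, with a set-based aggregation that merely declares the result and so leaves an engine free to partition and parallelize the work:

```python
from collections import defaultdict

# Toy "table" of (region, sale_amount) rows.
rows = [("east", 100), ("west", 250), ("east", 50), ("west", 25)]

# Cursor-style: visit rows one at a time in a fixed, serial order,
# mutating running state. An engine cannot easily parallelize this.
totals_cursor = defaultdict(int)
for region, amount in rows:
    totals_cursor[region] += amount

# Set-based: one declarative aggregation over the whole set, analogous
# to GROUP BY. The order of evaluation is not specified, so the work
# can be split across partitions of the data.
regions = {region for region, _ in rows}
totals_set = {r: sum(a for region, a in rows if region == r)
              for r in regions}

assert totals_cursor == totals_set  # same answer, different contracts
print(dict(totals_cursor))
```

Both forms compute identical totals; the difference is that only the set-based form leaves the execution strategy up to the engine, which is exactly what a high-performance parallel engine wants.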