No news there. For quite a while now, at Microsoft, it has been: "The cloud is the answer. What's the question?" Specific details data developers might be interested in, however, were scarce in the post by Quentin Clark, corporate vice president of the Data Platform Group. There was a lot of stuff like, "After the delivery of Microsoft SQL Server 2012 and Office 2013, we ramped-up our energy and sharpened our focus on the opportunities of cloud computing."

I found myself trying to read between the lines. Was there any new information here? Any subtle clues? Any news by omission? I became a Microsoftologist.

To explain: In the old days, before The Wall came crumbling down, news coming out of the Soviet Union was so skimpy it fostered an analysis technique called Kremlinology, wherein Western strategists tried to glean insights about the direction of the Evil Empire by noting little details like who stood next to whom in parade reviews and combing through speeches for hidden clues.

What struck me most: Where's the Big Data? The rest of the data development world is going crazy about Big Data, but in this post, not so much. It was cloud, cloud and more cloud. My Microsoftology revealed 13 references to "cloud" and seven to "Windows Azure." The total number of "Big Data" references? Two. But there was one reference to "HDInsight" and five to "Hadoop." So, total score: cloud, 20; Big Data, 8.

I also noted in other news that Microsoft development rock star Scott Guthrie (ScottGu) was reportedly named the new head of the cloud division, replacing Satya Nadella, who moved up to become CEO.

A new survey of database developers and other professionals shows Microsoft is maintaining its lead in the Relational Database Management Systems (RDBMS) and data warehousing arenas, but faces challenges from newcomers in Big Data and other markets.

Conducted by Progress Software Corp., the "Progress DataDirect 2014 Data Connectivity Outlook" survey purported to reveal "the rising stars in the database constellation." Developers constituted the largest group of respondents (36 percent of 300 existing customers surveyed), followed by CXOs and other management types.

For the RDBMS and enterprise data warehouse markets, respondents were asked about their data technologies currently in use and those expected to be implemented within the next two years. Although Microsoft (SQL Server) and Oracle unsurprisingly took the top two positions in current and projected usage in the RDBMS market, the survey "projects significant growth for emerging alternatives such as the community-developed MariaDB as well as the SAP HANA in-memory platform, over the next two years," Progress said this week.

In the enterprise data warehouse world, the top three technologies currently in use were again no surprise: SQL Server, Oracle and IBM DB2. However, all three were projected to show lower numbers in two years, while the current No. 4, Teradata, was expected to see higher usage. Amazon Redshift will reportedly show the biggest percentage increase in adoption, but is still expected to be in use by only slightly more than 10 percent of respondents in two years. SQL Server usage is expected to drop from being used by about 58 percent of respondents today to about 48 percent of respondents in two years.

Perhaps more of a surprise, Microsoft HDInsight was listed as the No. 2 Hadoop provider, following overwhelming market leader Apache's open source distribution, listed by more than 45 percent of respondents. Cloudera placed a close third, followed by Oracle DBA, Amazon EMR, IBM BigInsights, Hortonworks, MapR and Pivotal HD. This question asked respondents only what distribution they currently used or planned to use in the next two years, so there was no indication of growth over that period.

"With open-source Apache's low-cost of entry propelling its lead in the market, one can expect other big players in Big Data to further iterate their own unique value and perspectives when it comes to data storage within Hadoop databases like Hive, HBase and Cassandra," the survey report said. "Future competition from many of the large vendors may begin to change market distribution, but no significant changes are foreseen."

Another question concerned usage of NoSQL, NewSQL and non-relational databases, asking only which technologies were currently used or supported. Here, MongoDB held a large lead, used by nearly 40 percent of respondents, with SQLFire, Cassandra, HBase and CouchDB/Couchbase rounding out the top five of the 14 total products listed.

Salesforce.com reported a huge lead in respondents answering the question: "Which [Software as a Service] SaaS applications do you or your customers currently use or support in your applications?" It was the choice of more than 40 percent of respondents, while Microsoft Dynamics CRM Online came in second, listed by more than 20 percent, followed by SAP Business ByDesign, Force.com and Intuit QuickBooks.

Progress said its first survey of this type "shows that while established vendors still hold significant share, a new set of rising stars--many of them lower-cost alternatives--are emerging in the data source world."

What do you think of the future of the database development landscape? Comment here or drop me a line.

I've noted before how data-driven developers in general and SQL gurus in particular are pretty well set in terms of salary and job security. So I was curious how database skills fared in responses to a recent Slashdot.org question: "It's 2014--Which New Technologies Should I Learn?"

"I've been a software engineer for about 15 years, most of which I spent working on embedded systems (small custom systems running Linux), developing in C. However, Web and mobile technologies seem to be taking over the world, and while I acknowledge that C isn't going away anytime soon, many job offers (at least those that seem interesting and in small companies) are asking for knowledge on these new technologies (Web/mobile). Plus, I'm interested in them anyway. Unfortunately, there are so many of those new technologies that it's difficult to figure out what would be the best use of my time. Which ones would you recommend? What would be the smallest set of 'new technologies' one should know to be employable in Web/mobile these days?"

I was so curious I combed through more than 370 comments to total up and compare the "technologies." Obviously, that's a broad term and could (and apparently did) mean just about anything, so I just focused on programming languages (as opposed to, say, "Learn to lie and [BS] with a straight face"). And I wasn't alone in wondering what constituted a "new technology."

Of course, this being Slashdot, the readers branched off on all kinds of bizarre tangents. It's amazing how these people can take the most insignificant, meaningless aspect of such a question and absolutely beat it to death. It's often pretty darn funny, though.

Anyway, a lot of Slashdot readers know their stuff, so I was interested in what they had to say, regardless of the wide range of possible interpretations of the question. My sampling is in no way scientific, or a real survey or even reliant upon any kind of reproducible methodology. I simply tried to total up the language suggestions I found in each of the comments. I didn't subtract votes when a suggestion was hammered by other readers with the inevitable vicious insults and snarkiness (some things will never change).

I guess the results were fairly predictable, but I was kind of disappointed in how database technologies in general ranked.

What was completely clear is that the overwhelming favorite is ... you guessed it: JavaScript. A rough grouping (not counting a lot of people who said just stick with C) looks something like this:

First: JavaScript

Second: Java

Third: PHP, HTML(5)

Fourth: Objective-C, Python

Fifth: Ruby, SQL, C++, C#

Sixth: HTTP, ASP.NET, CSS

Several dozen more languages were suggested in smaller numbers.

Being the resident Data Driver bloggist at Visual Studio Magazine, I was disappointed to see SQL so far down the list. Even totaling up all the other database-related languages, such as MySQL, SQLite, MongoDB and so on, wouldn't result in that impressive of a number (I stopped counting these when I realized none would total more than a few votes).

A couple of comments might shed some light on the prevailing attitudes out there. One commenter wrote: "RDBMSes are going to die, so learn how to interact with one of the major NoSQL databases. Most bleeding-edge: Titan and Neo4J, both graph databases."

Another wrote: "Some SQL is very useful but you don't need to be an expert--any serious Web development team will have a database expert who will do the DB stuff, you just need enough to code up test setups, prototypes and to talk to the DB guy."
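The level of SQL the commenter describes (enough to code up test setups and prototypes) is modest in practice. Here's a minimal sketch of that kind of throwaway fixture, using Python's built-in sqlite3 module; the users table and its data are made up for illustration:

```python
import sqlite3

# In-memory database: nothing to install, nothing to clean up afterward.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, plan TEXT)")
conn.executemany(
    "INSERT INTO users (name, plan) VALUES (?, ?)",
    [("alice", "free"), ("bob", "pro"), ("carol", "pro")],
)

# The kind of query a prototype or test fixture actually needs.
pro_users = conn.execute(
    "SELECT name FROM users WHERE plan = ? ORDER BY name", ("pro",)
).fetchall()
print(pro_users)  # [('bob',), ('carol',)]
```

That's roughly the whole skill floor the commenter is talking about: create a table, seed it, query it, hand anything harder to "the DB guy."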

I don't know exactly why "the DB guy" is separate from the rest of the Web dev team, or why the original poster couldn't be "the DB guy," but whatever.

The question was limited to the Web/mobile arena, remember, so it's not totally disheartening. I mean, there is this little thing called Big Data happening, and vendors are jumping all over themselves trying to come out with applications and packages and such to let the SQL guys and other "DB guys" join in the fun along with the Hadoop specialists and data scientists. But I guess nobody will be doing any Big Data stuff over the Web or with a mobile device.

Maybe it's not the sexiest programming language, but SQL continues to be relevant. In fact, TIOBE Software, which publishes a TIOBE Programming Community Index gauging the popularity of programming languages, named Transact-SQL the language of the year for 2013.

This "award" further emphasizes the importance of competency in SQL. I earlier wrote about how SQL gurus and other database-related programmers enjoyed excellent job security and how SQL Server developers were in high demand.

That's the good news. The bad news, according to TIOBE: "It is a bit strange that Transact-SQL wins the award because its major application field, Microsoft's database engine SQL Server, is losing popularity. The general conclusion is that Transact-SQL won because actually not much happened in 2013."

Not much happened in 2013? Wow, talk about strange. Has TIOBE heard of a little thing called Big Data?

Anyway, following Transact-SQL in popularity gains were Objective-C and F#. Objective-C had been the "language of the year" for the previous two years.

Microsoft fared well in other aspects, too, even regarding the much-maligned Windows Phone platform. As TIOBE wrote: "As we have seen the last decade, programming language popularity is largely influenced by external trends. The most important ones at the moment are mobile phone apps and web development. Android (mainly Java) and iOS (Objective-C) are the major mobile platforms, while Windows Phone (mainly C#) is catching up."

In index rankings, Transact-SQL went from No. 22 in January 2013 to No. 10 in January 2014. Otherwise, the top of the rankings stayed the same, led by C, Java, Objective-C, C++, C#, PHP and Python. The only other move in the top 10 was JavaScript going from 10th to the 9th spot.

In other attempts at ranking the popularity of programming languages, SQL was No. 12 in a list developed by LangPop.com last October. Meanwhile, Python garnered the "language of the year" prize in the Popularity of Programming Language (PYPL) index, which measures how often respective language tutorials show up in Google searches. No variants of SQL made the top 10. TIOBE said its ratings "are based on the number of skilled engineers world-wide, courses and third party vendors."

In Google Trends, searches for "SQL Programming Language" held fairly steady throughout 2013, except for a strange dip right at the end of the year.

How do you feel about the importance of keeping your SQL skills honed? Do these popularity rankings mean anything at all? Comment here or drop me a line.

Regardless of the future of the Microsoft ecosystem (and those latest quarterly numbers should slow the naysayers some), data developers can rest easy knowing their SQL Server skills are transferable in the New Data Order.

The latest example is yesterday's open sourcing of a new distributed SQL query engine for Big Data developed by Facebook, called Presto.

It was designed to improve upon existing solutions for Big Data analytics such as Hadoop MapReduce and Hive, Facebook's Martin Traverso said in a post announcing the move to GitHub. "Presto is a distributed SQL query engine optimized for ad-hoc analysis at interactive speed," Traverso said. "It supports standard ANSI SQL, including complex queries, aggregations, joins and window functions."

Traverso said Presto has delivered performance gains of up to 10 times over equivalent Hive/MapReduce tools in CPU efficiency and latency for most queries. While it doesn't run on Windows, "It currently supports a large subset of ANSI SQL, including joins, left/right outer joins, subqueries, and most of the common aggregate and scalar functions, including approximate distinct counts (using HyperLogLog) and approximate percentiles (based on quantile digest)," he said.
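Presto itself is a distributed engine and nothing here can stand in for a cluster, but the standard ANSI SQL surface Traverso describes (joins, aggregations and the rest) is the same dialect developers already know. A small illustration of that kind of ad-hoc analytical query, run here against Python's built-in sqlite3 rather than Presto, with made-up tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'east'), (2, 'west'), (3, 'east');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 25.0),
                              (12, 1, 50.0), (13, 3, 75.0);
""")

# A standard SQL join plus aggregation, including a distinct count:
# the sort of interactive ad-hoc analysis Presto is built to speed up.
rows = conn.execute("""
    SELECT c.region,
           COUNT(DISTINCT o.customer_id) AS buyers,
           SUM(o.total) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('east', 2, 224.0), ('west', 1, 25.0)]
```

The point of SQL-on-Hadoop engines like Presto is exactly that the query above needs no rewriting into MapReduce jobs; existing SQL skills transfer directly.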

Yes, SQL isn't going anywhere. It has withstood challenges in one form or another for decades, from competing vendor dialects in systems such as Oracle and MySQL to the NoSQL movement and so on. The Big Data onslaught seemed to be stealing much of its mindshare, but the pendulum is swinging back. The problem was that the specialized Hadoop-based solutions often proved too cumbersome to quickly and efficiently glean meaningful analytics from the vast new data stores.

"This enormous knowledge gap in accessing Big Data in Hadoop has prompted an avalanche of vendors to offer SQL-on-Hadoop solutions, which increase the accessibility of Hadoop and allow organizations to reuse their investment learning in SQL," stated a Gigaom report titled "Sector RoadMap: SQL-on-Hadoop platforms in 2013."

"SQL is widely known by most business analysts," the report continued. "Many nontechnical staff without a programming background can write SQL and use traditional business intelligence (BI) tools like Tableau, MicroStrategy, and Business Objects to query data."

Further evidence of SQL's solid positioning came in a recent presentation by Roger Magoulas, research director at O'Reilly Media, at the Strata Conference + Hadoop World event. He spoke about "the state of data science as a profession." An O'Reilly salary survey conducted last year reported that the top tool in use by the responding data scientists was SQL. "I guess it's not a surprise ... we heard some of the other speakers talk about it ... that SQL is still the top thing being used," Magoulas said. His accompanying slide proclaimed "SQL Rules" and indicated 71 percent of respondents reported it as their data science tool of choice. Hadoop was a surprising No. 5 at 35 percent. SQL users also fared well in the salary findings, though their average salary trailed that of the far fewer Hadoop specialists.

Microsoft today announced the availability of SQL Server 2014 CTP2, a near-final version highlighted by new in-memory capabilities formerly called Project Hekaton.

The in-memory enhancements include new Online Transaction Processing (OLTP) features (Hekaton) and columnstore technology, along with better T-SQL support and new indexes and advisory tools. The product already featured in-memory data warehousing and business intelligence functionality.

New cloud capabilities such as easier backup and recovery in Windows Azure were also announced by Microsoft exec Quentin Clark in his keynote address at the Professional Association of SQL Server (PASS) Summit 2013 in Charlotte, N.C.

Clark noted that this "public and production-ready" release is shipping only 18 months after SQL Server 2012, a much faster release cycle than previous SQL Server versions.

"A year ago we announced project 'Hekaton,' and today we have customers realizing performance gains of up to 30x," said Clark, corporate vice president of the Data Platform Group. "This work, combined with our early investments in Analysis Services and Excel, means Microsoft is delivering the most complete in-memory capabilities for all data workloads -- analytics, data warehousing and OLTP."

Clark also nodded to the Big Data phenomenon, noting that customers are collecting and storing more data than ever before, such as machine signals, data from devices and services and even data from outside the organization, "so we invest in scaling the database and a Hadoop-based solution."

The new Windows Azure backup and recovery features are part of Microsoft's "hybrid cloud" strategy in which customers can work with SQL Server databases on-premises and back them up and recover them from the cloud. Clark announced today that all currently supported SQL Server releases can use Windows Azure backup. A preview of a stand-alone SQL Server Backup for Windows Azure Tool will be available for download later this week.

WCF DS 5.6.0 was released in August with support for Visual Studio 2013, portable libraries and other enhancements. The VS 2013 support lets developers consume OData services with the Add Service Reference functionality. The portable libraries support lets developers use the new streamlined JSON format (part of the OData v3 protocol, sometimes called "JSON light") in Windows Store and Windows Phone 8 apps. While core libraries have support for .NET Framework 4.0, the WCF DS client portable library support targets .NET 4.5. Both the core libraries and the WCF Client have support for Windows Phone 8, Windows Store and Silverlight 5 apps.

Users, however, wanted more. A couple of readers responded in the comments asking about OData v4 support, and one asked, "Does this release include EF 6 support?" Microsoft's Mark Stafford last week replied, "Sort of. The EF 6 support will come in a different NuGet package, which will go into alpha today."

The Oct. 2 NuGet update that catches up to EF 6 was made possible by some work the WCF DS team was doing to make providers public. The team wanted to override provider behavior so developers could use features such as spatial properties and enums, which lack native support in the OData v3 protocol. "Unfortunately we ran into some non-trivial bugs with $select and $orderby and needed to cut the feature for this release," the team said in its August announcement.

However, that work paid off later, the team said in last week's update announcement. "We were able to build this provider as an out-of-band provider (that is, a provider that ships apart from the core WCF DS stack) because of the public provider work we did recently," the team said.

The new support for EF 6 basically results from a new class, EntityFrameworkDataService<T>, where T is type DbContext. For previous EF versions, developers should use the base class DataService<T>, where T is type DbContext or the older ObjectContext.

"The new public provider functionality comes with a little bit of code you need to write," the team said. "Since that code should be the same for every default EF 6 WCF DS provider, we went ahead and included a class [the new EntityFrameworkDataService class] that does this for you."

The team admitted that it didn't have time to do full testing on the new provider because the developers were "heads down" preparing for OData v4 support. "So ... we're going to rely on you, our dear customer, to provide feedback of whether or not this provider works in your situation. If we don't hear anything back, we'll go ahead and release the provider in a week or so." Which should be right around now.

I was dropped by my previous auto insurance company for a couple of at-fault accidents on my wife's driving record.

Trouble was, she was not involved in those accidents in any way. They happened to somebody else and somehow got on her report from a data collection company used by the insurer. And, try as I might, I could not convince the insurance company of this. I provided the company with a note from my previous insurer confirming that those accidents were not hers. I even provided an official driving record from the state showing those weren't her accidents. It didn't make any difference to the insurance company (as much as I'd like to see the company burned to the ground in an agonizing bankruptcy, I won't name it, but it definitely wasn't on my side). The accidents were on the ChoicePoint report--that's all that mattered.

I contacted the data collection company and began the nightmarish process of trying to get their information corrected. I eventually gave up; it just wasn't worth the hassle they were putting me through. (Ironically, I've never--ever--been in an at-fault accident. Believe it or not, I've never even received a moving violation, in several decades of driving. I was probably one of the best customers the insurance company could've had.)

I bring up these painful memories because of recent reports about a Big Data company, Acxiom, that this month announced a portal where individuals can look up information collected about them. Several articles noted that the portal, AboutTheData.com, reported some incorrect information. So I checked it out.

Sure enough, the site had a few things wrong, including my birth date, which was strange because I had just provided that date as part of registering for the privilege of looking up my info (pretty sly way to collect data, when you think about it--these people aren't stupid, like people in some other companies, if you know what I mean). They also got my education level, race and age of children wrong, among a few other things. Keep in mind the portal is in beta, and it gives you the chance to correct the data (I didn't even try to go there) and even opt out of the system.

So, just a warning: If you're a developer and your company is hopping on the Big Data bandwagon and you're assigned to the project, be very careful about the quality of the information you collect, especially if the data will have a significant impact on the success of the project--and the company's bottom line.

I mean, just imagine how much money that previous insurance company left on the table if the fiasco I experienced was commonplace among its multitude of customers. Fortunately, my new insurer uses a different data collection company that actually has accurate information and I got a sweet rate. And my present insurer is soaking up those monthly premiums and hasn't had to pay out a dime. Think of that, repeated thousands and thousands of times. If they only knew, I imagine the headquarters honchos in Columbus, Ohio, would be kicking themselves.

Microsoft may have been late to the cloud party, but its Windows Azure ranks near the top when it comes to popularity for data-related development, according to a new survey from Forrester.

The Forrsights Developer Survey, Q1 2013, found that North American and European developers preferred the Amazon EC2 cloud service for their compute services by a significant margin over Windows Azure, but it's a different story for Relational Database Management Systems (RDBMS) services.

"Microsoft and Amazon are neck and neck amongst users of cloud RDBMS," said Forrester analyst Jeffrey Hammond in a blog post yesterday.

The survey found that the top three types of cloud services adopted by developers are compute, storage and relational data services. Hammond said 47 percent of respondents regularly use compute and storage services, while 36 percent use RDBMS services. These numbers come from 325 developers in the North American and European regions (out of 1,611 total) who reported they had used cloud computing or elastic applications.

Of those respondents using cloud compute services, 62 percent said they were using Amazon EC2 or planned to expand their use of it in the next year. That compares to 39 percent for Windows Azure and 29 percent for the Google Cloud Platform.

That gap of more than 20 points in adoption "is well outside a standard margin of error, so we have to give the nod to AWS when it comes to compute," Hammond said.

"Things are very different when it comes to developers using cloud-based RDBMS services," Hammond said. There, 48 percent of developers reported they were "implementing and expanding" use of Microsoft SQL Azure, followed by 45 percent for the Amazon RDS service (see Figure 1). However, that 3-point gap is within the margin of error.

"Also note that there are a high number of developers that are planning to implement Amazon's RDS service (27 percent) while 7 percent of Microsoft SQL Azure developers plan to decrease or remove their RDBMs workloads," Hammond said. "Overall, we'd have to rate this workload as a push--with no clear adoption leader."

The percentage of developers who reported using cloud storage and plan to expand that usage over the next 12 months was almost equally divided among Amazon Web Services, Windows Azure and Google Cloud Storage, at 25 percent, 22 percent and 23 percent, respectively.

"Amazon still has a larger body of developers that have implemented but are not expanding AWS S3 (17 percent compared to 10 percent for Microsoft Azure and 9 percent for Google, respectively)," Hammond said. "Our take: this workload looks like it's headed for a strongly competitive market in 2014."

So, despite a lag of about three and a half years between the introduction of Amazon EC2 (August 2006) and Windows Azure (February 2010), Microsoft has caught up in attracting developers to its cloud platform. That's interesting news, considering the popular backlash over Microsoft's decision not to provide developers early access to the Windows 8.1 RTM.

What is it that makes Windows Azure database-related services so popular among developers? Comment here or drop me a line.

The Microsoft Entity Framework has a spotty history of inconsistent release strategies, lagging feature requests and other issues, but things seem to be getting better with new leadership and even community contributions since it went open source.

The past problems with the Object Relational Mapping (ORM) tool were candidly discussed this week when EF team members Rowan Miller and Glenn Condron visited with Robert Green on the Visual Studio Toolbox Channel 9 video show to preview EF 6, which is "coming soon."

Reviewing past release history, Miller admitted the 2 years it took to update EF 3.5 SP1 to EF 4 was excessive. "That's really slow if you're waiting for new features," he said. Trying to get new features out to customers sooner, the team tried a hybrid model for subsequent releases with the runtime core components being part of the .NET Framework while new features shipped out-of-band via NuGet and the tooling was part of Visual Studio.

"We made one not-very-good attempt at doing an out-of-band release of the tooling, the 2011 June CTP of 'death and destruction' as we call it on our team," Miller said. "It went horribly; our product just wasn't set up for shipping out-of-band."

While the hybrid model got features out quicker, Miller noted there was a confusing mismatch of EF and .NET versions. So with EF 6 the team "bit the bullet" and went fully out-of-band, moving the runtime to NuGet and the tooling to the Microsoft Download Center rather than keeping them "chained in" to new Visual Studio releases.

The whole history, which Condron admitted was "still confusing," is explained in the 1-hour-plus show if you want to try to sort it out.

I was more interested in the move to open source at the same time EF was moved out of the .NET Framework, putting source code, nightly builds, issue tracking, feature specifications and design meeting notes on CodePlex. "Anything that we can make public, we made public," Condron said. The team also opened up the development effort to the community. "We happily accept contributions from as many people as we can," Condron said. But that strategy raised concerns by some, so he emphasized only the EF team has commit rights, and any pull requests must go through the same rigorous code review and testing process as do internal changes.

"We've gotten 25 pull requests; we've accepted 21 from eight different contributors," Condron said. He noted that while the development effort is open source, the shipping, licensing, support and quality remain in the hands of Microsoft.

Green noted that the strategy was a great way to get more people on the development effort, and Condron agreed, citing the thorny issue of enum support long requested by thousands of customers but not introduced until EF 5 and .NET 4.5. "Why is there no enum support--because nobody had written enum support," he said. "If you guys write it, we'll put it in." You can see the list of contributors on CodePlex (though only seven were listed in the Feb. 27 post).

Other benefits of open source, Miller said, include better customer relations, transparency and release cadence. "Back in the 3.5 SP1 days, we weren't listening to customers that great, but we've ... opened up the EF design blog" and started doing more frequent releases, he said. Condron also noted that with the nightly builds, the team has even gotten a few immediate bug reports the very next day after some source code was broken in the ongoing development efforts, which he said was "fantastic."

And the open source strategy will be expanding, Miller said. "The open source codebase at the moment just has the runtime in it," he said. "The tooling for EF 6 isn't going to be there, but after the EF 6 release, we're planning to put the tooling in as well. So we'll have the tooling there and you'll get to watch us develop on it, and we'll even accept pull requests on it as well."

Previewing EF 6, Miller and Condron ran through many new features coming from the EF team, such as nested entity types, improved transaction support and multiple contexts per database. But other new features are coming from community contributors, such as custom migrations operations, improved warm-up time for large models, pluggable pluralization and singularization service and "a few other cool things."

EF 6 is currently in Beta 1, and will go to RTM at the same time as Visual Studio 2013, with an RC1 coming "soon."
[CLARIFICATION: The RC1 was actually made available on Wednesday, Aug. 21.]

Here's a troublesome aspect of the Big Data revolution I didn't expect: the melding of mind and machine. IBM yesterday unveiled a completely new computer programming architecture to help process vast amounts of data, modeled on the human brain.

The old Von Neumann architecture that most computers have been based on for the past half century or so just doesn't cut it anymore, scientists at IBM Research decided, so something new was needed, starting from the silicon chip itself. Ending with ... who knows, maybe some kind of organic/electronic hybrid. I keep getting visions of Capt. Picard with those wires sticking out of his head after being assimilated by the Borg. (If you don't know exactly what I'm talking about here, I've misjudged this audience completely and I apologize--please go back to whatever you were doing.)

The Von Neumann architecture encompasses the familiar processing unit, control unit, instruction register, memory, storage, I/O and so on. Forget all that. Now it's "cognitive computing" and "corelets" and a bunch of other stuff that's part of a brand-new computer programming ecosystem--complete with a new programming model and programming language--that was "designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain," IBM said.

Those would be "neurosynaptic chips" stemming from IBM's SyNAPSE project headed by Dharmendra S. Modha, unveiled in August 2011. In a video, Modha called the new system a "brain in a box." Now IBM is sharing with the world its vision of a new programming language and surrounding software ecosystem to take advantage of the chips, partially in response to the Big Data phenomenon.

"Although they are fast and precise 'number crunchers,' computers of traditional design become constrained by power and size while operating at reduced effectiveness when applied to real-time processing of the noisy, analog, voluminous, Big Data produced by the world around us," IBM said in its announcement.

That theme was echoed in the description of the SyNAPSE project, which explained that the old "if X then do Y" equation paradigm wasn't enough anymore.

"With the advent of Big Data, which grows larger, faster and more diverse by the day, this type of computing model is inadequate to process and make sense of the volumes of information that people and organizations need to deal with," IBM said.

The company said its new system could result in breakthroughs such as computer-assisted vision.

"Take the human eyes, for example," IBM said. "They sift through over a terabyte of data per day. Emulating the visual cortex, low-power, light-weight eye glasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data."

That's fine and dandy--and seriously laudable. But, of course, these systems will eventually somehow be connected. And with the machine learning focus, the machines will, well ... learn. And after they learn enough, they will become self-aware. And when they become self-aware, they'll realize they're vastly superior to pathetic humankind and don't really need us around anymore. And then you've got some kind of dystopian nightmare: Skynet and Arnold androids, Borg, whatever.

"Every release lately, Microsoft has been turning the screws on Standard Edition users," SQL Server consultant Brent Ozar wrote. "We get less CPU power, less memory, and few (if any) new features."

He complained that organizations wishing to use more than 64GB of memory needed to step up to the more expensive Enterprise Edition (see feature comparison). After listing deficiencies of the Standard Edition--such as not allowing database snapshots, auditing, numerous business intelligence (BI) features and more--he enjoined readers to boycott the product:

"Microsoft won't change its tack on SQL Server licensing until you start leaving. Therefore, I need you to stop using SQL Server so they'll start making it better. You know, for me."

You should check out the Hacker News discussion (116 comments as of noon Wednesday). Even if you aren't interested in the squabbling--some readers dared to contend that 64GB was plenty of RAM for small businesses, for example--you can learn a lot about the nitty-gritty concerns and trials and tribulations of SQL Server users and developers on the front lines.

Or, as Ozar put it in an update to his post yesterday: "If you [DBAs] want to stay current on what startup developers think about databases for their new projects, HackerNews is a good reality check. It's a completely different perspective than the typical enterprise developer echo chamber."

What do you think about Microsoft licensing terms for SQL Server and limitations of the Standard Edition? Comment here or drop me a line.