Steve Jones is the editor of SQLServerCentral.com and visits a wide variety of data-related topics in his daily editorial. Steve has spent years working as a DBA and general-purpose Windows administrator, working primarily with SQL Server since it was ported from Sybase in 1990. You can follow Steve on Twitter at twitter.com/way0utwest.

Bob Muglia, President of the Server and Tools business at Microsoft, oversees Windows Server, System Center, Hyper-V, and, of course, SQL Server. Bob was the product manager when SQL Server was announced by Microsoft in 1988, and he actually brought a box of Ashton-Tate/Microsoft SQL Server 1.0 on stage. I never used that product, but I did use it a couple of years later, in 1991, when it was the Sybase port, Microsoft SQL Server v4.2 (on OS/2).

Bob's talk is about how Microsoft has grown up and is scaling up. There was a rack on stage that was mostly full. It looked like seven 4U boxes, but Bob says it's a single server, so I'm guessing there are lots of disks in there.

The demo shows 64 CPUs being used, with a workload that's almost pegging the CPU. An application controls the work being done and the number of CPUs. When the count is increased to 128 CPUs, a new high, the CPU utilization goes down. That's impressive, but we know there's more. Bob has hinted that 192 CPUs are now possible for SQL Server 2008 R2. The workload increases to peg the box again, but then the count is raised to 192 CPUs and the utilization comes down.
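If you want to see how many CPUs your own instance can see, the DMVs give a quick view. A minimal sketch, runnable against any SQL Server 2005 or later instance (nothing near 192 CPUs required):

```sql
-- How many logical CPUs does this instance see?
SELECT cpu_count,                                  -- logical CPUs visible to SQL Server
       hyperthread_ratio,                          -- logical-to-physical ratio
       cpu_count / hyperthread_ratio AS physical_sockets_or_cores
FROM   sys.dm_os_sys_info;

-- SQL Server creates one scheduler per logical CPU
-- (plus hidden internal schedulers, filtered out here).
SELECT COUNT(*) AS visible_online_schedulers
FROM   sys.dm_os_schedulers
WHERE  status = 'VISIBLE ONLINE';
```

The two numbers should normally match; schedulers taken offline by an affinity mask or licensing show up with a different status.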

It's staged, but it's still impressive at the high end. If you have a big workload, and a big, big checkbook, you can go to 192 CPUs. Not many people can get there, but I do think this will mean that the 64-, 32-, and 16-CPU boxes will come down in price.

A new benchmark world record: TPC-E at 2,012 tpsE, an overall record and a price/performance record, with x64 and IA64 Windows and SQL Server. In the data warehouse space, the record on Windows is now a TPC-H 3TB warehouse at 102,778 QphH. With Microsoft's own Dynamics product, with 20,000 users, there is sub-second response. These are marketing numbers, but they are still pushing the limits.

One very interesting note on the release of SQL Server 2008 R2 might have slipped by. The press release and the benchmark slide show an availability date of 5/6/2010. I wonder if that is the expected release date for the product.

As memory and flash disks become more prevalent, data professionals become more important. I tend to agree with that. So many people have worried that an easier SQL Server means less need for DBAs; I think it means more opportunities, because we can do more things that have greater impact. Bob mentions he sees our role expanding because data is so critical to organizations.

Shifts in management

Data centers are traditionally utilized at less than 15%, but there are well-known ways to manage large numbers of servers, and virtualization is one of them: easier management, higher flexibility in the data center, and reduced costs. The database server is one of the last types of servers to be virtualized, but it will come.

Consistent and coherent access to data is important for all applications.

A demo of Hyper-V, Windows Server 2008 R2, and Virtual Machine Manager 2008 shows live migration of a SQL Server: moving a virtual SQL Server from one physical box to the next. The demo puts a live load on the server by running a stored procedure against it, and we can see it running in the background as an app. By selecting "migrate", then "next", then "move", the virtual machine moves. The stored procedure keeps running, with no interruption to the app calling it.
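You could reproduce the "continuous load" side of this demo with any looping call against the server. A minimal sketch (the procedure name dbo.DoWork is hypothetical, a stand-in for whatever the demo ran; substitute your own):

```sql
-- Hypothetical continuous load: call a procedure once a second and
-- watch for errors or timestamp gaps while the VM is live-migrated.
WHILE 1 = 1
BEGIN
    EXEC dbo.DoWork;   -- any cheap procedure on the migrating server
    PRINT CONVERT(varchar(30), SYSDATETIME(), 121);   -- timestamp each call
    WAITFOR DELAY '00:00:01';
END;
```

If the migration is truly transparent, the timestamps keep printing at one-second intervals with no connection errors in between.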

Bob talks about Hyper-V being close to VMware in performance. However, I've seen people report performance issues with VMware, so I'm not sure that this is a big deal.

Clouds

Private clouds are a way to provide a computing resource. I like the idea of clouds inside a company: a computing resource that you deploy to. Bob mentions this is a way to decrease management costs and scale out an application as needed. The example given is the "giving" application at Microsoft, a way for employees to determine charitable contributions once a year. There's an elastic computing capability that allows the app to live on 2 computers most of the year, but when needed, at the once-a-year time when it's pushed to employees, it can grow to 24 computers for a few days.

This is a good model for elastic computing: grow resources as they are needed. Microsoft is looking to have companies build private clouds inside their organizations first. I agree with that; learn to scale these things to tens of machines. Once we know how to do that, then perhaps public clouds are more likely. Private clouds remain the option for businesses that need to keep their data secure.

SQL Azure is being used inside Microsoft. I'd love to see detailed case studies, with code, from them on how this is being used and in what places. The idea is good for limited scope items. Or maybe it's much further advanced than I'm aware of. I think I have some research to do.

The role of the DBA

The data we manage is at the center of our organizations. The opportunity for DBAs is to take our skill sets and leverage them to solve new problems. Perhaps problems that you couldn't solve before, or didn't have time to solve.

I think that's the "spin", but it's also a truth. If you are spending time managing the details, the minutiae of keeping a server running, you are making a mistake.