I have recently been exploring the use of IaaS (Infrastructure as a Service) to provide cloud-based virtual machines (VMs) as an alternative to laptop-based VMs, and also PaaS (Platform as a Service) for SQL databases. I like the idea of carrying around a lighter, more portable laptop and using cloud services to help me day to day, in contrast to lugging a heavyweight workstation around for Hyper-V usage.

I know we can deploy VMs through the Azure Portal, but I prefer an automated approach. Fortunately, the new PowerShell cmdlets support Azure VM provisioning and Azure SQL database provisioning (plus some other nice interfaces). This enables me to quickly spin up a SQL Server VM or a SQL database in Azure. I am actually quite amazed at what is possible with PowerShell; it is my new best friend. PowerShell ISE in Windows 8 is superb, and I highly recommend it as a development environment thanks to the IntelliSense and the integrated cmdlet search pane.

Below is the PowerShell script I wrote to provision a VM. Unfortunately, some manual steps are still required if you wish to manage the SQL Server instance remotely through Management Studio, e.g. opening firewall ports, enabling TCP etc. This is all documented here: http://www.windowsazure.com/en-us/manage/windows/common-tasks/sql-server-on-a-vm/. Fortunately, it is a lot easier to connect to an Azure SQL database, and we can automate the registration of the provisioned instance in Management Studio.

Azure VM Provisioning (SQL Server 2012 Evaluation Edition)

Step 1. Download and register the Azure publishing certificate (a one-time event).

In order to use PowerShell with the Azure VM and SQL Database services, you will need to download and import the publish settings file. Fortunately, this is a simple process. I also store my certificate on SkyDrive so I can access it wherever I go in case I need it again.

Get-AzurePublishSettingsFile
Import-AzurePublishSettingsFile C:\...

If you don’t import the publish settings file, you may see an error similar to the one below when attempting to access the Azure services.

An error occurred while making the HTTP request to https://management.core.windows.net/ae81ecb1-a8af-4fb7-87c5-4418babb4ff2/services/sqlservers/servers/<server>?op=ResetPassword. This could be due to the fact that the server certificate is not configured properly with HTTP.SYS in the HTTPS case. This could also be caused by a mismatch of the security binding between the client and the server.

Step 2. View available subscriptions and set the correct Subscription

Incidentally, you may also need to associate a specific Azure Storage Vault (ASV) account with your subscription; for some reason the default value was null, so I had to allocate a specific account. The storage account is required to host the VM disks which are provisioned during the creation of an Azure VM image.
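
The commands below are a minimal sketch of this step; the subscription and storage account names are placeholders, and parameter names may vary slightly between releases of the Azure module.

# List the subscriptions imported from the publish settings file
Get-AzureSubscription

# Select the subscription to work with
Select-AzureSubscription -SubscriptionName "<subscriptionname>"

# Associate a storage account with the subscription (required to host the VM disks)
Set-AzureSubscription -SubscriptionName "<subscriptionname>" -CurrentStorageAccount "<storageaccountname>"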

Step 3. View the available Azure VM images and locations
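
A sketch of the cmdlets for this step is shown below; the label filter is purely illustrative.

# List the available data centre locations
Get-AzureLocation | Select-Object Name

# List the gallery images and pick out the SQL Server 2012 evaluation image
Get-AzureVMImage | Where-Object { $_.Label -like "*SQL Server 2012*" } | Select-Object ImageName, Label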

Step 4. Create an Azure VM

I like my VMs big! ExtraLarge! You can use New-AzureQuickVM, or the New-AzureVMConfig and New-AzureVM syntax. New-AzureQuickVM creates and provisions the VM automatically and does not require any additional steps.
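
For reference, here is a minimal, hedged sketch of the quick approach; the image name, service name, password and location are placeholders, and parameter names may differ slightly between versions of the Azure PowerShell module.

New-AzureQuickVM -Windows `
    -ServiceName "<servicename>" `
    -Name "<vmname>" `
    -ImageName "<sqlserver2012evaluationimagename>" `
    -Password "<adminpassword>" `
    -InstanceSize ExtraLarge `
    -Location "North Europe"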

If you see a DNS error then you may be trying to provision under a duplicate service name. The "DNS name already exists" error is misleading: it actually refers to a duplicate service name, so change the -ServiceName parameter. The service name is how you reference and connect to the VM, e.g. <vmservice>.cloudapp.net.

Step 6. Start the Azure VM

Start the Azure VM using the command below.

Start-AzureVM -ServiceName "<servicename>" -Name "<vmname>"

You can view the properties of your Azure VMs by using Get-AzureVM -ServiceName <ServiceName>. I would also add that additional data disks can be added using the Add-AzureDataDisk syntax, providing a simple way to increase the capacity of the provisioned Azure VM instance.
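
As a sketch (the disk size, label and LUN below are arbitrary examples), an empty data disk can be attached to the provisioned VM and the configuration applied with Update-AzureVM:

Get-AzureVM -ServiceName "<servicename>" -Name "<vmname>" |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 100 -DiskLabel "DataDisk1" -LUN 0 |
    Update-AzureVM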

It is also possible to automatically launch and connect to your VM instance using the standard Remote Desktop client, mstsc.exe, which eliminates another step in the process of connecting to your Azure VM.

mstsc $ENV:userprofile\Desktop\myAzureVm1.rdp
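
If the .rdp file does not already exist locally, Get-AzureRemoteDesktopFile will download one for the VM; the path below is simply a sketch that mirrors the example above.

Get-AzureRemoteDesktopFile -ServiceName "<servicename>" -Name "<vmname>" -LocalPath "$ENV:userprofile\Desktop\myAzureVm1.rdp"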

Of course you could parameterise all of this to make life even simpler.

Step 8. Connect!

Simply enter the username and password and off you go. You could create multiple PowerShell batch files, one for each Azure VM image type. I am sticking with the SQL Server 2012 instance for now.

Teardown

Cleaning up the environment is simple too: just a couple of PowerShell commands to stop and remove the provisioned VM.

Step 9. Stop the Azure VM

Stop-AzureVM -ServiceName "<ServiceName>" -Name "<VmName>"

Step 10. Remove/delete the Azure VM

Remove-AzureVM -ServiceName "<ServiceName>" -Name "<VmName>"

Step 11. Remove/delete the Azure VM disks

The VHDs associated with the image are not automatically removed, so you will need to issue the Remove-AzureDisk command. You can view the existing VHDs and the associated image and container using the Get-AzureDisk command as shown below. You will notice that I only have one disk (VHD) associated with an image; the other VHDs were from previous Azure VM deployments.
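
A sketch of the listing I use is below; the selected properties are those I have found most useful on the objects Get-AzureDisk returns.

Get-AzureDisk | Select-Object DiskName, AttachedTo, MediaLink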

Removing (deleting) the VHD is simple. The -DeleteVHD parameter is required if you wish to permanently delete the VHD from ASV, so use it with caution!

Remove-AzureDisk -DiskName <diskname> -DeleteVHD

Step 12. Remove the allocated cloud service
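
A minimal sketch of this step is below; the -Force switch suppresses the confirmation prompt.

Remove-AzureService -ServiceName "<servicename>" -Force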

Azure SQL Database Provisioning

On a related note, it is also possible to provision a Windows Azure SQL Database using the new PowerShell cmdlets, allowing me to rapidly deploy a cloud-based relational data store. You must register the Azure publishing certificate unless this has been done previously (as in step 1 above).

As I was working through this, I discovered a VERY useful option which pops up a dialog with the help for a specific cmdlet; an example is shown below. Omitting the -ShowWindow parameter will output the help details to the console window.

Get-Help Set-AzureSqlDatabase -ShowWindow

The next step was to create a SQL Server authenticated connection to the server hosting the Windows Azure SQL Database. This is an important step as it establishes the context for the connection.
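
A hedged sketch of this step is shown below; the server is created first if needed, and the login, password, location and database name are placeholders.

# Create a new Azure SQL Database server (skip if you already have one)
$server = New-AzureSqlDatabaseServer -AdministratorLogin "<adminlogin>" -AdministratorLoginPassword "<password>" -Location "North Europe"

# Establish a SQL Server authenticated connection context against the server
$credential = Get-Credential
$context = New-AzureSqlDatabaseServerContext -ServerName $server.ServerName -Credential $credential

# Provision a database under that context
New-AzureSqlDatabase -ConnectionContext $context -DatabaseName "<databasename>" -Edition Web -MaxSizeGB 1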

Interestingly, there are a host of DataServiceContext class options made available under the context of the connection, such as ServerMetrics and DatabaseMetrics. These provide some interesting insight into the metadata for your Azure SQL Database server, such as throttled connections and failures. Unfortunately, the context commands are not documented right now, so this is just exploratory and the exposed properties may be removed in the future.

Beyond the ability to provision a Windows Azure SQL Database using PowerShell cmdlets, I can also save time by automatically registering the Azure SQL instance in SQL Server Management Studio, by invoking the SQL Server 2012 PowerShell command New-Item as below. The AzureSqlDbServer1 reference is the friendly name which appears in the SQL Server Management Studio console.
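
As a sketch of the sort of New-Item call involved (run from within sqlps; the registration path and connection string format here are assumptions worth verifying with Get-Help New-Item):

# Register the Azure SQL Database server in Management Studio (run inside sqlps)
Set-Location "SQLSERVER:\SQLRegistration\Database Engine Server Group"
New-Item -Name "AzureSqlDbServer1" -ItemType Registration -Value "server=tcp:<servername>.database.windows.net;integrated security=false"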

Teardown

Removing (or de-provisioning) the Azure SQL database, the server and the Management Studio registration is simple. The last command, Remove-Item, is a SQL Server PowerShell command which deletes the Management Studio server registration, and it must be invoked using sqlps as above.
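
For completeness, a sketch of the teardown commands is below; the names are placeholders and mirror the provisioning sketch above.

# Remove the database and then the server
Remove-AzureSqlDatabase -ConnectionContext $context -DatabaseName "<databasename>" -Force
Remove-AzureSqlDatabaseServer -ServerName $server.ServerName -Force

# Remove the Management Studio registration (run inside sqlps)
Remove-Item "SQLSERVER:\SQLRegistration\Database Engine Server Group\AzureSqlDbServer1"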

Closing Remarks

The new PowerShell cmdlets for Azure are a fantastic way to easily provision either VMs or a database in the cloud. I will be parameterising my scripts (and including try..catch blocks) to quickly create a Windows Azure Virtual Machine or Windows Azure SQL Database as needed (one-click deployment made easy!). PowerShell ISE is also an excellent development environment which can be leveraged not only for Azure VM or SQL database provisioning but also for many more solution scenarios.

What is missing?

Provisioning Azure HDInsight clusters is not currently possible; however, this should be coming soon: http://hadoopsdk.codeplex.com (refer to Programmatic Cluster Management).

PowerShell cmdlets for Azure SQL Reporting are not currently available.

The ability to provision an Azure VM image with the full business intelligence stack deployed, i.e. SharePoint 2013, Power View and PowerPivot integration.

PowerShell remoting to be automatically enabled in the Azure VM.

The ability to invoke SqlCmd and query Azure SQL databases through PowerShell (inc. support for Federations).

Building a data mart to analyse web log traffic

Loading IIS logs (text files with fixed-width spacing) was fairly challenging, as the column definitions would alter throughout the files. I therefore used the .NET StreamReader class to read the metadata and detect changes in the fields.

The data flow is shown below.

A snippet of the code in the Load Data (Source Script Component) is presented below.

Code Snippet

// Get variables
strSourceFile = Variables.vCurrentSourceFileName;
intAuditLogFileIdentifier = Variables.vAuditExecutionFileIdentifier;

try
{
    // Create an instance of StreamReader to read from a file.
    // The using statement also closes the StreamReader.
    using (StreamReader sr = new StreamReader(strSourceFile))
    {
        String line;
        int intNumberOfFields = 0;
        string[] strListOfFields = null;

        Trace.WriteLine("Log File: " + strSourceFile);

        // Output the source file name as the first line (debugging purposes)
        OutputLogFileRawDataBuffer.AddRow();
        OutputLogFileRawDataBuffer.colRawData = strSourceFile;

        // Read lines from the file until the end of the file is reached.
        while ((line = sr.ReadLine()) != null)
        {
            // ... each line is split into fields based on the fixed spacing
            // (remainder of the snippet not shown)
        }
    }
}
catch (Exception ex)
{
    Trace.WriteLine(ex.Message);
}

Extracting the data from the file was relatively straightforward. I placed the string into an array based on the fixed spacing between fields. From the data, I was able to extract useful information such as browser type, e.g. Safari, Chrome or IE, and even the browser version.

IP addresses were mapped to geolocations using the free GeoLite data from http://www.maxmind.com/app/geolite (CSV data imported into the database). I converted longitude and latitude to the spatial geography data type and presented this against a world map (I wanted to validate that the GeoLite data correctly mapped IP addresses to countries and cities).

USE BiKitchen;

DECLARE @SRID int = 4326
DECLARE @pLat nvarchar(max)
DECLARE @pLong nvarchar(max)
DECLARE @g geography

-- Check longitude and latitude for London
SET @pLat = (SELECT CAST(Latitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)
SET @pLong = (SELECT CAST(longitude AS nvarchar(max)) FROM [GeoData].[GeoLiteCity-Location] WHERE locid = 13547)

SET @g = geography::STPointFromText('POINT(' +
    @pLong + ' ' +
    @pLat + ')', @SRID).BufferWithTolerance(11000, 1000, 0)

SELECT @pLat AS Latitude, @pLong AS Longitude
-- SELECT @g.Lat, @g.Long

-- Map the geography type to base world map data
-- View the result in the spatial tab to validate coordinates
SELECT @g AS spatiallocation
UNION ALL SELECT geog FROM World_Borders

The star schema was built and deployed in SQL Server 2012 Analysis Services (UDM). I found named calculations to be an incredibly powerful way of extending the data model and making attributes more meaningful for end users.

The data was presented using Excel 2010; a screenshot is shown below. I found slicers to be extremely useful.

I thought it would be interesting to see what Excel 2013 had to offer, so I tried to create a Power View report, but this is not currently supported against the UDM. There are, however, some nice enhancements to chart types, so I’ll be looking at this in more detail.

Redux: Using an SSIS package to monitor and archive the default trace file

(Originally posted 4 July 2012: https://blogs.msdn.microsoft.com/benjones/2012/07/04/redux-using-an-ssis-package-to-monitor-and-archive-the-default-trace-file/)

I’ve recently been asked for details regarding the SSIS package I wrote to copy and archive the default trace file which is generated by SQL Server. The contents of the file can be quite useful for troubleshooting or diagnosis purposes.

I’ve updated the package to work with SQL Server 2008 R2 and SQL Server 2012.

The detection of a new trace file is implemented using a Script Task which watches the \Log folder for new files. When a new file is detected, it copies the previous file to an archive location. The logic is written in Visual Basic .NET (not my personal choice, but I was limited to it in SQL Server 2005 when I originally wrote the package).

The archive process renames the file with the date and time and then copies the file to a chosen location. I should point out that I use expressions on a few variables to alter outputs such as the filename i.e. date_time_filename.

I also noticed that the service needs appropriate permissions both to access the \LOG directory and to copy to the target directory; in my scenario this was \\server\share. When I was testing, I launched SSDT (SQL Server Data Tools) with Administrator privileges as a quick workaround to permission issues.

Here is the code for the Script Task (apologies for the word wrap; the Live Writer plug-in seems to do this to fit it on the page). I have commented out some of the WriteLine commands I was using to debug the package when it was initially developed.

Code Snippet

'Disclaimer:
'The sample scripts and SSIS package are not supported under any Microsoft standard support program or service.
'The sample scripts and SSIS package are provided AS IS without warranty of any kind.
'Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose.
'The entire risk arising out of the use or performance of the sample scripts and documentation remains with you.
'In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits,
'business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Redux: Using a C# script task in SSIS to download a file over http

(Originally posted 2 July 2012: https://blogs.msdn.microsoft.com/benjones/2012/07/02/redux-using-a-c-script-task-in-ssis-to-download-a-file-over-http/)

A few people have asked for further information about the C# script task which I blogged about quite a while ago. I forgot to add the full source code at the time; sorry everyone. Here is the link to the original blog post: http://blogs.msdn.com/benjones/archive/2009/03/29/using-a-c-script-task-in-ssis-to-download-a-file-over-http.aspx

I have since imported the SSIS package into Visual Studio 2010 (BIDS) and the code compiles without error. Some of the code below is truncated on the right (just a formatting issue I need to resolve) but the core of the code is there.

The files are extracted using an Execute Process Task (with 7-Zip) as shown below:

And the arguments are set using the expression (below). There are probably better ways of doing this but I found this worked well.

The .zip file is then archived using a File System task and the extracted file is renamed to .xlsx.

Are you interested in Data Science?

(Originally posted 2 July 2012: https://blogs.msdn.microsoft.com/benjones/2012/07/02/are-you-interested-in-data-science/)

The University of Dundee is now gauging interest in a number of data science focused courses (see below).

Testing SQL Server HA using Hyper-V in Windows 8 Release Preview

(Originally posted 21 June 2012: https://blogs.msdn.microsoft.com/benjones/2012/06/21/testing-sql-server-ha-using-hyper-v-in-windows-8-release-preview/)

I often find the need to build and test SQL Server clusters to support native two-node or N+1 (multi-instance) scenarios, and also to validate SQL Server 2012 AlwaysOn configurations. I was previously running Windows Server 2008 R2 on my laptop, but I had a few issues using this as a day-to-day operating system, e.g. no support for standby when Hyper-V was enabled, no dual boot (Hyper-V on/off) with BitLocker enabled, no Bluetooth support etc. Fortunately, Windows 8 meets all my needs and the experience has been excellent to date.

Note: for rapid provisioning of other images, I created a fully patched base OS image which was sysprep’d (this is incredibly easy). I now just copy the image if I want to create other server roles e.g. System Center 2012.
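
As an illustration of that workflow (the paths, names and memory size below are made up, and this assumes the Hyper-V PowerShell module that ships with Windows 8):

# Copy the sysprep'd base VHD and attach it to a new guest
Copy-Item "D:\VMs\Base\W2K8R2-sysprep.vhd" "D:\VMs\SC2012\SC2012.vhd"
New-VM -Name "SC2012" -MemoryStartupBytes 4GB -VHDPath "D:\VMs\SC2012\SC2012.vhd" -SwitchName "Private1"
Start-VM -Name "SC2012"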

The only issue I have found to date is that host internet connectivity is affected (delayed) after defining multiple internal network adapters, so I switched to using private network adapters. I only need internal adapters for host-to-guest connectivity, e.g. copying files.

My hardware and software are listed below:

Laptop HP 8540w (16GB memory, 4 cores hyper-threaded)

Two internal SATA disks (hybrid)

Windows 8 Release Preview (x64) with Hyper-V enabled

The environment consists of the following virtual guest images:

1 x Domain Controller (also hosts the virtual storage)

1 x Primary Node (node 1)

1 x Secondary Node (node 2)

All servers are running Windows Server 2008 R2 Enterprise Edition x64 with Service Pack 1. The cluster role is enabled on node 1 and node 2.

Storage

The shared storage is provisioned using the iSCSI Software Target, which I configured on the domain controller; I didn’t find a need to dedicate a specific storage server. A screenshot of the virtual storage is presented below.

Cluster validation is good; the only warning was inconsistent OS patch levels on the two cluster nodes, as shown below.

After running cluster validation, I created a cluster and the final configuration is shown below.

Below is a screenshot of the raw cluster configuration.

I’ll blog more following the SQL Server install. I also plan to repeat this for Windows Server 2012 and SQL Server 2012.

Transposing Columns onto Rows

(Originally posted 7 June 2012: https://blogs.msdn.microsoft.com/benjones/2012/06/07/transposing-columns-onto-rows/)

After a long period of absence, I have returned to posting some blog articles. This one popped up last week from an Oracle DBA while I was onsite with a customer. The conversation went something along the lines of:

Oracle DBA: “How can I place column values on rows? I have a limited page width and need to alter the output.”

BI Service Applications in SharePoint 2010

(Originally posted 29 November 2011: https://blogs.msdn.microsoft.com/benjones/2011/11/29/bi-service-applications-in-sharepoint-2010/)

One of my colleagues, Chris Bailiss, has written a number of great articles describing how BI service applications authenticate in SharePoint 2010. If you are interested then head over to part 1 (link below) now: