Category Archives

The First Look series focuses on new products, recent announcements, previews, and things I have not yet had the time to look at, and serves as an introduction to the subject. First Look posts are fairly short and high level.

Today in First Look: Azure Stream Analytics. This service is currently in preview and provides low-latency, highly available, and scalable complex event processing (CEP) in the cloud. On premises, SQL Server offers StreamInsight for complex event processing; Azure Stream Analytics brings a CEP engine to the cloud.

Azure Stream Analytics is built to read data from a source, say a device sensor, and to collect, process, and act on it in the blink of an eye. A simple example would be to read temperatures from a sensor, aggregate them into an average temperature per 5 seconds, and store that data in SQL Server. A more complex example would be to take the output of a video camera that reads license plates, check each plate against a list of stolen cars, and immediately send a message to a police car nearby.
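For the temperature scenario, the Stream Analytics query could look something like the sketch below. The input alias SensorInput, the output alias SqlOutput, and the DeviceId and Temperature fields are placeholder names I chose for illustration; they map to whatever inputs, outputs, and event fields your job actually defines:

```sql
-- Average temperature per device over 5-second tumbling windows.
-- SensorInput and SqlOutput are the aliases of the job's configured
-- input (e.g. an Event Hub) and output (e.g. an Azure SQL table).
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature
INTO SqlOutput
FROM SensorInput
GROUP BY DeviceId, TumblingWindow(second, 5)
```

The tumbling window is what turns the continuous stream into discrete 5-second buckets, so each output row is the average for exactly one window.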

Because this solution is cloud-based, it is easy to get started. You can be up and running in literally minutes.


This is the first post of my new First Look series.

Today in First Look: Azure Data Factory. This service was only recently announced and is available to all Azure customers in preview. To get hold of it, make sure you open the Azure preview portal: in the classic Azure portal, click your email address in the top right and choose ‘Switch to new portal’, or go directly to https://portal.azure.com.

So what is Azure Data Factory? I may be downplaying it a bit, but essentially Data Factory gives you ETL in the cloud. It connects to both on-premises and cloud data stores and enables you to read data from those stores, apply transformations, and publish data back to stores, while at the same time providing rich analytics on how the data flow is doing. The paradigm here is a factory floor: pieces of data enter the factory floor as raw materials, undergo some treatment (transformations), and go out the door at the other end as finished product. The canvas of Data Factory closely resembles this floor and shows an assembly line for data. Here is a very simple example, which retrieves data in hourly batches from a table in Blob Storage and stores it in a SQL Azure table:
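In Data Factory, pipelines and the datasets (tables) they connect are defined as JSON. A minimal sketch of the copy pipeline for this example might look like the following; the pipeline, activity, and table names are placeholders I chose, and the exact JSON schema changed between preview versions, so treat this as illustrative only:

```json
{
  "name": "BlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyHourlyBatch",
        "type": "Copy",
        "inputs": [ { "name": "BlobInputTable" } ],
        "outputs": [ { "name": "SqlOutputTable" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        },
        "scheduler": { "frequency": "Hour", "interval": 1 }
      }
    ]
  }
}
```

The hourly cadence comes from the scheduler (together with the availability settings on the input and output tables), which is what slices the flow into the hourly batches mentioned above.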

Make sure to enter the path to the publish settings file you downloaded; you can see where it goes in the sketch below.
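A minimal sketch of such a stop script, using the classic Azure Service Management cmdlets (the publish settings path, like every name here, is a placeholder):

```powershell
# Authenticate using the downloaded publish settings file (placeholder path)
Import-AzurePublishSettingsFile "C:\credentials\my.publishsettings"

# Iterate over all subscriptions and stop every running VM
foreach ($subscription in Get-AzureSubscription) {
    Select-AzureSubscription -SubscriptionName $subscription.SubscriptionName

    # Running VMs report a status like "ReadyRole"
    Get-AzureVM | Where-Object { $_.Status -like "Ready*" } | ForEach-Object {
        Write-Host "Stopping VM: $($_.Name)"
        # -Force deallocates the last VM in a cloud service without prompting
        Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force
    }
}
```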

Basically, the script iterates over all your subscriptions (if you have more than one) and uses Get-AzureVM to look for VMs that are in a running state. For each running VM it echoes the name and then stops the VM using Stop-AzureVM.

Save the script and you can just run it; all of your VMs will be turned off. Pretty easy, huh?

For my BI demos I use a maximum of four VMs, and I made another script that starts them in the correct order (first the domain controller, then the SQL Server, and finally the two SharePoint servers I need):
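A minimal sketch of such a start script; the function name, the service name, and the VM names are placeholders:

```powershell
# Start a VM only if it is not already running
function Start-AzureVMIfStopped {
    param(
        [string]$ServiceName,
        [string]$Name
    )
    $vm = Get-AzureVM -ServiceName $ServiceName -Name $Name
    if ($vm.Status -like "Ready*") {
        Write-Host "$Name is already running"
    } else {
        Write-Host "Starting VM: $Name"
        Start-AzureVM -ServiceName $ServiceName -Name $Name
    }
}

# Start the VMs in dependency order (placeholder names)
Start-AzureVMIfStopped -ServiceName "myDemoService" -Name "DC01"
Start-AzureVMIfStopped -ServiceName "myDemoService" -Name "SQL01"
Start-AzureVMIfStopped -ServiceName "myDemoService" -Name "SP01"
Start-AzureVMIfStopped -ServiceName "myDemoService" -Name "SP02"
```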

This script defines a function that checks whether the VM is already running and starts it if not. The bottom part of the script uses that function to specify which VMs to start and in which order. I replaced the original names for security reasons.

This saves a lot of time: it saves me from logging into the Azure portal and starting/stopping each VM by hand. The best part is I can let this run in the background while presenting and nobody sees it. :)

The first two packages essentially contain script tasks with complete samples of how to work with Pig, Sqoop, and Hadoop jobs. The other packages use the provided components and offer a quick start on getting data out of and into Azure Blob Storage using SSIS.