A few days ago it was announced that a PowerBI content pack has been published for Azure Active Directory! So let’s take it for a spin today and see what it brings to the table.

Setting up the integration

This is one of the reasons I really like “cloud”! Integration has almost no entry barrier! Anyhow, in PowerBI, click on “Get Data”.

There are various articles, blog posts, and the like that compare Logic Apps vs Flow vs Functions vs Azure Automation, etc… Though there was one use case where I often struggled to pick ;

What to use when I want to retrieve a file from X on a Y timed interval?

Azure Functions? Great abstraction, though the output files get random names, and I often want to be able to control that.

Flow? Doesn’t allow the customization I was looking for; it’s more geared towards integrating existing / popular services.

Azure Automation? Very good and gets the job done. The only downside: you need to code a lot of the logic yourself.

Azure Logic Apps? Shows potential, but doesn’t let you include custom functions. Or does it…?!?

You can link Azure Functions to Logic Apps and create exactly the flow I was looking for.

The Flow

So what do I want to do?

On a daily basis

Retrieve content from an authenticated API

Save the content to Blob storage

And afterwards I’ll use other services to process that data. 🙂
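To make those two steps concrete, here is a minimal Python sketch of what the function behind such a flow could do. Everything specific here is an assumption for illustration: the endpoint URL, the Basic-auth scheme, and the `feeds/` naming convention are hypothetical, and the upload helper assumes an `azure-storage-blob` `ContainerClient` is handed in.

```python
import base64
import datetime
import urllib.request

# Hypothetical endpoint -- substitute the authenticated API you want to poll.
API_URL = "https://api.example.com/daily-feed"

def blob_name_for(day):
    """Deterministic, date-based blob name -- unlike the random names
    you get from an off-the-shelf function output binding."""
    return "feeds/{:%Y-%m-%d}.json".format(day)

def fetch_feed(url, user, password):
    """Retrieve the feed content using HTTP Basic authentication."""
    token = base64.b64encode("{}:{}".format(user, password).encode()).decode()
    request = urllib.request.Request(
        url, headers={"Authorization": "Basic " + token})
    with urllib.request.urlopen(request) as response:
        return response.read()

def save_daily_feed(container_client, payload, day=None):
    """Upload the payload under today's deterministic name.
    `container_client` is assumed to be an azure.storage.blob
    ContainerClient (not imported here, to keep the sketch stdlib-only)."""
    day = day or datetime.date.today()
    container_client.upload_blob(blob_name_for(day), payload, overwrite=True)
```

The point of the sketch is the naming: because the blob name is derived from the date, every daily run overwrites (or lines up next to) a predictable file, which downstream services can rely on.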

The Proof-of-Concept

What do I want to achieve? On a daily basis I want to retrieve data from a service provider that serves sports data. (And if you are looking for such a thing, check out MySportsFeed!) So back to our proof-of-concept: how will this look in Logic Apps?

Something I had on my to-do list for a while now was to post a proof-of-concept to you guys/gals about what BGP on Azure can entail… Now some of you might go, “BGP? What the hell is that?!?” Check out the following “CBT Micro Nugget”, as it gives a nice high-level description of what BGP is.

So why should you care? BGP can offer you a way to deal with advanced routing paths. This in turn can deliver resiliency to your business.

Proof-of-Concept Design

For today, we’ll be building the following setup ;

This will consist of the following components ;

Four virtual networks ; VNET001, VNET002, VNET003 & VNET004

Each VNET will have its own VPN Gateway. We’ll enable BGP on each VPN Gateway and give it its own ASN & peering address (unique within, and private to, our deployment). The VPN Gateways will be set to “RouteBased” routing and we’ll use the “Standard” SKU.

Each VPN Gateway will have two connections, one towards the “previous” gateway and one towards the “next”. Both connections in a pair will use the same shared key, and we’ll enable BGP on each connection as well.

We’ll deploy two systems into this PoC setup ;

System001 will reside in VNET001

System004 will reside in VNET004

To test our setup, we’ll execute the following scenario ;

Connect from system001 to system004 whilst our ring is complete => the green path will be followed

Connect from system001 to system004 whilst having deleted the connections between VPNGW001 & VPNGW004 => the yellow path will be followed
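Conceptually, what we expect BGP to do in this ring can be sketched without any Azure calls at all: with equal-weight links, route selection boils down to shortest-path, and deleting a link forces reconvergence over the longer way around. Below is a small stdlib-only simulation of exactly that scenario; the gateway names match the design above, but the graph logic is my illustration, not anything Azure executes.

```python
from collections import deque

def shortest_path(links, src, dst):
    """Breadth-first search returning one shortest path, or None."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def ring(n=4):
    """VPNGW001..VPNGW00n connected in a ring, both directions."""
    names = ["VPNGW%03d" % (i + 1) for i in range(n)]
    links = {name: set() for name in names}
    for i, name in enumerate(names):
        links[name].add(names[(i + 1) % n])
        links[names[(i + 1) % n]].add(name)
    return links

links = ring()
# Green path: ring intact, so the direct hop VPNGW001 -> VPNGW004 wins.
green = shortest_path(links, "VPNGW001", "VPNGW004")
# Yellow path: delete the VPNGW001<->VPNGW004 connections; traffic
# reconverges the long way around the ring.
links["VPNGW001"].discard("VPNGW004")
links["VPNGW004"].discard("VPNGW001")
yellow = shortest_path(links, "VPNGW001", "VPNGW004")
```

Real BGP adds hold timers, AS-path comparison, and per-gateway ASNs on top of this, but the resiliency payoff in our test scenario is exactly this re-route.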

Earlier we set up a basic IoT flow where we captured temperature & humidity and stored them in various outputs. My objective for this week was to create a new flow that would leverage one of those outputs and run anomaly detection on the data received. As this detection might take some time, I did not want to do it “in-line” with my current flow. So I’ve added a new one… which kinda looks like this.

The details of the Machine Learning part in combination with Stream Analytics will be for another post, as I’m still struggling a bit to get it fully operational. 😉 So today we’ll “just” cover the Machine Learning aspect of the flow.

Disclaimer

To be very clear up front… I’m by no means an expert at machine learning / big data / etc. In my quest to learn, I played around with Azure’s Machine Learning Studio, and I would like to share my experience with it. 😉
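To give a feel for what “anomaly detection on the data received” means in its simplest form, here is a crude stand-in for a trained model: flag any reading that sits more than a chosen number of standard deviations from the mean. This z-score check is my illustration only, not what Azure ML Studio trains; the temperature series is made up.

```python
import statistics

def zscore_anomalies(readings, threshold=2.0):
    """Return indices of readings whose distance from the mean exceeds
    `threshold` population standard deviations."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # a perfectly flat series has no outliers
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# A flat temperature series with one spike at index 8.
temperatures = [21.0, 21.5, 20.8, 21.2, 21.1, 20.9, 21.3, 21.0, 35.0, 21.1]
```

A trained model can do far better (seasonality, drift, multivariate signals), which is precisely why it is worth wiring Azure ML into the flow instead of hard-coding a rule like this.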

A few weeks back I posted a blog post on how you can leverage “serverless” components for IoT. Today I’ll show you what it would mean to replace the Azure Functions component in that post with Azure Stream Analytics.

Flow

The flow between device and event hub is untouched; we’ll just replace the Functions part with Azure Stream Analytics. So how will the end result look?

We’ll be using one Stream Analytics job to trigger three flows. One will store the data in Azure Table Storage, another will store it as a JSON file on Azure Blob Storage, and a third will stream it directly into a PowerBI dataset.
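The fan-out described above can be sketched in plain Python: one incoming telemetry event, shaped once per sink. The field names and the PartitionKey/RowKey convention are my assumptions for illustration, and the PowerBI streaming sink is left out since it amounts to a REST call against a dataset endpoint.

```python
import json

def to_table_entity(event):
    """Shape an event for Azure Table Storage: PartitionKey/RowKey
    plus flat properties (this keying convention is assumed)."""
    return {
        "PartitionKey": event["deviceId"],
        "RowKey": event["timestamp"],
        "temperature": event["temperature"],
        "humidity": event["humidity"],
    }

def to_blob_json(event):
    """Serialize an event as the JSON document the blob output would hold."""
    return json.dumps(event, sort_keys=True)

def fan_out(event, sinks):
    """Hand one incoming event to every configured sink, mirroring how a
    single Stream Analytics job feeds all of its outputs."""
    return [sink(event) for sink in sinks]

event = {"deviceId": "device001", "timestamp": "2017-05-01T12:00:00Z",
         "temperature": 21.3, "humidity": 48.0}
results = fan_out(event, [to_table_entity, to_blob_json])
```

In the actual job this routing is expressed declaratively (one query, multiple outputs) rather than in code, which is the whole appeal of swapping Functions for Stream Analytics here.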

So let’s take a look at all the components from within this Stream Analytics Flow we’ll be using…

Up in the Clouds

Views are my own

The content of this blog will, at all times, portray my own views. At no time will it reflect the views of the organization I am linked to. Nor can the information provided be used as a support statement.