Once you've got those two installed, it's time to get coding. As we wrapped up our last post using Flow, you could easily conclude that doing it that way was quite a bit of work, trying to "shove a square peg into a round hole," if you will. So, let's try and do it all with code, shall we?

And, of course, we'll now fill out the code accordingly. But first, let's think about the endpoints we're going to need.

Blob storage
We'll still need a place to store state for our Function - the last version we saw from the feed - because Azure Functions, too, are stateless.

Alerting mechanism
How do we plan on sending an alert from our Function? Sure, we can do all the logic to determine whether one is needed, but what does sending one look like? To accomplish this, we're going to tie a webhook-enabled Flow to our Function - easy peasy!

Set up an Alert endpoint in Microsoft Flow

Create a new Flow like we did last time, but for the first step, search for Request and choose Request/Response - Request:

Notice the cool part of this Trigger when it gets created in your Flow: "URL will be created after save" - that's right, Microsoft Flow will give you a unique URL to hit this Flow with and kick it off. For ours, we'll want to send a message to the Flow to use in the alert, so let's define our schema from this simple JSON object:

{
  "message": "our message"
}

by clicking the "Use sample payload to generate schema" link at the bottom of the Request action.

Next, set up the alert step. Click 'Add an action' and search for 'notification':

You can choose mobile, e-mail, or heck, both. I'll choose mobile for now. Set it up so the message for the mobile notification is whatever we send through to the Request in step 1:

Name & create the Flow and you're done!

After you click 'Create flow', expand the 'Request' trigger (Step 1) and notice you now have a URL filled in.

Click the 'copy' button and take that over to the local.settings.json in your new Azure Function project.

Azure Functions also requires a storage account of its own just to execute. You could reuse this account to store your variables, or set up something separate; I set up a separate one. Note that in our previous configuration for Microsoft Flow, you may have created a Blob Storage account (vs. General Purpose). It's worth noting Azure Functions requires a General Purpose storage account for its Webjob storage, so make sure you choose the right options when setting this up.
The connection string for the Webjob storage you set up must also go in the settings JSON file, in the variable

"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=..."

It's also not advisable to use the Storage Emulator with Azure Functions.

Next, add a local setting pointing to the same Azure Storage account you set up for the Flow we created in the last blog post (or set up a new account if you're doing this for the first time), and put its connection string into your local settings.
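A minimal local.settings.json might look like this - the setting names here are assumptions chosen to match the Application Settings we'll create in the portal later (PackageId, StorageConnectionString, FlowAlertEndpoint), and the truncated values are placeholders for your own:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=...",
    "StorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=...",
    "FlowAlertEndpoint": "https://...",
    "PackageId": "..."
  }
}
```

With those in place, our function can connect to storage and grab a reference to the blob that tracks the last version we saw: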

var storageClient = CloudStorageAccount.Parse(STORAGE_CONNECTION_STRING).CreateCloudBlobClient();
var container = storageClient.GetContainerReference(@"functionvars");
container.CreateIfNotExists();
// get the blob that contains the last version we saw for this package. We're naming it the same as our package id
var blob = container.GetBlockBlobReference(NUGET_PACKAGE_ID);
// step 4 goes here

What happens if it's the first run, and no blob exists yet? In my case, I chose to create & store the first version but not send an alert. You can choose to send an alert (using the upcoming alert code) if you want.

if (!blob.Exists())
{ // if we haven't processed this package before, just set our baseline version
log.Info(@"First time we've seen this package. Storing version.");
blob.UploadText(version);
}
// step 5 goes here

Otherwise, read back the last version we saw and compare it to the latest:

else
{
    var lastSeenVersion = new StreamReader(blob.OpenRead()).ReadToEnd();
    log.Info($@"Last version we saw was {lastSeenVersion}");
    if (!lastSeenVersion.Equals(version, StringComparison.OrdinalIgnoreCase))
    { // if the latest version of the pkg on nuget doesn't match the last one we've seen, it's new!
        log.Info($@"Notifying!");
        using (var notificationClient = new HttpClient { BaseAddress = FLOW_ALERT_ENDPOINT })
        {
            await notificationClient.PostAsJsonAsync(string.Empty, new { message = $@"New version of {NUGET_PACKAGE_ID} has been published to NuGet. Version {version}" });
        }
        // step 7 goes here
    }
}

Update the blob in storage

// update the last seen version in the associated blob entry of our Storage account
blob.UploadText(version);

And that's it! The finished product should look something like this:

[FunctionName("Nuget6HourAlert")]
public static async Task Run([TimerTrigger("0 0 */6 * * *")]TimerInfo myTimer, Microsoft.Azure.WebJobs.Host.TraceWriter log)
{
    dynamic nugetResults;
    using (var client = new HttpClient { BaseAddress = NUGET_QUERY_URI })
    {
        var resultsJson = await client.GetStringAsync(string.Empty);
        nugetResults = JObject.Parse(resultsJson);
    }

    // we only care about packages that *exactly* match the id, not others that might be returned from a search
    dynamic targetPackage = ((IEnumerable<dynamic>)nugetResults.data)
        .SingleOrDefault(i => ((string)i.id).Equals(NUGET_PACKAGE_ID, StringComparison.OrdinalIgnoreCase));

    if (targetPackage != null)
    {
        string version = targetPackage.version;
        log.Info($@"Package found. Latest version: {version}");

        var storageClient = CloudStorageAccount.Parse(STORAGE_CONNECTION_STRING).CreateCloudBlobClient();
        var container = storageClient.GetContainerReference(@"functionvars");
        container.CreateIfNotExists();

        // get the blob that contains the last version we saw for this package. We're naming it the same as our package id
        var blob = container.GetBlockBlobReference(NUGET_PACKAGE_ID);
        if (!blob.Exists())
        { // if we haven't processed this package before, just set our baseline version
            log.Info(@"First time we've seen this package. Storing version.");
            blob.UploadText(version);
        }
        else
        {
            var lastSeenVersion = new StreamReader(blob.OpenRead()).ReadToEnd();
            log.Info($@"Last version we saw was {lastSeenVersion}");
            if (!lastSeenVersion.Equals(version, StringComparison.OrdinalIgnoreCase))
            { // if the latest version of the pkg on nuget doesn't match the last one we've seen, it's new!
                log.Info($@"Notifying!");
                using (var notificationClient = new HttpClient { BaseAddress = FLOW_ALERT_ENDPOINT })
                {
                    await notificationClient.PostAsJsonAsync(string.Empty, new { message = $@"New version of {NUGET_PACKAGE_ID} has been published to NuGet. Version {version}" });
                }
                // update the last seen version in the associated blob entry of our Storage account
                blob.UploadText(version);
            }
        }
    }
    else
    {
        log.Info($@"No package found with id {NUGET_PACKAGE_ID}");
    }
}
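One thing the listing takes for granted is the set of readonly fields it references. Here's a sketch of how they might be declared - the setting names are assumptions chosen to match the Application Settings we use later, and the NuGet search URL is purely illustrative:

```csharp
// Assumed field declarations (a sketch, not the post's exact code). Setting names
// match PackageId, StorageConnectionString, and FlowAlertEndpoint from Application Settings.
private static readonly string NUGET_PACKAGE_ID =
    Environment.GetEnvironmentVariable("PackageId");
private static readonly string STORAGE_CONNECTION_STRING =
    Environment.GetEnvironmentVariable("StorageConnectionString");
private static readonly Uri FLOW_ALERT_ENDPOINT =
    new Uri(Environment.GetEnvironmentVariable("FlowAlertEndpoint"));
// Illustrative only: any NuGet search endpoint that can filter by package id works here.
private static readonly Uri NUGET_QUERY_URI =
    new Uri("https://api-v2v3search-0.nuget.org/query?q=packageid:" + NUGET_PACKAGE_ID);
```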

Publish your project to a new Azure Functions endpoint, and you're off and running!

The IDE-less approach

That was all well & good, but what about an even simpler way to create a serverless function? Let's see what we can do right within the Azure Portal.

Open the portal, click + and search 'functions'. Create a new Azure Functions instance.

Next, open your new Functions instance and add a new Timer function. For ours, we'll choose C#:

Since we're using the Azure Storage SDK in our function, we have to tell Azure Functions about that NuGet package. To do this, we simply upload a project.json that looks like this:
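A project.json for this just lists the NuGet dependency. The package version below is an assumption - use whichever version you built against in Visual Studio:

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "WindowsAzure.Storage": "8.1.1"
      }
    }
  }
}
```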

Upon doing this, you'll see the Functions console start a NuGet package restore.
When it's done, simply copy & paste the code from your Visual Studio function into the web IDE.

It won't work yet, though, because we haven't set the settings we're using in our GetEnvironmentVariable calls. The bonus here is that we can keep these separate from our Function code by setting them in the function's Application Settings area a la every other Azure App Service offering.

To do this, click on the top-level Azure Function node in the left tree, then click 'Application Settings'.

Next, put in the settings for PackageId, StorageConnectionString, and FlowAlertEndpoint and click 'Save'.

Next, tune the timer for our function! Click the 'Integrate' option below your function in the left-hand tree and change the timer to match what was in the attribute of our function in Visual Studio:
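Behind the scenes, the 'Integrate' blade is just editing the function's function.json. A sketch of what the timer binding ends up looking like, assuming the same schedule and parameter name as our Visual Studio attribute:

```json
{
  "bindings": [
    {
      "type": "timerTrigger",
      "name": "myTimer",
      "direction": "in",
      "schedule": "0 0 */6 * * *"
    }
  ],
  "disabled": false
}
```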

Pro Tip
If you want to test out your function, set this timer to something low, like 5 or 10 seconds (*/5 * * * * *), and watch the log output in the editor area (bottom portion of the screen).

The changes you need to make to the out of the box template are minimal:

Replace the template's usings block with this:

using Microsoft.WindowsAzure.Storage;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

Add the readonly variables above the method declaration.

Take the body of the function from Visual Studio and paste it into the body of this function.

Click 'Save' or 'Save and run' and your function starts executing immediately!

I hope you've found these approaches to serverless computing useful and can see how marrying two such offerings provides a quick & easy way to go from problem to solution and avoid all the infrastructure & setup headache.