Using WebDeploy in vNext Releases

A few months ago Release Management (RM) Update 3 preview was released. One of the big features in that release was the ability to deploy without agents using PowerShell DSC. Once I saw this feature, I started a journey to see how far I could take deployments using this amazing technology. I had to learn how DSC worked, and from there I had to figure out how to use DSC with RM! The ride was a bit rocky at first, but I feel comfortable with what I am able to do using RM with PowerShell DSC.

Readying Environments for Deployment

In my mind there were two distinct steps that I wanted to be able to manage using RM/DSC:

Configure an environment (set of machines) to make them ready to run my application

Deploy my application to these servers

The RM/DSC posts I’ve blogged so far deal with readying the environment:

So we’re now at a point where we can ensure that the machines we want to deploy our application to are ready for it. In the case of a SQL server, SQL is installed and configured correctly; in the case of a webserver, IIS is installed and configured, additional runtimes (like MVC) are present, and Webdeploy is installed with its ports opened so that I can deploy using Webdeploy. So how do I then deploy my application?

Good Packages

Good deployment always begins with good packages. To get a good package, you’ll need an automated build that ties into source control (and hopefully work items) and performs automated unit testing with coverage. This gives you some metrics as to the quality of your builds. The next critical piece is to make sure that you can manage multiple configurations – after all, you’ll want to deploy the same package to Production that you deployed and tested in UAT, so the package shouldn’t have configuration hard-coded in. In my agent-based Webdeploy/RM post, I show how you can create a team build that puts placeholders into the SetParameters.xml file, so that you can put in environment-specific values when you deploy. The package I created for that deployment process can be used for deployment via DSC as well – showing that if you create a good package during build, you have more release options available to you.

Besides the package, you’ll want to source control your DSC scripts. This way you can track changes that you make to your scripts over time. Also, having the scripts “travel” with your binaries means you only have to look in one location to find both deployment packages (or binaries) and the scripts you need to deploy them. Here’s how I organized my website and scripts in TF Version Control:

The actual solution (with my websites, libraries and database schema project) is in the FabrikamFiber.CallCenter folder. I have some 3rd party libraries that are checked into the lib folder. The build folder has some utilities for running the build (like the xunit test adapter). And you can also see the DscScripts folder where I keep the scripts for deploying this application.

By default on a team build, only compiled output is placed into the drop folder – you don’t typically get any source code. I haven’t included the scripts in my solution or projects, so I used a post-build script to copy the scripts from the source folder to the bin folder during the build – the build then copies everything in the bin folder to the drop folder. You could use this technique if you wanted to share scripts with multiple solutions – in that case you’d have the scripts in a higher level folder in SC. Here’s the script:
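Something along these lines (a sketch, since the original embedded script isn’t reproduced here; the TF_BUILD environment variables are real team build variables, but the parameter names and xcopy switches are assumptions):

```powershell
param(
    [string]$srcPath = $env:TF_BUILD_SOURCESDIRECTORY,
    [string]$binPath = $env:TF_BUILD_BINARIESDIRECTORY,
    [string]$pathToCopy
)

# build the full source and target paths
$sourcePath = Join-Path $srcPath $pathToCopy
$targetPath = Join-Path $binPath $pathToCopy

# make sure the target path exists
if (-not (Test-Path $targetPath)) {
    New-Item -ItemType Directory -Path $targetPath | Out-Null
}

& xcopy /s /e /y "$sourcePath" "$targetPath\" | Write-Verbose
```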

Lines 2-3: you can use the $env parameters that get set when team build executes a custom script. Here I am using the sources and binaries directory settings.

Line 4: the subfolder to copy from the $srcPath to the $binPath.

Line 12-14: ensure that the target path exists.

Line 16: xcopy the files to the target folder.

Calling the script with $pathToCopy set to DscScripts will result in my DSC scripts being copied to the drop folder along with my build binaries. Using the TFVC 2013 default template, here’s what my advanced build parameters look like:

The MSBuild arguments build a Webdeploy package for me. The profile (specified when you right-click the project and select “Publish”) also inserts RM placeholders into environment specific settings (like connection strings, for example). I don’t hard-code the values since this same package can be deployed to multiple environments. Later we’ll see how the actual values replace the tokens at deploy time.

The post-build script is the script above, and I pass “-pathToCopy DscScripts” to the script in order to copy the scripts to the bin (and ultimately the drop) folder.

The DscScripts folder has all the scripts I need to deploy this application.

The FabrikamFiber.Schema.dacpac is the binary of my database schema project.

The _PublishedWebsites folder contains 2 folders: the “xcopyable” site (which I ignore) and the FabrikamFiber.Web_package folder which is shown on the right in the figure above, containing the cmd file to execute WebDeploy, the SetParameters.xml file for configuration and the zip file containing the compiled site.

Note the “__” (double underscore) pre- and post-fix, making SiteName and FabFiberExpressConStr parameters that I can use in both agent-based and agent-less deployments.
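For illustration, a tokenized SetParameters.xml might look like this (the `setParameter` names are assumptions; only the `__SiteName__` and `__FabFiberExpressConStr__` token names come from the build setup above):

```xml
<?xml version="1.0" encoding="utf-8"?>
<parameters>
  <setParameter name="IIS Web Application Name" value="__SiteName__" />
  <setParameter name="FabFiberExpress-Web.config Connection String"
                value="__FabFiberExpressConStr__" />
</parameters>
```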

Now that all the binaries and scripts are together, we can look at how to do the deployment.

Deploying a DacPac

To deploy the database component of my application, I want to use the DacPac (the compiled output of my SSDT project). The DacPac is a “compiled model” of how I want the database to look. To deploy a DacPac, you invoke sqlpackage.exe (installed with SQL Server Tools when you install and configure SQL Server). SqlPackage then reverse engineers the target database (the database you’re deploying the model to) into another model, does a compare and produces a diff script. You can also make SqlPackage run the script (which will make the target database look exactly like the DacPac model you compiled your project into).
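As a sketch, the SqlPackage invocation looks something like this (the install path, file locations and connection string are illustrative assumptions, not the values from my environment):

```powershell
# publish the DacPac model incrementally to the target database
$sqlPackage = "C:\Program Files\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"
& $sqlPackage /a:Publish `
    /sf:"C:\temp\Database\FabrikamFiber.Schema.dacpac" `
    /tcs:"Data Source=localhost;Initial Catalog=FabrikamFiber;Integrated Security=SSPI" |
    Write-Verbose
```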

To do this inside a DSC script, I implement a “Script” resource. The Script resource has 3 parts: a Get-Script, a Set-Script and a Test-Script. The Get-Script is executed when you run DSC in interrogative mode – it won’t change the state of the target node at all. The Test-Script is used to determine if any action must be taken – if it returns $true, then no action is taken (the target is already in the desired state). If the Test-Script returns $false, the target node is not in the desired state and the Set-Script is invoked to bring it into the desired state.

A Note on Script Resource Parameters

A caveat here though: the Script resource can be a bit confusing in terms of parameters. The DSC script actually has 2 “phases” – first, the PowerShell script is “compiled” into a mof file. This file is then pushed to the target server and executed during the “deploy” phase. The parameters that you use in the configuration script are available on the RM server at “compile” time, while parameters in the Script resources are only available on the target node during “deploy” time. That means that you can’t pass a parameter from the config file “into” the Script resource – all parameters in the Script resource need to be hard-coded or calculated on the target node at execution time.

Here the intent is to have a parameter called $logLocation that we pass into the config script. When you see this script, it seems to make perfect sense – however, while the log will show the message “The log location is [c:\temp]”, for example (line 11), when the Set-Script of the Script resource runs on the target node, you’ll see the message “Log location is []” (Line 20). Why? Because the $logLocation parameter does not exist when this script is run at deploy time on the target node. The parameter is available to the Log resource (or other resources like File) but won’t be to the Script resource. You will be able to create other parameters “at deploy time” (like $localParam on Line 21). This is frustrating, but kind of understandable. The Script resource script blocks are not evaluated for parameters. I found a string manipulation hack that allows you to fudge config parameters into the script blocks, but decided against using it.
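To illustrate the behavior, here’s a minimal sketch (not the script from the original figure; the node name and messages are made up):

```powershell
Configuration ParamScopeDemo {
    param([string]$logLocation)

    Node "targetNode" {
        Log CompileTimeLog {
            # $logLocation IS available here - this is evaluated at "compile" time
            Message = "The log location is [$logLocation]"
        }

        Script DeployTimeScript {
            GetScript  = { @{} }
            TestScript = { $false }
            SetScript  = {
                # runs on the target node at deploy time: $logLocation is gone,
                # so this logs "Log location is []"
                $localParam = "created at deploy time"   # this works fine
                Write-Verbose "Log location is [$logLocation]; local is [$localParam]"
            }
        }
    }
}
```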

ConfigData

Before we look at the DSC script used to deploy the database, I need to show you my configData script:

Line 1: When running from the command line, you just specify a hash-table. DSC requires this hash-table to be put into a variable. I have both in the script (though I default to the format RM requires) just so that I can test the script outside of RM.

Line 3: AllNodes is a hash-table of all the nodes I want to affect with my configuration scripts.

Lines 5/6 – common properties for all nodes (the name is “*” so DSC applies these properties to all nodes).

Line 10/11 and 15/16: I specify the nodes I have as well as a Role property. This is so that I can deploy the same configuration to multiple servers that have the same role (like a web farm for example).

You can specify other parameters, each with another value for each server.
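Putting those notes together, the shape of the configData script is something like this (the node names are placeholders, and the common property under "*" is an assumption for illustration):

```powershell
# hash-table in a variable, as RM requires
$configData = @{
    AllNodes = @(
        @{
            # "*" means these properties apply to all nodes
            NodeName                    = "*"
            PSDscAllowPlainTextPassword = $true
        },
        @{
            NodeName = "fabfiberdb"
            Role     = "SqlServer"
        },
        @{
            NodeName = "fabfiberserver"
            Role     = "WebServer"
        }
    )
}
```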

Line 7: I need a parameter to tell me where the DacPac is – this will be my build drops folder.

Line 10: I specify the node I want to bring into the desired state. I wanted to apply this config to all nodes that have the role “SqlServer” and this worked from the command line – for some reason I couldn’t get it to work with RM, so I hardcode the node-name here. I think this is particular to my environment, since this should work.

Lines 12-15: Log a message.

Lines 20-26: Use the File resource to copy the DacPac from a subfolder in the $PackagePath to a known folder on the local machine. I did this because I couldn’t pass the drop-folder path into the Script resource – so I copy it to a known location using the File resource and can then “hard code” that location in my Script resources.

Line 28: This is the start of the script Resource for invoking sqlpackage.exe.

Line 30: Just return the name of the resource.

Line 31: Always return false – meaning that the Set-Script will always be run. You could have some check here if you didn’t want the script to execute for some specific condition.

Lines 32-36: This is the script that actually does the work – I create the command and then Invoke it, piping output to the verbose log for logging. I use “/a:Publish” to tell SqlPackage to execute the incremental changes on the database, using the DacPac as the source file (/sf) and targeting the database specified in the target connection string (/tcs).

Line 37: Invoking the DacPac is dependent on the DacPac being present, so I express the dependency.

The final resource in this script is also a Script resource – the Get- and Test-Scripts are self-explanatory. The Set-Script takes the SQL string I have in the script, writes it to a file (using sc – Set-Content) and then executes the file using sqlcmd.exe. This is specific to my environment, but shows that you can execute arbitrary SQL against a server fairly easily using the Script resource.
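As a sketch of that pattern (the SQL itself and the paths are placeholders, not the statements from my script):

```powershell
Script RunSqlScript {
    GetScript  = { @{ Name = "RunSqlScript" } }
    TestScript = { $false }   # always run; make sure the SQL itself is idempotent
    SetScript  = {
        $sql = @"
-- placeholder: any idempotent T-SQL you need to run against the server
PRINT 'configuring database';
"@
        # sc is the Set-Content alias: write the SQL to a file, then execute it
        sc -Path "c:\temp\setup.sql" -Value $sql
        & sqlcmd.exe -S localhost -E -i "c:\temp\setup.sql" | Write-Verbose
    }
}
```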

Line 73: When using DSC with RM, you need to compile the configuration (do this by invoking the Configuration) into mof files. Don’t call Start-DscConfiguration (which pushes the mof files to the target nodes for running the configuration) since RM will do this step. You can see how I use $applicationPath – this is the path that you specify when you create the vNext component (relative to a drop folder) – we’ll see later how to set this up. RM sets this parameter before it calls the script. Also, you need to specify the parameter that contains the configuration hash-table. In my case this is $configData, which you’ll see at the top of the configData script above. RM “executes” this script, so the parameter is in memory by the time the DSC script is executed.
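The end of the script then looks something like this (the configuration name here is hypothetical):

```powershell
# "compile" the configuration into mof files - RM pushes and runs them itself,
# so don't call Start-DscConfiguration here
DeployFabrikamDb -ConfigurationData $configData -PackagePath "$applicationPath"
```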

When working with DSC, you have to think about idempotency. In other words, the script must produce the same result every time you run it – no matter what the starting state is. Since deploying a DacPac to a database is already idempotent, I don’t have too much to worry about in this case, so that’s why the Test-Script for the DeployDacPac Script resource always returns false.

Deploying a Website using WebDeploy

You could be publishing your website out of Visual Studio. But don’t – seriously, don’t EVER do this. So you’re smart: you’ve got an automated build to compile your website. Well done! Now you could be deploying this site using xcopy. Don’t – primarily because managing configuration is hard to do using this method, and you usually end up deploying all sorts of files that you don’t actually require (like web.debug.config etc.). You should be using WebDeploy!

I’ve got a post about how to use WebDeploy with agent-based templates. What follows is how to deploy sites using WebDeploy in vNext templates (using PowerShell DSC). In a previous post I show how you can use DSC to ready a webserver for your application. Now we can look at what we need to do to actually deploy a site using WebDeploy. Here’s the script I use:

You’ll see some similarities to the database DSC script – getting nodes by role (“WebServer” this time instead of “SqlServer”), Log resources to log messages and the “compilation” command which passes in the $configData and $applicationPath.

Lines 20-28: I copy the entire FabrikamFiber.Web_package folder (containing the cmd, SetParameters and zip file) to a temp folder on the node.

Line 30: I use a Script Resource to do config replacement.

Lines 32-33: Always execute the Set-Script, and return the name of the resource when interrogating the target system.

Lines 34-47: The “guts” of this script – replacing the tokens in the SetParameters file with real values and then invoking WebDeploy.

Line 35: Set a parameter to the known local location of the SetParameters file.

Lines 37-40: Create a hash-table of key/value pairs that will be replaced in the SetParameters file. I have 2: the site name and the database connection string. You can see the familiar __ pre- and post-fix for the placeholders names – I can use this same package in agent-based deployments if I want to.

Line 42: read in the contents of the SetParameters file.

Lines 43-45: Replace the token placeholders with the actual values from the hash-table.

Line 46: overwrite the SetParameters file – it now has actual values instead of just placeholder values.

Lines 51-59: I use another Script resource to execute the cmd file (invoking WebDeploy).

Lines 64-90: This is optional – I include it here as a reference of how to ensure that the site is being monitored using Application Insights once it’s deployed.
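The token-replacement part of the Set-Script can be sketched like this (the site name and connection string values are placeholders; at release time they come from RM):

```powershell
# known local location of the SetParameters file (copied there by the File resource)
$paramFilePath = "c:\temp\Site\FabrikamFiber.Web.SetParameters.xml"

# values that will replace the __placeholders__ in the SetParameters file
$paramsToReplace = @{
    "__SiteName__"              = "FabrikamFiber"
    "__FabFiberExpressConStr__" = "data source=...;initial catalog=..."
}

# read the file, swap each token for its real value, then overwrite the file
$contents = Get-Content -Path $paramFilePath -Raw
foreach ($key in $paramsToReplace.Keys) {
    $contents = $contents.Replace($key, $paramsToReplace[$key])
}
Set-Content -Path $paramFilePath -Value $contents
```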

The Release

In order to run vNext (a.k.a. agent-less, a.k.a. DSC) deployments, you need to import your target nodes. Since vNext servers are agent-less, you don’t need to install anything on the target node. You just need to make sure you can run remote PowerShell commands against the node and have the username/password for doing so. When adding a new server, just type in the name of the machine and specify the remote port (5985 by default). This adds the server into RM as a “Standard” server. These servers always show their status as “Ready”, but this can be misleading since there is no agent. You can then compose your servers into “Standard Environments”. Next you’ll want to create a vNext Release Path (which specifies the environments you’re deploying to as well as who is responsible for approvals).
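Before composing the environment, it’s worth confirming that remote PowerShell actually works against each node (the machine name and credentials here are illustrative):

```powershell
# check that WinRM is listening on the default port
Test-WSMan -ComputerName fabfiberserver -Port 5985

# verify the credentials RM will use can actually remote on
$cred = Get-Credential
Invoke-Command -ComputerName fabfiberserver -Port 5985 -Credential $cred -ScriptBlock { hostname }
```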

vNext Components

In order to use the binaries and scripts we’ve created, we need to specify a vNext component in RM. Here’s how I specify the component:

All this is really doing is setting the value of the $packagePath (which I set to the root of the drop folder here). Also note how I only need a single component even though I have several scripts to invoke (as we’ll see next).

The vNext Template

I create a new vNext template. I select a vNext release path. I right-click the “Components” node in the toolbox and add in the vNext component I just created. Since I am deploying to (at least) 2 machines, I drag a “Parallel” activity onto the design surface. On the left of the parallel, I want scripts for my SQL servers. On the right, I want scripts for my webservers. Since I’ve already installed SQL on my SQL server, I am not going to use that script – I’ll just deploy my database model. On the webserver, I want to run the prerequisites script (to make sure IIS, Webdeploy, the MVC runtime and the MMA agent are all installed and correctly configured). Then I want to deploy my website using Webdeploy. So I drag on 3 “Deploy using PS/DSC” activities. I select the appropriate server and component from the “Server” and “Component” drop-downs respectively. I set the username/password for the identity that RM will use to remote onto the target nodes. Then I set the path to the scripts, relative to the root of the drop folder – which is the “Path to Package” in the component I specified (and becomes $applicationPath inside the DSC script). I also set the PSConfigurationPath to my configData.psd1 script. Finally I set UseCredSSP and UseHTTPS both to false and SkipCaCheck to true (you can vary these according to your environment).

Now I can trigger the release (either through the build or manually). Here’s what a successful run looks like and a snippet of one of the logs:

To Agent or Not To Agent?

Looking at the features and improvements to Release Management Update 3 and Update 4, it seems that the TFS product team are not really investing in agent-based deployments and templates any more. If you’re using agent-based deployments, it’s a good idea to start investing in DSC (or at the very least just plain ol’ PowerShell) so that you can use agent-less (vNext) deployments. As soon as I saw DSC capabilities in Update 3, I guessed this was the direction the product team would pursue, and Update 4 seems to confirm that guess. While there is a bit of a learning curve, this technology is very powerful and will ultimately lead to better deployments – which means better quality for your business and customers.

Happy deploying!

18 Comments

BigFan

October 27, 2014 13:48

First of all I want to say thanks for replying so quickly with this post on how to use DSC to deploy web applications. I like the use of the Script resource to accomplish what the Agent based components did, I have two questions for you though.

1. It seems like you have to hardcode the values for the SetParameters.xml file in the DSC configuration script. How would you accomplish deploying this application to different environments (Dev > UAT > PROD)? With the latest RM update, can you parameterize these values in the RM component?

2. I notice that you always use $false on the TestScript of the Script resource. Is there any risk to these configurations running more than once? What if a system's DSC LCM had a ConfigurationMode set to ApplyAndAutoCorrect?

1. The Script resource has some limitations - so while RM does allow you to pass in parameters (see the Custom Configuration section at the bottom of the "Deploy using PS/DSC" activity), you will have to use the Script resource "string hack" to substitute the parameter value. I did briefly look at implementing a custom WebDeploy resource, but haven't followed that through.

2. I mention that both my dacpac and WebDeploy deployments are idempotent - that's why I can (in this case) safely return $false for the Test-Scripts. Of course if you were doing something that wasn't idempotent, you'd have to have more sophisticated logic in your Test-Script to see whether or not to run the Set-Script.

Thanks for the great questions - I definitely want to spend more time solving 1 - if I come up with something, I'll post it!

Sam

January 14, 2015 16:38

I have a quick question about the DSC/vNext release management deployments. The docs say:

"Environments without deployment agents are called vNext environments. You can only use these vNext environments with vNext Components, vNext Release Templates and vNext Release Paths to deploy without deployment agents."

Does this mean I can't use this technology with an older project (for example, an older .net MVC4 website?). I'm trying to figure this out before I really dive into the docs.

vNext in Release Management refers to the Release Management components - agent-based or agentless. The agentless components are called "vNext". This is just the deployment mechanism - you can use it to deploy whatever you want. So you'd be able to deploy MVC4/5/6 applications!

Good luck!

Arvind

March 25, 2015 17:15

Hi Colin,

Can we call PowerShell scripts directly in the Script resource in the DSC configuration? We are currently redesigning a deployment framework and would like to use some of the scripts which are already in place. So, just wondering if we could reuse them in DSC configurations.

@Arvind - you could call a script from within a Script resource - though Release Manager lets you call "plain PowerShell" directly too. In fact I would suggest that as a first attempt. Call the DSC script to get your environment into as close a state to desired as you can get - then invoke the "plain" script as another step in the workflow. That way you get to use DSC and reuse your existing scripts.

Good luck!

Arvind

March 27, 2015 13:56

Thanks for the reply Colin!

As suggested, I tried to run the PowerShell script from the Script resource, but I get the error mentioned below.

PowerShell provider MSFT_ScriptResource failed to execute Set-TargetResource functionality with error message: A command that prompts the user failed because the host program or the command type does not support user interaction. Try a host program that supports user interaction, such as the Windows PowerShell Console or Windows PowerShell ISE, and remove prompt-related commands from command types that do not support user interaction, such as Windows PowerShell workflows. CategoryInfo : InvalidOperation: (:) [], CimException FullyQualifiedErrorId : ProviderOperationExecutionFailure PSComputerName : bld-app

I tried a few options to fix it, but they didn't work. It works well from the command line though.

Arvind

March 30, 2015 13:56

Hi Colin,

I was also trying the option of running the PS script directly in Release Manager, which you had mentioned, as I had issues running the PS scripts within the Script resource (mentioned in my previous post). But in the vNext Release template, I don't get to see any of the tasks/actions which are listed in the agent-based Release template. So, in this case how do I run a PS script directly?

Some commands will work from the command line, but won't work when executed remotely. One example (if I remember correctly) is Write-Host. It sees this as an "interactive" command and won't allow it to be executed. Try to make sure that you don't have any "interactive" commands in your script.

As for your second comment, pretty much the only Task you can run is a PowerShell command. The same activity is used for both DSC script and "plain" PowerShell - the task is smart enough to figure out which type of script it is. So just add another activity and set the script path to your plain PowerShell script and it should run just like that.

Good luck!

Curt Zarger

April 27, 2015 18:28

Colin,

Great post. It's really helping me get my head around all of this changing technology and architecture. ... On that note, I'm working to understand the strategic technologies to use as I'm building up a new release capability. I've been hearing that MSBuild, PublishProfiles and WebDeploy are not strategic, though at a recent VSLive event the clear strategic replacements could not yet be described due to Microsoft restrictions at that time. Can you give me your thoughts on the current and future build/deploy/release picture? thx, Curt

As to "strategic-ness", I can't imagine why anyone would say that MSBuild, PublishProfiles and WebDeploy are "not strategic". Have you tried to publish to Azure lately? You may do it from a GUI, but the underlying technology is WebDeploy. I don't think that is going to change any time soon, so if you're investing in IIS (and/or) Azure Websites, then you want to stick with these technologies.

The MSBuild tasks for Web Projects simply allow you to create WebDeploy packages out-the-box. I don't think anyone should be customizing MSBuild tasks - especially not with the new Build Engine that was released at //build. Instead, build tasks can invoke the MSBuild tasks - and you can do customizations in pre- and post-build scripts in the build flow.

As for PublishProfiles, they are also deeply used in deploying to IIS and Azure, even in VS 2015. No reason that they are going to disappear any time soon. By creating a PublishProfile that puts placeholders into the config file, which are later filled in with environment-specific values, you get a single build package that can be moved from environment to environment in your release pipeline, and that's a good thing!

I'd love to know the context of the VSLive event that you refer to - can you post the link to the recording?

Ahmed

June 9, 2015 20:03

I'm using RM but have divided my DSC steps into different files. What I noticed is that it always tries to clean the temp folder (DTdownloads), and what happens is I get the wmiprvse process locking my files. Any idea how I can avoid that?

@Ahmed - the only thing I can think of is to make sure your tasks are sequential and not parallel.

Tony Castro

September 22, 2015 15:38

I use a mix of MSBuild, WebDeploy and PowerShell for my vNext component builds/releases. This allowed me to layer Release Management on top of my existing WebDeploy configuration (DeclareParams, SetParams, etc.) and also integrate it into TFS. This reduced my scripting complexity in Release Management considerably.

The first part is to have a TFS build definition that can build and package (create a WebDeploy deployment archive) along with any SetParams files and supporting MSBuild/PowerShell scripts. This allows you to build the code once and deploy it to the various environments/stages in the Release Path. By having the build definition create the deployment archive you end up with that archive (environment agnostic) in the drop location (as well as supporting files) where Release Management can pick it up.

The Release template can then simply contain a minimal set of parameters (e.g. environment) that tell WebDeploy what SetParams file to use. The deploy PowerShell script simply passes minimal arguments to WebDeploy to do the work (via the sync action). All environment settings still reside in a SetParams file (one per environment, e.g. prod.SetParams.xml). The bonus is that the logs in Release Management will contain all of the output from WebDeploy. You'll know exactly what files were updated.

Sounds like you've got a good thing going! I personally prefer the builds/scripts to have tokens and let the deployment tool (RM or Octopus etc.) fill in the values. That way I can change values in the deployment tool rather than having to commit another code change.

Let's imagine that you have a db connection string in prod.SetParams.xml. If you change the connection string, you'd have to change the file, commit and do a new build. If the value is in the deployment tool, you just change the value in the deployment pipeline (you can do this in RM or Octopus deploy for example). In my mind, the build shouldn't care about the values that you need for deployment. Different strokes for different folks I guess.

As long as your process is working and is automated, you've solved the problem! The method doesn't really matter that much!

Hi Colin, I really do appreciate your posts and they are a great resource for my work. I'm currently finishing a pattern for CI and Release Management with BizTalk installations. What I stumbled on, as you did, is the behaviour of the Script resource. I'm really in need of injecting parameters from the configData into the script. I'm using BTDF (BizTalk Deployment Framework), which is built on MSBuild tasks. What I discovered is that if you use here-strings and the $($packagePath) opener you can inject the parameter value into the script. A reminder: don't use the { } on SetScript, see below. If you are using local parameters in the here-string you have to escape them with a backtick `. This is just a tip and something for me to give back for all the insights about Release and DSC.