In a recent post of mine, I quickly introduced Sonar, how we can integrate it with TFS, and a few tips for its installation. Now it is time to dive into the build process and add the logic we need to make TFS builds use Sonar conveniently.

Are you really sure?

Using Sonar as I'm proposing will offload some processing from TFS to Sonar, which means TFS loses control over certain things: unit tests are no longer launched by TFS. You *have* to use the Gallio test runner because other test report formats are not *yet* supported. The best bet is then to let Sonar launch tests through Gallio, which may have side effects!

Code analysis can no longer be configured from your project settings, nor from the build settings (you don't want to launch it twice), so deactivate it in your builds (pick the Never option) and let Sonar launch FxCop instead.

The tooling is not always up to date: MsTest with VS 2012 was not working when it came out, which means you can be stuck if you upgrade too early. We are in the open source world; we have no guarantee it will work seamlessly with other technologies, unless you contract for commercial support with SonarSource.

You're still reading? Ok, you're a pioneer now; you'll need a good Swiss army knife and some will to make things work in your environment. Hopefully you won't regret it, because it will pay off.

Build server tooling

Some tools and scripts need to be deployed onto every build server. We aim at xcopy deployment, so I advise opting for a similar folder structure that you copy from server to server. I pompously named the parent folder "Sonar.Net", and here are its contents:

I cheated a bit with some products: I took their installation from the "Program Files" folder on my local machine and copied them into this structure. It just worked for me.

I prefer having a private version of Java running the Sonar analysis. For this, you can just customize the sonar-runner.cmd file and update PATH and JAVA_HOME to point to the subfolder where you uncompressed your JDK (not JRE). Be sure to use absolute paths here!
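As a sketch, the top of a customized sonar-runner.cmd could look like this (the folder and subfolder names are assumptions based on my layout; adjust them to yours):

```
rem private JDK dedicated to the Sonar analysis -- absolute paths!
set JAVA_HOME=D:\Sonar.Net\jdk1.7.0
set PATH=%JAVA_HOME%\bin;%PATH%
```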

The sonar properties

All your project parameters for the sonar analysis are in a sonar.properties file. You should place this file next to the solution file you want to analyze. I see two options for managing those files:

Create a sonar.properties file for each .NET solution and add it to source control

Sounds reasonable if you don’t have too many projects

Generate them automatically!

But this requires some build customization

The good news is that I've written a build workflow Sonar activity that will generate the properties for you, or at least help you generate them. It is very simple: it takes a template file and replaces a few values for you. The path to the template file must be configured in the build process workflow.

Here is the template I’ve set up for using with TFS, a sample sonar-properties.template file:
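The exact content depends on the plugins you use; a minimal sketch might look like the following (the %...% placeholder names here are purely illustrative, the real ones are described on the activity documentation page):

```
# sonar-properties.template (illustrative sketch)
sonar.projectKey=%BUILD_DEFINITION_NAME%
sonar.projectName=%BUILD_DEFINITION_NAME%
sonar.projectVersion=%BUILD_NUMBER%
sonar.language=cs
sonar.dotnet.visualstudio.solution.file=%SOLUTION_FILE%
# TFS forces the output of all projects into a common "Binaries" folder:
sonar.dotnet.assemblies=%BINARIES_FOLDER%/*.dll
sonar.dotnet.test.assemblies=%BINARIES_FOLDER%/*.Tests.dll
```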

As you can see, there are values that will be replaced at run time. You can see their description on the activity documentation page. The trick with TFS builds is that the output folder for projects is forced to a “Binaries” folder outside the scope of your sources! This template assumes it is running from a TFS build.

It should work locally

To test all these tools, fortunately, you don't have to run builds. First, create such a properties file based on your values for a project you want to test. Make sure you comment out the sonar.dotnet.assemblies and sonar.dotnet.test.assemblies properties, since the Sonar C# Ecosystem guesses them correctly when TFS is not overriding the output paths. Then, in a command prompt, after having compiled the project, move to your project folder. From there, invoke the sonar.net-runner.cmd file. It should work.
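The local run boils down to something like this (paths are hypothetical):

```
rem compile first, then run the analysis from the folder
rem containing the .sln and its sonar.properties file
cd /d C:\src\MySolution
D:\Sonar.Net\sonar.net-runner.cmd
```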

Once this works for you, you are close to making it run in TFS builds, because all we need now is to launch this command line from our builds, and optionally generate the properties dynamically.

Modifying your build template

For this, you need to get the latest Community TFS Build Extensions and deploy them into your build controller's custom assemblies source control folder. See my guide here if you're not at ease with setting up the solution for editing build templates. You may clone the DefaultTemplate.xaml and start editing it. Once you're ready to inject the Sonar activity in your build template, locate the Run On Agent => Try Compile, Test, and Associate Changesets and Work Items activity; it contains the Sequence as illustrated below. You should be able to drag the "Sonar" activity as indicated.

A nice and very simple idea is to add a boolean workflow Argument named “RunSonarAnalysis”.

Then encapsulate the Sonar activity into an If activity.

If you add the proper Metadata (locate the Metadata Argument, edit it, and add the RunSonarAnalysis argument in the list) for this parameter, you’ll be able to control the Sonar execution from your build definition! That is the start of a real integration.

Finally, edit the Sonar activity properties, and you’re all set!

Now you can decide to run Sonar from your build definitions!

You may add custom parameters (with Metadata) to your workflow and pass them directly to the Sonar activity. This would allow you to pass values such as "active" or "skip" to enable or disable the plugins of your choice, on a per-project basis.

It is not as complicated as it sounds to run Sonar from TFS builds. There are things that can be done better, so stay tuned for future improvements with this activity!

I recently posted about setting up a solution for editing build workflows (with TFS 2012). Today I'm going to write directly about testing your custom activities, because I think good guides about writing them are already out there, and there (even though they talk about TFS 2010, the logic is quite the same). We'll have a look at how we can test activities without actually starting a real build.

Classic testing approach

The approach is very classic: set up a test context, exercise the test, and check the results (and clean up).

The tests rely on the ability, given by Workflow Foundation, to host your own workflow engine. So we'll mainly need two things:

Create the right workflow context to emulate the behavior of a build (only what's needed for the test, of course)

Create an instance of the activity, passing necessary parameters and stubs to fulfill the test

During our test, the workflow engine will run the activity. The activity will use objects and values from its parameters and interact with the context. Then we can add checks about what has actually happened. Ready?

Workflow activities testing tips

Creating the activity and executing it

You can instantiate a Workflow activity with a classic new statement. Literal parameters can be passed in the constructor or assigned through an object initializer (literals here are value types and Strings). All other objects must be passed in a Dictionary&lt;String, Object&gt; structure at invocation time.

// constants (literals)
var activity = new Sonar
{
    // this is a String (a literal)
    SonarRunnerPath = SonarRunnerPath,
    // this is a boolean (a literal as well)
    FailBuildOnError = FailBuildOnError,
    GeneratePropertiesIfMissing = GeneratePropertiesIfMissing,
    SonarPropertiesTemplatePath = TemplatePropertiesPath,
    FailBuildOnAlert = FailBuildOnAlert,
    // StringList is not a workflow literal
    // the following line will cause an exception at run time
    ProjectsToAnalyze = new StringList("dummy.sln")
};

Here all values are booleans or strings, except one StringList that will cause an error at run time (so we must remove it). Here’s how to invoke the activity (actually a workflow composed of one activity) and pass the StringList as an argument:

// object variables
var parameters = new Dictionary<string, object>
{
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

// the workflow invoker; our workflow is composed of only one activity!
WorkflowInvoker invoker = new WorkflowInvoker(activity);

// executes the activity
invoker.Invoke(parameters);

Tracking build messages

You may want to check what your activity is logging, you know, when you call the TrackBuildMessage method or use the WriteBuildMessage (or Warning, or Error) activity. To do this you need to set up a recorder, or more exactly a TrackingParticipant. Here is a TrackingParticipant-derived class that is specialized in recording build messages:
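The original class is shown on the blog; as a rough sketch of the idea (the record filtering below is my assumption, not the article's exact code), build messages surface to tracking participants as CustomTrackingRecord instances:

```csharp
using System;
using System.Activities.Tracking;
using System.Text;

// records every build message the workflow emits, for later inspection
public class BuildMessageTrackingParticipant : TrackingParticipant
{
    private readonly StringBuilder _log = new StringBuilder();

    protected override void Track(TrackingRecord record, TimeSpan timeout)
    {
        // build messages carry their text in a "Message" data entry
        var custom = record as CustomTrackingRecord;
        if (custom != null && custom.Data.ContainsKey("Message"))
        {
            _log.AppendLine(custom.Data["Message"].ToString());
        }
    }

    public override string ToString()
    {
        return _log.ToString();
    }
}
```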

To use it, all you need is to instantiate it and pass the instance to the workflow invoker:

var workflowLogger = new BuildMessageTrackingParticipant();
invoker.Extensions.Add(workflowLogger);

After the test, you can get the build "log" by calling the .ToString() method on the workflowLogger instance.

Setting up a custom IBuildDetail instance

During builds, activities regularly get the "build details", an IBuildDetail instance that contains lots of useful contextual data. This instance comes from the workflow context, and activities get it by using code that looks like the following:
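A sketch of the typical pattern, using the GetExtension method of the activity context:

```csharp
protected override void Execute(CodeActivityContext context)
{
    // the build details are published to activities as a workflow extension
    IBuildDetail buildDetail = context.GetExtension<IBuildDetail>();
    string definitionName = buildDetail.BuildDefinition.Name;
    // ...
}
```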

Thankfully, it is an interface, so it is very easy to stub. I like to use the Moq mocking framework because it is very easy to use (not very powerful, but perfect for classic needs). Now we need to create a stub out of the IBuildDetail interface, customize it for our needs, and inject it into the workflow "context". I'll actually assemble multiple stubs together, because I also need to set up the name of the build definition for my activity (yes, the activity uses the current build definition name!):
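A possible sketch with Moq (the member names are from the TFS client API; the values are of course test data):

```csharp
// stub the build definition, since the activity reads its name
var buildDefinition = new Mock<IBuildDefinition>();
buildDefinition.SetupGet(d => d.Name).Returns("MyBuildDefinition");

// stub the build details and wire the definition stub into it
var buildDetail = new Mock<IBuildDetail>();
buildDetail.SetupGet(b => b.BuildDefinition).Returns(buildDefinition.Object);

// make GetExtension<IBuildDetail>() return our stub during the test
invoker.Extensions.Add(buildDetail.Object);
```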

Now, the activity “thinks” it is using real “build details” from the build, but during tests we are using a fake object, with just the necessary values for the test to pass. So this is actually a pure classic stubbing scenario, no more.

Passing TFS complex objects

Unfortunately, not all of the objects and classes we need in build activities are interfaces or pure virtual classes, which are easy to stub. In the case of concrete classes such as Workspace, VersionControlServer, WorkItemStore, or WorkItemType, you have to use more powerful stubbing frameworks such as Microsoft Fakes or Typemock.

Let's use Fakes, since it is available in the Visual Studio Premium edition.

First, locate the assembly that our target type belongs to. The Workspace class belongs to the Microsoft.TeamFoundation.VersionControl.Client assembly. Right-click it in the References of your project and add a Fakes assembly:

Fakes processes all types and members of this assembly and dynamically generates a new reference which contains "empty" objects, with all overridable properties and members, compatible with the original types. All types are prefixed with Shim or Stub, and methods include the types of their signatures in their names. Here is an example that illustrates how to set up a Workspace "Shim". When we call the GetLocalItemForServerItem method, it will return the value we want, that is, LocalSolutionPath:

// our workspace stub
ShimWorkspace workspace = new ShimWorkspace()
{
    // we override String GetLocalItemForServerItem(String)
    // and have it return a value of our own for the test
    GetLocalItemForServerItemString = (s) => LocalSolutionPath
};

To pass an actual Workspace-compatible object to our activity as a parameter, use its .Instance property. Since it is not a workflow literal, let's use the Dictionary like we did before:

// object variables
var parameters = new Dictionary<string, object>
{
    { "BuildWorkspace", workspace.Instance },
    { "ProjectsToAnalyze", new StringList("dummy.sln") }
};

Ok, we have covered a few techniques that should allow you to test most activities. When I'm satisfied with the tests I'm currently writing, I'll publish them in the Community TFS Build Extensions project. So keep an eye on them if you're interested in a full running piece of code; sorry to make you wait!

Hi! Today, I’ll briefly introduce Sonar (recently renamed SonarQube) and explain a few tips on how to deploy it on Windows, having in mind to integrate it with TFS just after.

Sonar in a nutshell

Sonar is mainly a Web portal that stores everything about your builds and helps you navigate all this data. Quality metrics are gathered by plugins for various tools (which may not come with Sonar) into a central database. The Web portal is composed of customizable dashboards, made out of customizable widgets, which can display data in various forms, with the ability to easily compare with previous builds, or see the progression through the last days or months. A drill-down logic starting from any metric (such as lines of code, violations, unit tests and coverage, etc.) will allow you to pinpoint the projects, files, and lines of code that are at the origin of their values. Various plugins (there are commercial ones) are available: they can group projects and aggregate their data, or show stats per developer, for example. You can define quality profiles and select the rules that you want to apply to your projects (each rule is tied to a plugin), and create alerts when certain conditions are met (too many violations, or coverage too low, for the simplest).

Why Sonar and TFS?

Because Sonar is a great complement to TFS. It is not always easy to get the exact report we want: you'll find Reporting Services and Excel reports which have to be set up with date ranges and solution filters, so you may have spent quite some time configuring a SharePoint dashboard. You can't easily set thresholds that fail your builds according to various metric conditions. I mean, even if all of this is possible because TFS is highly customizable, it is not all centralized in a single fully featured UI, and it requires using various products or technologies. Builds do not compare to each other (only the duration, and the GUI is fixed). While Excel shines at connecting to the TFS warehouse or cube, you need to be an Excel dude in order to navigate, slice, aggregate, and compare data about build results. Third-party tools don't store their data in the build reports in a structured way, so you won't get their metrics directly in the cube. While all this is possible with TFS, really, it is not there as easily as we would want, and that is why Sonar is becoming so popular in the .NET world (and not especially with TFS).

Keep in mind that TFS is about so much more than Sonar. TFS links Work Items to code, allowing you to get insight into the real semantics of your projects (the influence of bugs and requests, for example). Sonar focuses *only* on the quality of your code, instantly and over time.

So we all know that Sonar is a Java application, so it is evil by essence (just kidding), but it proves useful even in the .NET world. Thanks to the hard work of a few pioneers who wrote Java plugins that launch our favorite everyday tools (FxCop, StyleCop, Gendarme) and test frameworks (with Gallio and various coverage technologies), there it is, waiting for us.

The plan to integrate Sonar

For simplicity’s sake, I’ve not represented TFS components such as build controllers, agents, etc. What is important here, is that the TFS build calls something named “Sonar runner”. This Sonar runner launches a JVM with a bootstrap that launches each plugin you have configured in your Sonar server. Each Sonar plugin then launches the appropriate native tools, gets their results and publishes them into the Sonar server. The data is stored in the Sonar database.

Nevertheless, I will give you a few tips and sample configuration blocks that will help you. Naturally, I installed Sonar against a SQL Server 2008 R2 database, so create an empty database and configure the server's sonar.properties this way:
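Something along these lines (server, database, and account names are examples):

```
# sonar.properties: connect Sonar to the SQL Server database through jTDS
sonar.jdbc.url=jdbc:jtds:sqlserver://localhost/sonar;SelectMethod=Cursor
sonar.jdbc.driverClassName=net.sourceforge.jtds.jdbc.Driver
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar
```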

You'll need a jTDS JDBC driver for using SQL Server, which is included in the Sonar server distribution (cool!), in the extensions\jdbc-driver\mssql folder. I'm not used to creating SQL Server security accounts; since I always go with integrated security, I find that managing passwords is a prehistoric and insecure practice, but I guess I have no choice.

The LDAP plugin works well, you can also get the groups your users belong to in the Active Directory.

Here is the configuration that I used with my AD (I spent a few hours making it work, so I hope it will help):
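In the spirit of the original (domain names, DNs, and the service account are placeholders for your own values):

```
# sonar.properties: delegate authentication to Active Directory
sonar.security.realm=LDAP
ldap.url=ldap://mydc.mydomain.local:389
ldap.bindDn=CN=sonarsvc,OU=Services,DC=mydomain,DC=local
ldap.bindPassword=secret
ldap.user.baseDn=DC=mydomain,DC=local
ldap.user.request=(&(objectClass=user)(sAMAccountName={login}))
# group mapping, to retrieve the AD groups each user belongs to
ldap.group.baseDn=DC=mydomain,DC=local
ldap.group.request=(&(objectClass=group)(member={dn}))
```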

or How to edit a TFS 2012 build process template

A few weeks ago, I blogged about using VB6 with TFS 2012. The build process I proposed relied on an MSBuild script that you had to check in to TFS. This script was responsible for calling VB6 on the command line and generating your VB executables. I just felt like something was left unexplored there, and I wanted to provide a slightly more sophisticated alternative for building your VB6 apps. So let's customize the build workflow for the sake of our VB6 projects!

This post is also a tutorial for editing a TFS 2012 build process template!

The plan

We will modify the Default Template in order to make it compatible with VB6 projects. In the Solutions to Build process parameter, we want to be able to pass a list of .NET solutions (.sln files), but also .vbp files. We will rely on some external activities to invoke VB6 compilation commands.

We'll try to support the Outputs clean option of regular builds, because it is useful for continuous integration.

Duplicate the Default Template and rename it to something like DefaultTemplateVb6. If you want to be able to edit both files at once, follow this procedure I wrote, which will avoid unwanted compile errors.

Build process changes

Open your build process template and go to Run On Agent => Try Compile, Test, and Associate Changesets and Work Items. Now look for the Try to Compile the Project TryCatch activity; you should double-click on its icon to "focus" the Workflow editor on this subcontent – it's just more comfortable.

We are here at the heart of the loop that calls MSBuild on every project to build. We will need to store the result of the last VB6 compilation. For this, we need a variable in the scope of the Compile the Project block. Add a vbReturnCode variable as follows:

We want to run MSBuild on regular projects, and launch VB6 on .vbp projects. So we’ll add an If activity in order to filter VB6 projects. Insert it just before the Run MSBuild for Project activity.

Add a Sequence in the Then block, and drag the existing MSBuild activity into the Else block. Actually, recreate the following piece of workflow:

Remember to add the custom activities in the toolbox in order to use them in the workflow (right-click and Choose Items…).

Cleaning projects

In order to support the partial cleaning of builds (Clean Workspace is set to Output), we have to filter the VB6 projects out of the process. VB6 executables are produced directly in the BinariesFolder, which is emptied automatically. The following actions will prevent nasty errors when cleaning outputs (because our .vbp files aren't MSBuild files).

First, navigate to Run On Agent => Initialize Workspace, look for the “Clean Project” Sequence, then encapsulate the If File.Exists(Project) with the same filter as we did previously:

Finally, set the process parameter Solution Specific Build Outputs to True to avoid a big mess in the Drop folder. And voilà, you can now mix and build regular solutions or VB6 projects with the same build definition!

If you followed my small guide, or in any case if you have multiple build process templates in a single solution, you may have encountered this error when compiling with Visual Studio:

“obj\Debug\TfsBuild_Process_BeforeInitializeComponentHelper.txt” was specified more than once in the “Resources” parameter. Duplicate items are not supported by the “Resources” parameter.

Diagnostic

This is actually easy to understand. When you add a Xaml workflow file into your solution, Visual Studio sets its properties to be part of the compilation:

Visual Studio generates code and resources out of the Xaml language file, and each Xaml process template has its own class name and namespace (full name). The problem is that, generally, TFS build process template files have the same full class names. The consequence is that there are collisions. Generally, this class name is TfsBuild.Process (hence the name of the TfsBuild_Process_BeforeInitializeComponentHelper.txt resource), because we often duplicate existing build templates.

Solution

I've seen colleagues changing the Build Action, setting it to None:

But then you lose the checks performed by the compile process. There is something better to do: simply rename the class or namespace inside the Xaml file:

Open the file with the XML Editor (right-click => Open With… => XML (Text) Editor)

Look for the x:Class attribute on the first line

Rename the namespace to something meaningful to you (ex: TfsBuild.Process => TfsBuildUpgrade.Process)

On the same line there is another mention of the namespace; rename it there too:

And the build should still work. This way you can put all your build files in the same solution, which is more comfortable! I consider it good practice to fix every new build process template I create. Hope this helps!

I've recently been working for a client with lots of VB6 projects. The fun part is that we wanted to migrate from VSS to TFS 2012. Although VB6 is no longer supported by Microsoft, there is no reason why TFS would not work for VB6; you can host Java in TFS, right? If you have VB6 projects and want to plug them into TFS and have them built from a continuous integration perspective, then I hope this small guide will help you.

What to install

First, check that you have VB6 with SP6, and the mouse wheel fix as well. I won't spend more time here since you're already using it.

You’ll need to install the Visual Studio 2012 edition of your choice, with the latest Update (at this time it is Update 2). Then the famous TFS Power Tools, which add nice check-in policies (and more). Finally, you’ll need the MSSCCI Provider for Visual Studio 2012, 32-bit version or 64-bit version.

If you still have Visual SourceSafe around

Warning: the MSSCCI provider reroutes VB6 source control interactions to the Team Foundation Server source control. To connect back to VSS, you need to perform some registry operations. Fortunately, small utilities will do that for you, by listing all the MSSCCI providers available on your machine and letting you choose which one is active. So you'll be able to switch back and forth easily between VSS and TFS. This one worked for me => SCPSelector.exe.

SCPSelector in action

Unwanted prettify options for VB6

Visual Studio 2012 *doesn’t know* about VB6, it knows about VB.NET!

When you are merging files, you don’t want VS 2012 to make assumptions regarding your syntax, and even less *modify your VB6 code*. Make sure you uncheck those options in the TOOLS => Options => Text Editor => Basic => VB Specific menu:

How to map the sources

Local workspaces are great, but the MSSCCI provider is not happy with them. You’ll have to use the traditional server workspaces. Well, it’s not a big deal.

Ok, so let's create server workspaces and map our VB6 sources from TFS. Now, I want to develop with VB6, but when I open the project with VB6, I get asked to add my project to TFS, doh! Actually, I want my users and me to be able to open any VB6 project as smoothly as possible. To achieve that, you have to edit the MSSCCPRJ.SCC files (or create them); they contain the necessary MSSCCI data to connect to the proper source control. The bad news is that you can't share those files! They are specific to your login and your workspace, so adding them to source control is useless!

VbTfsBinding will do the work for you

I wrote a small utility that will generate all those files for you. Copy it to the root of your workspace and it will generate a MSSCCPRJ.SCC file for every .vbp file in your workspace. Now you can just open any VB6 project in your workspace; you should not be annoyed by any configuration message box.

To share the tool, you can include it in TFS source control, in a subfolder at the root of your workspace (or branch), and add a .cmd file that changes the current directory and launches "VbTfsBinding.exe /force".
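A minimal sketch of such a launcher (the folder layout is an assumption):

```
rem launch VbTfsBinding from the workspace root, whatever the
rem current directory is when the user double-clicks this file
cd /d "%~dp0.."
Tools\VbTfsBinding.exe /force
```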

Ok, now the basic source control features of TFS are usable directly from VB6, but I would advise always checking in from VS 2012. This allows you to make sure you don't forget files in your changesets, you can review all your pending changes at a glance, and I feel more secure that way.

Building your VB6 executables

Now the fun part. Our goal is to call VB6 on as many VB6 projects as we want to build. The command line is:

Vb6.exe /m Projects.vbg /out Projects.vblog

Where Projects.vbg is a project group file which contains the list of projects we want to build.

Let's follow the path of the lazy: use a simple MsBuild .proj file to encapsulate the VB6 compilation logic, and rely on the default DefaultTemplate.xaml of TFS to do the rest.

First, prepare your VB6 group file and check it in next to your projects in source control (paths are relative). You can check the compilation with VB6 on your machine.

Then, add the following MsBuild file next to the .vbg file; let's call it Projects.proj:
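The original file is shown on the blog; here is a minimal sketch of the idea (the VB6 path is an assumption for a default install):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Projects.proj: wraps the VB6 command line in an MSBuild target
     so the default TFS build template can "compile" the group file -->
<Project DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Vb6Exe>C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.EXE</Vb6Exe>
  </PropertyGroup>
  <Target Name="Build">
    <Exec Command="&quot;$(Vb6Exe)&quot; /m Projects.vbg /out Projects.vblog" />
  </Target>
</Project>
```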

In my previous post, we had a look at the problem: how to merge sibling branches with TFS while minimizing the number of conflicts. Yes, the number of conflicts you'll get depends on the "technique" you'll be using:

TFS baseless

Shelvesets

Other tools?

All your base are belong to TFS

But first, let's answer the real question: what exactly should be the base for our 3-way merge?

Let me recap the scenario:

branch dev A from Main

Main evolves

branch dev B from Main

Main evolves

Both dev A and dev B have evolved as well

As a best practice, we integrate latest Main content into dev A and dev B

Now we want to test dev A and dev B with a single test campaign, we want to merge them all together, but leave Main intact

branch QA from Main, from the version that has been merged into dev A and dev B

The base we need is the latest version of Main that has been merged into dev A and dev B. You must have merged the same version of Main into both branches, of course. The QA branch needs to be branched from the same version of Main as well. These conditions are common practice and should not be a problem.

Here the base is not the most recent common ancestor (or it depends on what you call an ancestor). It is easy to understand: I want to merge "the diff between latest Main and dev B" into dev A. And dev A's evolutions must be compared to the latest Main version that has been merged as well.

External tools

It is not possible to choose your base when you merge with TFS – btw, I'd be curious to know which VCS lets you choose a custom base when merging.

So let's perform our merge "outside of TFS". Is that bad? In the end, you won't have any merge history between those branches, but do we really need that? What looks important to me is to keep the dev history on the dev branches for a little while, for reference, and that the QA branch's future merge into Main remains easy.

3-way folder merge procedure

Use a local workspace that maps the QA branch. Also map dev A, dev B, and the Main branch in any workspace (at the version you merged into dev A and dev B, in case Main has evolved further).

Merge dev A into QA with a baseless merge (easy when using VS 2012 and TFS 2012, remember the last post?). Take the Source version for every conflict (an easy merge, isn't it?); you can select all conflicts and choose that option.

KDiff3 is ugly, but the best merge tool I know at the moment. It is just quite clever and has nice features:

Note that you will also lose the renames during the process, which will break the history of the renamed files. You can perform the renames in the Source Control Explorer if you like (do this before resolving the merge, and rescan afterwards).

When finished, the local workspace (new with TFS 2012) is your friend: it will detect what has been changed, added, and deleted in the Pending Changes window:

The final tradeoff is:

You have fewer conflicts than when using TFS (even with a baseless merge, as explained in the previous post)

You break the chain of history, partially

In my eyes dev history is not very important (I'd be glad to debate this); I mean, not as important as maintenance history!

If you have a large code base to merge, it should be worth it! Happy merging!

or How to merge sibling branches with as few conflicts as possible

In this post series I'll propose a simple procedure to merge feature branches with each other without merging into Main. This is particularly useful when you want to test the features all together (or by groups of features) without yet bringing all the content onto the Main branch. If you have a large code base, you also want to avoid conflicts as much as possible.

Team Foundation Version Control (TFVC) is a great version control system as long as you respect the branching and merging guide produced by the ALM Rangers. You can handle pretty complex situations and have branching trees such as this:

TFS baseless merges (really baseless?)

More on the merge command here. That will create a merge relationship between two branches with no obvious connection (read: parent-child). Once the merge relationship is established, subsequent merges are managed by the TFS UI (Visual Studio) and history engine. Good.

But still, this kind of merge is not very satisfactory, because TFS is not very good at picking the most recent common ancestor "in a clever way". Don't laugh too fast: Git is not very clever either at picking the best base.

Here is the best scenario you can get with TFS when merging sibling branches: the base is the origin of the branch you are merging from. When merging with the command line, you should use the /version parameter to select only the wanted changesets; don't take the changeset that created the branch, or the whole branch will be a conflict. Take only the changesets you are interested in, as explained by Buck Hodges here. In this precise case (schema below), I selected all the changesets but the one that created the branch dev B, in order to merge its contents into dev A.
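For example (the branch paths and changeset range are hypothetical):

```
rem merge only changesets C1001 through C1040 of dev B into dev A,
rem leaving out the changeset that created the branch
tf merge /recursive /baseless /version:C1001~C1040 $/Team/DevB $/Team/DevA
```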

I’m lying, the UI can do it!

Ok, sorry, but since this feature is new in TFS 2012, don't blame me too much if I put a bit of TFS history in my article!

So with TFS 2012, it is now very easy to perform baseless merges in the UI. You can edit manually the target branch in the merge wizard. Then choose to pick the changesets you want:

Then select the changesets you want to merge (without the branch creation):

Made with Brian Keller's VS 2012 VM

Using Shelvesets

The trick is to use the TFS Power Tools to unshelve your content to a different branch. The command will "translate" server paths to another location (which you must have mapped in your workspace):
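The command looks like this (server paths and the shelveset name are examples):

```
rem re-apply the shelveset's changes onto dev A instead of dev B
tfpt unshelve /migrate /source:"$/Team/DevB" /target:"$/Team/DevA" "MyFix"
```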

They are very handy, but don't expect a top-notch merging experience with regard to the conflicts that are generated. Before the TFS 2010 SP1 baseless merge improvements, they were king for moving fixes from branch to branch. Now that baseless merges have improved, I'm not so sure; I go "baseless" more and more.

In the next post I'll talk about a solution. There is no magic: we'll use external tools to achieve our goal. Till then, just think: what base is the best in such a case?

TFS 2010 and later versions use Windows Workflow Foundation (WF) at the core of their build mechanics. Workflows are intuitive to edit, and as powerful as any high-level scripting language can be. However, in the context of Team Foundation Build, they need a particular setup with Visual Studio if you want a smooth editing experience. This post will hopefully help you set up a cohesive environment for editing your build process templates. This is just a proposal, the way I prefer to set things up; feel free to keep whatever you want from it.

Team Project organization

I advise using a "test" Team Project and editing your templates there. Why?

Test your build in the test Team Project and, when it's operational, copy the file(s) into your production Team Project. You don't mess with a renamed or temporary template, nor produce unwanted changeset noise in the production Team Project. Very simple.

Because you may need to use another build controller, but we’ll discuss that later.

So we’ll need a Visual Studio solution and project in order to edit the Xaml template properly. Where shall we store that solution?

Solution setup

Notice there is no Xaml file at the project level; this is because they are added into the project “As Links”, directly from the parent folder:

You can distinguish linked files from others in your project by the tiny arrow on their file icon:

When I check out the Xaml template file, it is checked out from its original location, which is fine.

We want to use a project for multiple reasons:

Compiling helps us remove errors from the workflow file, and it checks the references

It is necessary if you have custom assemblies, and frankly, you *will* have some at some stage. If you don’t, go grab the latest Community TFS Build Extensions and you’ll have *very* helpful activities for your builds

We want “drag and drop” editing, if we have custom assemblies, this is the best way to do it

In order to compile nicely, you’ll need to add several references. This is where it gets less than obvious, nearly tedious, but you only need to do it once in your life, so that’s ok 😉

Microsoft.TeamFoundation.TestImpact.Client: %windir%\assembly\GAC_MSIL\Microsoft.TeamFoundation.TestImpact.Client\11.0.0.0__b03f5f7f11d50a3a\Microsoft.TeamFoundation.TestImpact.Client.dll (this is bad, we should *never* *ever* have to reference something in the GAC; let’s hope this will be solved in the next version)
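For illustration, once added the reference ends up in the .csproj roughly like this (a sketch only; the HintPath is the GAC path above, using MSBuild’s environment-variable property for the Windows folder):

```xml
<Reference Include="Microsoft.TeamFoundation.TestImpact.Client, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
  <!-- Browsed to directly in the GAC, hence the unusual HintPath -->
  <HintPath>$(windir)\assembly\GAC_MSIL\Microsoft.TeamFoundation.TestImpact.Client\11.0.0.0__b03f5f7f11d50a3a\Microsoft.TeamFoundation.TestImpact.Client.dll</HintPath>
</Reference>
```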

Now you can edit the workflow and compile it.

The development cycle of builds (aka the everyday life of the build master) consists of:

Check out the workflow template file

Modify the workflow file

Check in the workflow file

Launch a build that you have set up with *this* workflow file
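From the command line, that cycle can be sketched with tf.exe (the template file name and comment are hypothetical):

```
tf checkout BuildProcessTemplates\MyBuildTemplate.xaml
rem ...edit the workflow in Visual Studio, compile, then:
tf checkin /comment:"Tweaked the build workflow" BuildProcessTemplates\MyBuildTemplate.xaml
```

Then queue a build from the Builds page of Team Explorer using a build definition bound to *this* template.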

Adding custom assemblies

We are now set up to throw in some custom assemblies. First, add the assemblies (and their associated .pdb files) into an “Assemblies” folder, creating it if necessary, just under the BuildProcessTemplates folder. What is important here is that this folder is configured in the properties of your Team Project’s build controller (Builds -> Actions -> Manage Build Controllers…)
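Deploying an updated activity assembly then amounts to something like this (assembly and folder names are hypothetical):

```
tf add BuildProcessTemplates\Assemblies\MyCompany.BuildActivities.dll BuildProcessTemplates\Assemblies\MyCompany.BuildActivities.pdb
tf checkin /comment:"Deploy custom build activities" BuildProcessTemplates\Assemblies
```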

The need for a separate build controller

If you are serious about builds (if you have many production builds that you can’t afford to interrupt with your development work), you’ll want a separate build controller. It is easy to deploy.

When you check-in some assemblies in the custom assemblies folder, the build controller restarts, and cancels the builds in progress. This is where it gets handy to have a separate controller for testing purposes. And be aware of your production build deployment timing.

NB: checking in the Xaml files does not restart the controller.

Toolbox setup

In order to drag and drop custom activities from the Toolbox, you can do the following:

Create a new Tab in the Toolbox

“Choose items” and browse to the assembly that contains the activities you want

They now appear in the Toolbox, you can drag them into your build workflows

Without the project references to the corresponding assemblies, drag and drop would not work…

Finally, all is set up, you should be able to edit your builds the easy way. Enjoy!

The official Visual Studio pages show a nice comparison of the Pro, Premium, Test Professional and Ultimate editions. But what about the Express editions? What are their features and how do they compare?

[Edition comparison table not recovered in this copy; among the surviving cells, unit testing support ranges from “Yes, full support but limited to MsTest (can’t install other frameworks’ test adapters)” down to “No”.]

Main constraints summary

No C++, no attach to process, no native Console or Windows projects, no Forms nor XAML designers, no Call Hierarchy window

No T4 support

No Windows Phone 8 Emulator on systems other than Windows 8 x64 (at least Pro) with SLAT enabled (it can’t run under VirtualBox nor VMware)

Disclaimer: I’ve been very careful filling this table; it is the result of my own experience with each edition. This comparison is not official and is subject to change with updates. Please report any inconsistency to me.