On my project at work we recently upgraded our TFS 2013 on-premise installation to TFS 2018 and I am loving it. I got to play around with build and release definitions and we set up pipelines for most of our applications. We could finally automate everything that should be automated. After seeing the results of this I wanted the build definitions to also run the unit tests for the respective projects. We were able to set this up for most of our projects, but I was struggling with a specific problem: how to get our unit tests to run on our Mac agent.

We have a Xamarin application (iOS and Android) which has some unit tests for the shared code used between the iOS and Android application. The problem was that Mac as a build agent does not have the VSTest capability which is required to run the unit tests, unlike Windows. This is a problem when you want to build and deploy the iOS application (which has to be done from a Mac) and at the same time run the unit tests. We discovered this obstacle when trying to run our build which contained the Visual Studio Test task in its build definition. So my question was: How can I get our tests to run on both a Mac and a Windows agent? The answer was: convert the test project to .NET Core!

As you may know, .NET Core is a cross-platform framework that lets you run your C# code on Windows, Linux and macOS. By converting the test project to a .NET Core project, I would get access to the .NET Core command line tools on the Mac and could thus use the .NET Core task in TFS to run the tests.

Step 1: Converting the test project to .NET Core

In order to complete this step you’ll need to have the “.NET Core cross-platform development” workload installed into your Visual Studio installation. You can add it by using the Visual Studio Installer.

To convert the test project I replaced the content of the .csproj file with the following (I added some placeholder code for example purposes):
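The original snippet is not shown here, but a converted SDK-style test .csproj would look roughly like this (target framework, package versions and project paths are placeholders for my setup, not prescriptions):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <!-- Test SDK plus NUnit and its adapter so tests can be discovered -->
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.7.0" />
    <PackageReference Include="NUnit" Version="3.10.1" />
    <PackageReference Include="NUnit3TestAdapter" Version="3.10.0" />
  </ItemGroup>

  <ItemGroup>
    <!-- Reference to the shared code under test -->
    <ProjectReference Include="..\MyApp.Core\MyApp.Core.csproj" />
  </ItemGroup>

</Project>
```

Note that the SDK-style format makes packages.config and AssemblyInfo.cs redundant, which is why they get deleted in the next step.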

Then I deleted my packages.config file and AssemblyInfo.cs and re-added my NuGet packages and references to other projects.

Step 2: Assessing NuGet packages

Seeing as we use the SpecsFor framework, which in turn uses the NUnit framework, we needed to include the NUnit3TestAdapter NuGet package. Without it, TFS won’t be able to discover the tests. We were also using an older version of SpecsFor that we needed to update, which also prevented the tests from being discovered.

Step 3: Adjusting the build definition in TFS

After verifying that the tests were working by running them through the Visual Studio Test Explorer, I edited the build definition to use the aforementioned .NET Core task instead of the Visual Studio Test task.

The build definition looked like this:

The .NET Core task uses the test command and targets the test project's .csproj file.

After a suspenseful wait for the build to finish, it worked! However, something was missing. When you use the Visual Studio Test task you get a nice little graph on your build details view in TFS telling you how many tests passed, how many failed and other useful info about your tests. This was now gone. The only info I got about our tests was by checking the build logs. I wanted this to be more visible and maybe even have it affect whether the build would fail or not.

Here’s where some of the frustration of working with an on-premise installation of TFS kicked in. According to the documentation, there should be an option to check “Publish test results”, which should be directly under the Arguments field. However, this is only available in the version 2.* of the .NET Core task (notice how the screenshot above is using version 1.*). In our installation of TFS 2018 Update 2, version 2 is only available as a preview and does not include this checkmark. Sigh.

At this point I’m starting to think all this work has been for nothing. Luckily, I discovered that you can achieve the same thing as this checkmark by adding some arguments to the .NET Core test command:
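As a sketch, the equivalent full command line would be something like the following (the project path and file name are placeholders; in the TFS task, only the part after the .csproj path goes into the Arguments field):

```shell
dotnet test MyApp.Tests/MyApp.Tests.csproj --logger "trx;LogFileName=TestResults.trx"
```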

Added arguments to publish the test results manually.

The added arguments tell the task to use the TRX logger, which produces a Visual Studio Test Results file. They also specify the resulting file name, which will be used in a later step.

Since this task only produces the test results, we also need to publish the generated test results. We did this by using the Publish Test Results task, which is pretty self-explanatory. We set it up like this:

Publish the test results produced in the previous step.

Note how we have selected VSTest as the “Test result format”. I was a bit confused by this since NUnit is also an option here, but seeing as we produced a Visual Studio Test Results file in the previous step, VSTest makes more sense. We have also pointed the “Test results files” field at the produced file.

And presto! The dearly wanted graph was finally there:

Holy pass percentage, Batman!

As a last modification I wanted the build to proceed even if some of the tests were failing. At this point the build would break and stop if there were any failing tests. To achieve this, I went back to the dotnet test step and checked “Continue on error”. This marks the build as “Partially succeeded” if any tests fail, showing it as orange instead of red or green. Neat!

Summary

The whole process of getting this to work really increased my interest in DevOps and pipelines. As a developer I always want to automate tedious manual tasks, and as a human I would much rather have a machine remember all the steps involved in builds and releases for me. This is also a great step toward assuring quality, and a big bonus in projects where there aren’t any designated testers.

On a final note: I would love to hear from you if you have done something similar in TFS/Azure DevOps and what your thoughts are on best practice with automated testing. I am also curious to know if the “Publish test results” option for the .NET Core test command is available in later TFS 2018 updates, since it is not on the Update 2 version. So if anyone has the answer to this, please leave a comment.

Last year at Build 2017, Microsoft announced Xamarin.Forms Embedding. This technology allows you to use Xamarin.Forms inside your Xamarin Native application. Most people will probably choose Xamarin.Forms when they create a new project these days, but for those working with Xamarin Native projects targeting multiple operating systems this came as dearly wanted functionality. As I work daily on a Xamarin Native project, I naturally embraced this technology and advocated for using it in our next big release. I want to share my experiences and give my thoughts on this approach.

Share ALL the code!

The most obvious advantage is the increased code reuse. You can create and define XAML pages which in turn can be initialized on each platform as a UIViewController (Xamarin.iOS) or as a Fragment or Activity (Xamarin.Android). In our project, we started out by converting one of the static pages in our app (our Help/About page) to a Xamarin.Forms XAML page, just to test the Embedding technology. And it worked like a charm!
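In code, the embedding boils down to a couple of extension methods. A minimal sketch, assuming a shared XAML page called HelpPage (the page name and container id are hypothetical; Forms.Init() must have been called on each platform first):

```csharp
// iOS: wrap the Xamarin.Forms page in a UIViewController
// (extension from Xamarin.Forms.Platform.iOS)
var helpPage = new HelpPage();
UIViewController viewController = helpPage.CreateViewController();
NavigationController.PushViewController(viewController, true);

// Android: wrap the same page in a support Fragment
// (extension from Xamarin.Forms.Platform.Android)
var fragment = new HelpPage().CreateSupportFragment(this);
SupportFragmentManager
    .BeginTransaction()
    .Replace(Resource.Id.fragment_container, fragment)
    .Commit();
```

The same shared page is reused untouched on both platforms; only these few lines of glue code are platform-specific.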

But what if you have a more advanced page with native functionality, like using geolocation, maps or opening a link in a browser?

We solved this in two ways: plug-ins and dependency injection. There are lots of plug-ins for Xamarin.Forms for things like media (camera/video), geolocation, permissions and audio. We used these where we could and used dependency injection for the other cases. For the latter we would pass an interface into the page, which on iOS would be hooked up to an iOS implementation of that functionality, and likewise on Android.
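As a sketch of the dependency injection approach (all names here are hypothetical examples, not our actual code): the shared page depends only on an interface, and each platform passes in its own implementation.

```csharp
// Shared code: the page depends only on this interface
public interface IBrowserService
{
    void OpenUrl(string url);
}

public partial class HelpPage : ContentPage
{
    readonly IBrowserService _browser;

    public HelpPage(IBrowserService browser)
    {
        InitializeComponent();
        _browser = browser;
    }

    // Handler wired up from the XAML; delegates to the platform implementation
    void OnLinkTapped(object sender, EventArgs e) =>
        _browser.OpenUrl("https://example.com/help");
}

// iOS project: the concrete implementation injected when creating the page
public class IosBrowserService : IBrowserService
{
    public void OpenUrl(string url) =>
        UIKit.UIApplication.SharedApplication.OpenUrl(new Foundation.NSUrl(url));
}
```

The Android project would provide its own IBrowserService using an Intent, and construct the page with that instead.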

Designing the pages was done, for the most part, with Xamarin Live Player. We did utilize some of the tricks I mentioned in my previous blog post, more specifically by creating a new project and designing the views there. Our project had too many dependencies to use the Live Player with, so this approach worked best for us, although it involved a lot of copy-paste action.

Downsides?

I can’t think of many downsides to this functionality. If you have trouble achieving the same functionality on both platforms, you can always go back to using native code. One thing to be cautious about is if you have a large Android app and you’re pushing the 64k method reference limit: using this embedding technique means pulling in some NuGet packages on both platforms. That said, I hear the Xamarin team has put a lot of effort into making their packages linker safe, especially with their new all-in-one plugin Xamarin.Essentials.

Xamarin Live Player is an incredibly useful tool for laying out your Xamarin.Forms pages. The debug cycle for creating layouts is dramatically reduced when using the Live Player, and I’ve personally been using it on my latest projects. Along the way I’ve picked up a couple of tips and tricks that I wanted to share.

File -> New…

Xamarin Live Player does not work well with a big project full of dependencies. As the documentation says, there are some limitations to it, although they are working on improving this (see James Montemagno’s comment on this post regarding SQLite). A workaround that I’ve been using is to create a new Xamarin Forms project without any dependencies or NuGet packages and use Live Player to create the pages that I want. When you’re done tweaking your margins and sizes, you can copy the layout back to your main project.

Provisioning? No thanks.

The Live Player is a bit of a hassle to use with iOS. Usually you’ll have to set up a certificate and a provisioning profile just to be able to debug the app on an iOS device, and if you just want to iterate on your page layout that seems a bit excessive. Luckily that has changed with the recent release of version 15.6 of Visual Studio 2017. You are now able to use the Live Player with the Remote iOS simulator. This is great as it reduces the friction involved with getting started with the Live Player for iOS. This functionality already exists for Android, but if you have limitations that prevent you from using the emulator (e.g. you need Hyper-V on) or don’t own an Android device this is a great substitute.

If you have any other tips or tricks regarding using the Live Player, feel free to leave a comment.

fastlane is an open source toolkit that helps you automate deployments for mobile applications in addition to helping you take care of all the tedious stuff that goes along with it. After learning that Visual Studio for Mac will be integrating with fastlane I decided to check it out and see how well it works with Xamarin.iOS.

In this guide I will show you how to build and deploy your Xamarin.iOS application to HockeyApp using fastlane. To follow this guide you will need a Mac with Visual Studio installed. As fastlane uses Ruby, you might have to update the version of Ruby that comes preinstalled on your Mac.

We will be performing the following steps:

Installing fastlane

Installing Xamarin plugin

Configuring the project

Configuring the Fastfile

Deploying to HockeyApp

Installing fastlane

To get started, go to this page and follow the instructions. Since we will be working with iOS, choose iOS as platform, and select HockeyApp as beta service. You’ll notice there are additional tasks that can be implemented, e.g. integration with Slack or HipChat to send an alert whenever a new deployment has been made. I won’t be covering these in this post, but feel free to check them out if you want. There’s also the “Increment Build Number” task, but since fastlane is primarily meant to work with Xcode this won’t do much for our Xamarin application. Don’t worry though; there are other tools we can use for this, which I’ll show you later in this post.
Once you’re done, download the files and extract them into your project directory. Your Fastfile should look like this:
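The generated file isn’t shown here, but for the iOS + HockeyApp combination it comes out along these lines (a sketch; the generator’s exact output may differ between fastlane versions):

```ruby
default_platform(:ios)

platform :ios do
  desc "Deploy a new beta build to HockeyApp"
  lane :beta do
    gym        # builds the app - assumes an Xcode project at this point
    hockey     # uploads the resulting build to HockeyApp
  end
end
```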

Open and run the install-script in the installer directory.

You might run into the following error when running the install-script:

Couldn’t detect shell config file (bash – ~/.bash_profile)

In that case, open up a new Terminal window and execute this command to open your bash profile:

touch ~/.bash_profile; open ~/.bash_profile

In the document that opens, insert the following line and save and close the document:

export PATH="$HOME/.fastlane/bin:$PATH"

Now we’ll do a test drive just to make sure fastlane has been installed successfully. Open a Terminal and change directory to the folder where you extracted the installer-folder (not the installer-folder itself, but the parent folder). Run the following command:

fastlane beta

If everything has been set up correctly, you will be prompted by the gym task for the path of your project file. As this is only applicable if you’re developing with Xcode, abort the task and let’s install our Xamarin plugin.

Installing Xamarin plugin

As mentioned before, fastlane is primarily meant to be used with Xcode. Fortunately it supports plugins and we’ll be taking advantage of the plugin fastlane-plugin-xamarin-build which does pretty much what it sounds like: it allows us to build Xamarin projects. We will also be utilizing it to set the correct certificate and provisioning profile for our selected build configuration.

Install the plugin by running the following command:

fastlane add_plugin xamarin_build

Answer the install prompts and hopefully the plugin should install successfully. You might, however, be faced with the following error when trying to install the plugin:

You can ignore this error. The plugin has actually been installed successfully.

Configuring the project

In order to perform the last step of deploying to HockeyApp we need to make sure that our desired build configuration for our Xamarin.iOS project exports an IPA file on build. As we are using the debug configuration for this guide, we will need to configure that build. Open your iOS project’s properties from Visual Studio and navigate to “iOS IPA Options”. Make sure that “Build iTunes Package Archive (IPA)” is checked.

Configuring the Fastfile

Now let’s update our Fastfile. We will first be verifying that the Xamarin build plugin is working properly, so update the Fastfile to look like this:
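A sketch of what that Fastfile looks like — the solution path and project name are placeholders for my own setup, and the parameter set follows my reading of the plugin’s README:

```ruby
default_platform(:ios)

platform :ios do
  desc "Build the Xamarin.iOS project"
  lane :beta do
    # Build with the Xamarin plugin instead of gym
    xamarin_build(
      solution: "/Users/me/Projects/MyApp/MyApp.sln", # path to your .sln file
      platform: "iPhone",
      configuration: "Debug",
      project: "MyApp.iOS"                            # name of your Xamarin.iOS project
    )
  end
end
```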

Make sure that “solution” is set to the path of your solution file. This example uses an absolute path, but you can also use a relative one. Also, make sure that “project” is set to the name of your Xamarin.iOS project.

Incrementing build number

More often than not you’ll want to increment your build number for each release. If you like to use an integer for your build number you can automate this into your Fastfile. By using the fastlane commands “get_info_plist_value” and “set_info_plist_value” we can easily retrieve the current build number from our Info.plist file and increment it. Insert the following code into your Fastfile and replace the paths with the ones to your project:
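A sketch of that increment step, to be placed at the top of the beta lane (the Info.plist path is a placeholder for your own project):

```ruby
# Read the current build number from Info.plist, increment it, and write it back
build_number = get_info_plist_value(
  path: "MyApp.iOS/Info.plist",
  key: "CFBundleVersion"
)
set_info_plist_value(
  path: "MyApp.iOS/Info.plist",
  key: "CFBundleVersion",
  value: (build_number.to_i + 1).to_s
)
```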

You can also use the “set_info_plist_value” command to set other values in the Info.plist like display name and bundle identifier.

Signing identity and Provisioning Profile

For the Xamarin build plugin to be able to build properly you need to tell it which signing certificate and provisioning profile it needs to use for the build. This can be done by using the command “xamarin_update_configuration”, which you can use to edit any property-value pair that is in your .csproj file. I will assume that you have already set up your signing identity and provisioning profile. The following code snippet sets the signing identity and provisioning profile which is registered to my app:
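Roughly like this — note that the exact parameter names here are my assumption based on the plugin’s README, and the signing identity and profile names are placeholders you must replace with your own:

```ruby
# Point the Debug|iPhone configuration at the right signing assets.
# These map to the <CodesignKey>/<CodesignProvision> properties in the .csproj.
xamarin_update_configuration(
  project: "MyApp.iOS/MyApp.iOS.csproj",
  configuration: "Debug",
  platform: "iPhone",
  name: "CodesignKey",
  value: "iPhone Developer: My Name (XXXXXXXXXX)"
)
xamarin_update_configuration(
  project: "MyApp.iOS/MyApp.iOS.csproj",
  configuration: "Debug",
  platform: "iPhone",
  name: "CodesignProvision",
  value: "MyApp Development Profile"
)
```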

Deploying to HockeyApp

First we need to get an API token for the project. To do this, log in to HockeyApp and navigate to Account Settings. Navigate to API Tokens and create a new token for your selected app. To be on the safe side, select “Full Access” under Rights. Next, update your Fastfile by replacing the existing “hockey” command with the following snippet, inserting your own API token:
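A sketch of the upload step (the token and the IPA path are placeholders for your own values):

```ruby
hockey(
  api_token: "<your-api-token>",
  ipa: "./MyApp.iOS/bin/iPhone/Debug/MyApp.iOS.ipa" # relative path to the built IPA
)
```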

Note that I am using a relative path here.

Your Fastfile should now look like this:
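Putting the pieces together, the complete file comes out along these lines (all paths, names and tokens are placeholders, and the xamarin_build/xamarin_update_configuration parameter names follow my reading of the plugin’s README):

```ruby
default_platform(:ios)

platform :ios do
  desc "Build and deploy a beta build to HockeyApp"
  lane :beta do
    # Bump the build number
    build_number = get_info_plist_value(path: "MyApp.iOS/Info.plist",
                                        key: "CFBundleVersion")
    set_info_plist_value(path: "MyApp.iOS/Info.plist",
                         key: "CFBundleVersion",
                         value: (build_number.to_i + 1).to_s)

    # Make sure the right signing identity is used for this configuration
    xamarin_update_configuration(project: "MyApp.iOS/MyApp.iOS.csproj",
                                 configuration: "Debug", platform: "iPhone",
                                 name: "CodesignKey",
                                 value: "iPhone Developer: My Name (XXXXXXXXXX)")

    # Build the IPA
    xamarin_build(solution: "/Users/me/Projects/MyApp/MyApp.sln",
                  platform: "iPhone", configuration: "Debug",
                  project: "MyApp.iOS")

    # Upload to HockeyApp
    hockey(api_token: "<your-api-token>",
           ipa: "./MyApp.iOS/bin/iPhone/Debug/MyApp.iOS.ipa")
  end
end
```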

And there you have it! Run the “fastlane beta” command from the Terminal window to check that everything is working.

Final words

I’m using fastlane at work for a project that cannot utilize Visual Studio Team Services, as our source code is hosted on-premises (TFS). Normally I would have used the CI/CD tools that VSTS offers, but fastlane has proven to be a great alternative. Seeing as parts of fastlane have now been integrated into Visual Studio for Mac, I am sure this is a solid framework and that it will continue to evolve to make the mobile developer’s life easier. I’m excited to see if more parts of fastlane will be integrated into VS for Mac. Lastly, I had not touched Ruby before I heard about fastlane, so excuse me if my Ruby code stinks.

In this post I will primarily focus on how to use VSTS to create and automate builds and releases. We will use Microsoft Azure to host our web application and to set up continuous deployment. We will also integrate VSTS with GitHub so that we have a repository to build and release from. I will assume that you already have an existing web application built on Node.js and Grunt, hosted in a GitHub repository. A basic understanding of how Azure and VSTS work is also preferred, as I will not delve too deep into these.

Creating your Azure Web App

If you don’t have one already, set up a Microsoft account and log on to portal.azure.com. Here we will create a new web app hosting environment. From the main screen/dashboard, click “New” (+) -> “Web + Mobile” -> “Web App”. If you don’t have an Azure subscription you’ll be prompted to make one. Once that’s done, continue setting up your Azure Web App by filling in your app name and selecting “Subscription” and “Resource Group”. I will be naming my app nodegruntexample. Hit “Create” and you’re done with the Azure part.

Setting up your VSTS account

Log on to visualstudio.com with the same account you used for Azure. From your account page, click “Create new account”.

Pick a name for your account and select “Git” as source control. If you want to change the default hosting region or your default project name, click “Change details”. Otherwise you’ll have to settle for the default project name MyFirstProject.

Setting up your build definition (Continuous Integration)

Once your project space is set up, let’s create a build procedure. On the following screen, click “build code from an external repository” as shown in the following screenshot.

On the following screen, click “New definition”. You will now be able to select from a list of predefined build definitions. Lucky for us, there’s already a definition for NodeJS + Grunt. Just search for “Nodejs” in the search area and select “NodeJS With Grunt”.

The different steps of the build process are listed on the left. As you can see, we have the “npm install” command to install our packages, plus our additional Grunt commands.

Before we start configuring our Node and Grunt tasks we need to select what repository to generate our build from. From the build tasks, select “Get sources”. Select “GitHub” and click “Authorize using OAuth”. You might have to allow pop-ups from your browser for the prompt to show up.

Once this is done, select your repository and what branch to get your code from. If you’re planning on using this build definition for your production release, the master branch is the natural choice. If you have a testing environment you could point this to the branch you use for development or testing, to e.g. continuously push out the latest changes to your testers. For this example, I’ve selected the “master” branch.

Next, select the “npm install” task. Make sure to set “Working folder” to where you would normally run the “npm install” task. The other default settings here should be fine.

Select “Run grunt task” and make sure “Grunt File Path” actually points to the location of your Gruntfile. Under “Grunt Task(s)”, add the tasks you want to run; I’ve added the “build” task, just in case. Under “Arguments” I’ve added the “--force” flag because of some strict linting I had to bypass.
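For reference, a minimal sketch of a Gruntfile with a “build” task like the one the step above invokes (the task setup and file paths are hypothetical examples, not from my actual project):

```javascript
// Gruntfile.js - minimal "build" task the VSTS Grunt step could run
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      build: {
        // Minify all sources into a single file under dist/
        files: { 'dist/app.min.js': ['src/**/*.js'] }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');

  // "grunt build" is what the "Grunt Task(s)" field would invoke
  grunt.registerTask('build', ['uglify']);
};
```

The dist/ folder this produces is what the later “Archive files” step would pick up.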

Next, select “Archive files”. Under “Root folder (or file) to archive”, select the folder that you want to archive from. Normally this will be the folder where your compiled files are located, e.g. a “dist”-folder.

Now save your build from the “Save & queue” dropdown in the top right. No need to queue a build as we are nearing the essential step that will save us this trouble.

The last, but most important step, is to enable Continuous Integration. This is done under “Triggers”, as shown in the screenshot below. Enable “Continuous Integration” and select a branch under “Branch specification”. When a commit is pushed to this branch, it will trigger a build which you in turn can release – manually or automatically.

Build status badge

Have you seen those GitHub repositories with the little “badge” that tells you the status of the last build? VSTS offers this functionality by providing an HTML link you can copy into your GitHub repository. Select “Options” and check “Badge enabled”. Copy the link into your README.md and the badge will reflect your latest build status at all times.

The badge, in all its glory (if your build is successful):

Setting up your release definition (Continuous Deployment)

Compared to setting up continuous integration, this step will be a walk in the park. After saving your build, head on over to “Releases” and create a new release definition.

From the popup select the first template, “Azure App Service Deployment”. If you scroll down you’ll notice there’s another template called “Node.js App to Azure App Service”, but to my knowledge there is no difference between these two options seeing that Azure App Services support Node.js out of the box. Hit “Next” and select “Build” under “Artifacts” and your project and build definition from the previous steps. Last, but not least, hit the checkbox for “Continuous deployment” and click “Create”.

From the next screen, select your “Azure subscription”, authorize it and select your Azure Web app from “App Service name”. Now save it and you’re good!

The next time you push to your master branch, VSTS will create a build and automatically deploy it to Azure.

Final words

I’ve grown really fond of Visual Studio Team Services after discovering its true potential and I find myself trying to automate the smallest task now. I hope you’ve learned something from this blog post and I encourage you to play around with VSTS and check out the other predefined build templates.