Helix and the re-tooling of your Continuous Integration and Deployments

“Helix is a set of overall design principles and conventions for Sitecore development.” Like other architectures and design principles, Helix has its pros and cons. You might hear “saves the developer from having to create these modules from scratch” or “improve the efficiency of your projects, reduce costs and time to market”. Great, right? Not so fast!

The concept of Helix and the Habitat implementation are great. When you get behind a set of design principles, it's easier to follow them and also easier to overlook or overcome the cons. The intent of Helix is great, but the long-term goal of saving time or money is yet to be seen in real-life implementations.

I can personally share my journey once we complete the 6–9 sites on our Helix-based solution by next year. Until then, let's talk about tooling, or rather re-tooling.

In my opinion, Helix = tooling; there is nothing else to it. By adopting Helix, you are committing to its architecture, which WILL disrupt your current CI and CD processes. But since we are committed to Helix, it's our job to re-tool and get back on track.

A lot of people choose to start off with Habitat instead of building from scratch using Helix principles, because they do not know how to start from scratch or how to convert an existing solution to a Helix-based solution.

VSTS

First, let's check the .gitignore file to make sure that the node_modules/ entry is un-commented. Just like NuGet packages being stored in the packages folder, all the Node.js plugins are stored in the node_modules folder. You do not want these checked into source control, for several reasons. You want the build server to download both the NuGet packages and the Node.js plugins during the build.
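The relevant entry in .gitignore looks like this (the packages folder is typically ignored as well, since NuGet restore recreates it):

```
# Node.js dependencies — restored by "npm install" on the build server
node_modules/

# NuGet packages — restored during the build
packages/
```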

Add the npm step to install Node.js packages and specify the working folder:

Add the Gulp step and specify the location for the gulpfile.js:
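The post configures these two steps through the classic VSTS build editor (the screenshots did not carry over). As a rough YAML sketch, using what I believe are the standard built-in Npm and Gulp tasks (task names, versions and the working folder are assumptions):

```yaml
steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    command: install
    workingDir: 'src'                 # folder containing package.json (assumption)

- task: Gulp@1
  displayName: 'Run gulp'
  inputs:
    gulpFile: 'gulpfile.js'
    # no targets specified, so the default task runs
    arguments: '--buildNumber $(Build.BuildNumber)'
```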

Notice that we are passing in some Arguments. Yep, here I am sending in some information that can be used in the gulpfile.js. I am mainly interested in the BuildNumber, so that I can push the NuGet packages to my Octopus server with the right version. Since we did not specify any Gulp task(s), VSTS will run the default task.

Now that I am passing in the parameters, I need a way to consume them in my gulpfile.js.
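The original code snippet was an image; here is a minimal stand-in. The post uses the minimist NPM package, which is essentially a one-liner (`var args = require('minimist')(process.argv.slice(2));`); the tiny hand-rolled parser below mimics just the `--key value` behaviour we rely on, so the sketch stays dependency-free:

```javascript
// gulpfile.js (sketch) — consume arguments passed in from the VSTS Gulp step,
// e.g. "--buildNumber 1.0.42". The real gulpfile uses the minimist package;
// this parser only handles the simple "--key value" form for illustration.
function parseArgs(argv) {
  var args = {};
  for (var i = 0; i < argv.length; i++) {
    if (argv[i].indexOf('--') === 0) {
      args[argv[i].slice(2)] = argv[i + 1]; // value follows the flag
      i++;                                  // skip the consumed value
    }
  }
  return args;
}

var args = parseArgs(process.argv.slice(2));
// args.buildNumber now carries the VSTS BuildNumber, ready to be used
// as the NuGet package version later in the build.
```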

The above basically gives us access to the BuildNumber parameter from VSTS.

Next up: OctoPack.

OctoPack

I love OctoPack because it's so simple and just gets the job done.

To start, simply install OctoPack in the project you want to generate a NuGet package for, by running the following in the Package Manager Console:

Install-Package OctoPack

Once installed, you need to create a nuspec file for that project, with the same name as the project and a .nuspec extension. For example, if your project name is Company.Project.Website, you need a nuspec file at the root of that project called Company.Project.Website.nuspec. Here is a sample nuspec file:
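The sample from the original post was an image; the fragment below is a minimal stand-in with placeholder metadata values, and the `<files>` entries are illustrative — adjust them to your project:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>Company.Project.Website</id>
    <version>1.0.0</version>
    <authors>Company</authors>
    <owners>Company</owners>
    <description>Deployment package for Company.Project.Website</description>
    <iconUrl>http://example.com/icon.png</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
  </metadata>
  <files>
    <!-- files to include in the package; adjust to your project -->
    <file src="Web.config" target="" />
    <file src="bin\**\*" target="bin" />
    <file src="Views\**\*" target="Views" />
  </files>
</package>
```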

In the nuspec file, be sure to give a value to all the elements, including iconUrl. The main section, however, is <files>: specify the files that should be part of the NuGet package.

Once all of this is configured, you need a couple of things from your Octopus Server.

Octopus Server

I recently started using Octopus Deploy. I learned it by bothering two of my friends: the REAL Kam(ruz), and Google. Once I set up the environment, the deployments on Octopus are so easy. Some day soon I will do a session on Octopus Deploy. Back to the topic of re-tooling.

We need to set up an API key in order to push our NuGet packages. Here is an article on how to create an API key. Also make sure that your Octopus admin URL is reachable from your build system, in this case VSTS.

Back to Gulp

Back in gulpfile.js, I tweaked the Build task to add the additional parameters needed to execute OctoPack. Here is a sample of that code:
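The code image did not survive, so here is a hedged sketch. OctoPack is driven by MSBuild properties, so the Habitat-style gulp-msbuild task just needs a few extra properties; the helper function, server URL and API key below are placeholders, while the OctoPack* property names are the ones OctoPack documents:

```javascript
// Build the extra MSBuild properties that make OctoPack run during the
// build and push the resulting NuGet package to the Octopus server.
function octoPackProperties(octopusUrl, apiKey, buildNumber) {
  return {
    RunOctoPack: true,
    OctoPackPublishPackageToHttp: octopusUrl + '/nuget/packages',
    OctoPackPublishApiKey: apiKey,
    OctoPackPackageVersion: buildNumber
  };
}

// Sketch of the tweaked Build task, assuming the gulp-msbuild plugin
// already used by Habitat-style gulpfiles:
//
// gulp.task('Build-Solution', function () {
//   return gulp.src('./' + config.solutionName + '.sln')
//     .pipe(msbuild({
//       targets: ['Build'],
//       configuration: config.buildConfiguration,
//       properties: octoPackProperties(
//         'https://octopus.example.com',   // placeholder server URL
//         args.apiKey,                     // API key passed in as an argument
//         args.buildNumber)                // VSTS BuildNumber via minimist
//     }));
// });
```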

Notice that we are specifying the Octopus Server URL with /nuget/packages appended; this is the destination for the NuGet packages we generate. Second, we are specifying the API key we generated; this is the means of authentication, so please keep this key safe. Third is the version number of the NuGet package, which plays a big role in Octopus deployments; we get that through the BuildNumber parameter, processed using the minimist NPM package.

I run a few other scripts in my Gulp build to make sure all the DLLs from Feature and Foundation projects are copied over to the project, to minify CSS and JS, etc.

Once all of this is said and done, we should have a working Continuous Integration environment for a Helix-based solution. My Octopus Deploy configuration automatically deploys the new NuGet package to the dev environment.

The NuGet restore, NPM package install, Gulp build and OctoPack NuGet push take about 3.9 minutes on average, and the auto-deploy to the development environment takes about 5 minutes.

Setting up your local and Continuous Integration/Deployment environments for Helix takes longer than for a traditional web solution, but once you set it up properly you are good, as long as you follow the FOLDER STRUCTURE. All the scripts for building features, foundation projects and tenants are dynamic.


3 comments

The insistence on using Gulp (and the consequent forced dependency on Node.js) is one of the biggest issues with Helix. There are many tools (MSBuild, PowerShell) that are native to Windows and do the same job; I never did get the reason they added it to Helix.

Hi Akshay,
Thanks for this great blog post.
On my Helix-based project, I am trying to set up continuous integration with Octopus Deploy. I have a question: do I need to add OctoPack to all the projects in the solution? Is there a way to generate a consolidated package and then call Octopus Deploy?
