If you have created resources via the old Azure Portal or Service Management APIs, then you may notice that they belong to Default-XXXX Resource Groups when viewed in the new Azure Portal.

Current State

As you can see from the following view of one of my subscriptions in the new Azure Portal, I have a Storage Account that is part of a Resource Group named Default-Storage-WestUS. I originally created this Storage Account in the old Azure Portal but have re-used it for the pbubuntu VM. The VM and its associated Cloud Service were created via the new Azure Portal and were placed into a Resource Group named pbubuntu.

I’ve been cleaning up a number of orphaned Default-XXXX Resource Groups this week. I did however want to keep the portalvhds2knfjr67c7f3q Storage Account, since it held the VHD for my VM. I was looking for a way to move a resource from one Resource Group to another. I wanted to move the portalvhds2knfjr67c7f3q Storage Account into my pbubuntu Resource Group.

Trying to move the resource

I discovered the Azure PowerShell cmdlet Move-AzureResource, and the Move a resource section of the Using Azure PowerShell with Azure Resource Manager blog post described exactly what I was trying to do. But when I ran the cmdlet it returned immediately, with no error and no effect on my resources.

When I involved Fiddler in the debugging effort I noticed that there was no traffic being sent across the wire. No REST API calls were being made. Then I discovered this issue on GitHub – #379 Moving Azure resources doesn’t do anything. This was related to version 0.9.1 of the Azure PowerShell cmdlets – which is what I was running.

No problem I thought, I’ll use the Azure xplat cli, but quickly found that the move feature was not yet implemented on the resource command …

Ok – back to basics then.

I’d look up the REST API endpoint and payloads and use raw REST calls. I opened up the Azure Resource Manager REST API Reference on MSDN and tried to find the REST API for moving a resource. Nothing … This was not going well.

Diving into cmdlet source code

My next step was to download the Azure PowerShell code from GitHub and attempt to discover the REST API endpoint and payload from the Move-AzureResource cmdlet source code. I built the code and ran the MoveResourceTest test to start debugging the cmdlet.

I managed to extract the REST API endpoint (managementUri) and infer the payload structure from the ResourceBatchMoveParameters class. I confirmed my findings with Ilya Grebnov, the Architect and Lead Engineer of Azure Resource Manager.

Thanks to Ilya for getting back to me so quickly and for correcting my initial attempt at the payload.

Moving that resource!

ARMClient is a simple command line tool to invoke the Azure Resource Manager API and can be found on GitHub. It is fairly low level and allows you to interact with the raw REST API directly. ARMClient is great in that it manages the Azure authentication tokens for you. Have a look at the ARMClient: a command line tool for the Azure API blog post to understand this tool better.

ARMClient is available via chocolatey and I installed it as follows:

choco install armclient

I then created the payload in a file named move-resource.json. The targetResourceGroup is the full path to the Resource Group I’d like to move the resource to. In this case it’s the pbubuntu Resource Group in my subscription. The resources collection contains the full path to the resources I’d like to move. My collection consists solely of my portalvhds2knfjr67c7f3q Storage Account.

You will be required to authenticate yourself to use the ARMClient. You can do that as follows:

PS C:\> armclient login

Then I issued a POST to the REST API endpoint for moving a resource out of my Default-Storage-WestUS Resource Group. I included a reference to my move-resource.json file (you need to include the @ prefix) and switched on verbose mode to get as much detail as possible.
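For reference, the payload looked along these lines (subscription id elided; the exact provider namespace in the resource id and the api-version shown here are illustrative and may differ for your resources):

```json
{
  "targetResourceGroup": "/subscriptions/<subscription-id>/resourceGroups/pbubuntu",
  "resources": [
    "/subscriptions/<subscription-id>/resourceGroups/Default-Storage-WestUS/providers/Microsoft.Storage/storageAccounts/portalvhds2knfjr67c7f3q"
  ]
}
```

The POST itself was then along the lines of: armclient POST /subscriptions/<subscription-id>/resourceGroups/Default-Storage-WestUS/moveResources?api-version=2015-01-01 @move-resource.json -verbose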

End State

And checking the new Azure Portal also confirms that my portalvhds2knfjr67c7f3q Storage Account has indeed been moved into my pbubuntu Resource Group.

I’m hoping that the next release of the Azure PowerShell cmdlets will fix the Move-AzureResource cmdlet and that the Azure xplat cli will implement this functionality in the near future. In the interim, you can use ARMClient and the endpoint and payload as described in this post.

I was recently asked for help by an ISV. They needed to generate a SAS (Shared Access Signature) to grant time-limited access to resources that they were storing in Azure Blob Storage. They had developed their solution using PHP and deployed it to Azure Websites. I had generated SAS before using C# and .NET but never using PHP. No problem I thought … we have an Azure SDK for PHP and it’s available on GitHub!

But after poring over the code I came to the realisation that there is no functionality to generate a SAS for Blob Storage in the PHP SDK. It seems as though this was first opened as an issue in 2012 but has not been resolved. So what to do?

I didn’t want to implement a solution that would place the burden of keeping up to date with Azure changes on the ISV. I had a quick look at the Python SDK and the weight lifted off my shoulders when I found sharedaccesssignature.py!

I verified my approach with the Azure SDK team and got the thumbs up. I was good to go and I could sense the solution would be a matter of minutes away – little did I know …

Test Azure Website

I created a test Website in Azure called callingpythonfromphp and ensured that PHP was enabled (it is by default). I didn’t need to change the Python version setting since Python is installed by default (as are all the other languages). The switches you can see below enable http handlers for the respective languages.

You can check that Python is installed via the Debug Console in Kudu. Here you can see that Python 2.7 is available in D:\Python27.

I added two files to the D:\home\site\wwwroot folder of my Website – generate-fakesas.py and test-fakesas.php.

The test-fakesas.php file simply shelled out to Python via PHP’s system() function, passing it the generate-fakesas.py script to execute – along the lines of system('D:\Python27\python.exe generate-fakesas.py').

The generate-fakesas.py script was so named since all it does is return the string "SAS" rather than actually generate a SAS. This output is echoed by the PHP script. In the final version it can be assigned to a PHP variable and utilised.

print ("SAS")

This was a quick test to check whether or not I could call a Python script from PHP in an Azure Website. The >> and << in the PHP script would allow me to visually confirm that the output of the Python script had been inserted where I expected.

Hitting the test-fakesas.php script via the Debug Console in Kudu gave me the expected result. I could see the SAS being output. I had successfully called into Python from the PHP script.

I then hit the test-fakesas.php script from the browser. Hmm – this was not good. The SAS string was nowhere to be seen.

Looking at the php_errors.log file in D:\home\LogFiles I found the following:

PHP Warning: system(): Unable to fork [D:\Python27\python.exe generate-fakesas.py] in D:\home\site\wwwroot\test-fakesas.php on line 1

The Fix

So there was a difference in behaviour between the Debug Console and the manner in which php-cgi was being launched. After a few emails back and forth to the Azure Websites team, they discovered that the source of this issue was to do with the fastcgi.impersonate PHP setting. It was set to 1 by default and needed to be switched off (set to 0). This resulted in the following solution on the Kudu Xdt Transform Samples wiki page.

This solution basically allows you to deploy a custom php.ini file for your Azure Website and override settings. When you are doing this ensure that ALL instances of fastcgi.impersonate=1 are changed to fastcgi.impersonate=0 and are NOT commented out.

Note that the applicationhost.xdt and php.ini file should be deployed to your D:\home\site folder.
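In my case the override in the custom php.ini amounted to a single setting:

```ini
; php.ini deployed to D:\home\site alongside applicationhost.xdt.
; Every fastcgi.impersonate entry must be 0 and must not be commented out.
fastcgi.impersonate=0
```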

And now the call works from both the Debug Console in Kudu and the browser.

Ok – so that’s all great. But I actually needed to generate a SAS.

Generating a genuine SAS

DISCLAIMER – I’m going to preface this entire section by saying that I am not a Python guru.

The recommended mechanism for installing Python packages is pip but when I ran pip via the Debug Console in Kudu to install the Azure SDK …

D:\Python27\Scripts\pip.exe install azure

I got the following error.

error: could not create 'D:\Python27\Lib\site-packages\azure': Access is denied

I found that the only way I could install the Python Azure SDK via pip into the Azure Website was by starting off with Python’s virtualenv. I found a great blog post that got me up and running quickly. I created a new development environment called myapp.

D:\Python27\Scripts\virtualenv --no-site-packages myapp

This creates an entirely new and isolated Python environment, copying the Python executables and supporting utilities into it.

Next activate the development environment.

myapp\Scripts\activate

And then install the Azure SDK.

pip install azure

I added the test-sas.php file to the D:\home\site\wwwroot folder of my Website and the generate-sas.py file to the D:\home\site\wwwroot\myapp folder.

The test-sas.php file simply called Python and passed it the generate-sas.py script to execute. You can see that it is using the Python executable in the myapp development environment. This means it also has access to the Azure SDK I installed.

The generate-sas.py script was based on an example from StackOverflow. This time the script actually generates a SAS for read-only access to the images/flower.png blob within my imagesstoragepb storage account. This SAS will be output in the PHP script.
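My generate-sas.py used the Azure SDK, but the signing that sharedaccesssignature.py performs is easy to sketch by hand. The snippet below (written in Python 3 for readability; the storage account key is a made-up placeholder) builds a read-only blob SAS for images/flower.png, assuming the 2012-02-12 service version:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def generate_blob_sas(account_name, account_key, container, blob,
                      start, expiry, permissions="r", version="2012-02-12"):
    """Build a read-only blob SAS query string by hand.

    Mirrors the signing performed by the Python SDK's
    sharedaccesssignature module for the 2012-02-12 service version.
    """
    # Canonicalized resource path for the target blob.
    canonicalized_resource = "/{}/{}/{}".format(account_name, container, blob)
    # Field order mandated by the 2012-02-12 string-to-sign: permissions,
    # start, expiry, canonicalized resource, identifier, version.
    string_to_sign = "\n".join(
        [permissions, start, expiry, canonicalized_resource, "", version])
    # HMAC-SHA256 over the string-to-sign, keyed with the decoded account key.
    signature = base64.b64encode(
        hmac.new(base64.b64decode(account_key),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()).decode("utf-8")
    return "sv={}&st={}&se={}&sr=b&sp={}&sig={}".format(
        version, quote(start, safe=""), quote(expiry, safe=""),
        permissions, quote(signature, safe=""))

# Hypothetical base64-encoded account key (real storage keys are base64 too).
fake_key = base64.b64encode(b"not-a-real-storage-key").decode("utf-8")
sas = generate_blob_sas("imagesstoragepb", fake_key, "images", "flower.png",
                        "2015-06-01T00:00:00Z", "2015-06-02T00:00:00Z")
print(sas)
```

Appending the returned query string to the blob URL yields the time-limited link; in the real script the account name and key of course come from your own storage account.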

Running this in the browser led to the successful generation of the expected SAS!

So don’t be put off if a feature you need is not available in the PHP SDK. Check if you can leverage the Python SDK. And if you need to override the default PHP settings in Azure Websites you also now have a mechanism to do that.

Atom is a text editor from the folks at GitHub. I’ve been intrigued, but up until now it has only been available on the OS X platform. And since I currently don’t have a Mac I haven’t yet taken it for a spin. But all that has changed … Atom is now available for Windows.

Installing Atom

Installing Atom on Windows is really easy. It is available as a Chocolatey package. If you don’t have Chocolatey on your Windows machine, install it as per the instructions on the Chocolatey website.

Then simply run the following command from the command line to install Atom.

cinst atom

And you’ll be greeted by your shiny new text editor when launching Atom.

Add C# language support

Support for the C# language is not provided out of the box with Atom, but this is quickly solved with an Atom package.

Atom comes with the Atom Package Manager which is easily launched by issuing the following command at the command line:

apm

The Atom package manager allows you to install Atom packages that can be used to extend the functionality of Atom. You use the apm install command to install packages. You can get help for any command by using apm help <command> as shown below.

We are interested in the language-csharp Atom package. This adds syntax highlighting and snippets for both C# and scriptcs specific grammars.

Install the package by issuing the following command at the command line:

apm install language-csharp

Add support for running scriptcs

So now we have C# language support in Atom, but cannot yet run our C# script files using scriptcs. To enable this we require another Atom package – atom-runner. This package allows you to run code or scripts from within Atom.

Install the package by issuing the following command at the command line:

apm install atom-runner

We then need to configure atom-runner and associate csx files with scriptcs. This will allow us to execute our csx files from within Atom. We need to add this configuration information to Atom’s config.cson configuration settings file.

The easiest way to open this file is to use Atom’s command palette. Press ctrl-shift-p to bring up the command palette and then type config. Hit enter to open the config.cson file for your user profile.

Add the following lines to the end of the file.

'runner':
  'extensions':
    'csx': 'scriptcs'

See scriptcs in action

Atom has now been configured to provide syntax highlighting and snippets for C# and scriptcs. It is also now capable of executing csx files from within Atom. So let’s see this in action.

Create a csx file and write a simple Console.WriteLine statement. I’ve created a file hello.csx in the C:\Labs folder containing the single line Console.WriteLine("Hello from atom & scriptcs!"); and ensured that the file is saved.

Next bring up the command palette again (ctrl-shift-p) and type runner. Select the Runner: Run item and hit enter. This will invoke the Atom Runner and provide it with the path to the hello.csx file which is the active tab in the editor.

The csx file will be run by scriptcs and the output captured in the Atom Runner window.

Now you can write your scriptcs csx files in Atom with C# syntax highlighting and snippets. You can even execute your csx files from within Atom.

Add keybinding for Atom Runner

Starting the Atom Runner via the command palette just felt like too many keystrokes for me. So I decided to have a look at the keymap functionality within Atom in order to bind a set of keys to the run event of the Atom Runner.

Bring up the command palette again (ctrl-shift-p) and type keymap. Hit enter to open the keymap.cson file for your user profile.

Add the following lines to the end of the file.

'.platform-win32 .workspace .editor':
  'ctrl-shift-r': 'runner:run'

This will map ctrl-shift-r to the Run event of the Atom Runner on the Windows platform. So this is all you need to use now to execute your csx files.

The Atom Runner has its own keymap file (%UserProfile%\.atom\packages\atom-runner\keymaps\atom-runner.cson) that is used by Atom, but this is currently OS X specific.

Acknowledgements

I’d like to thank Adam Ralph for doing the hard yards and documenting the steps on the scriptcs wiki for how to get this up and running quickly.

And it seemed like it had something to do with having Visual Studio 2013 installed on the machine – which I had, but Morten didn’t.

Tomasz confirmed that Edge requires msvcr120.dll to be available on the machine. This DLL is the Microsoft Visual C++ 2013 Runtime and is installed with Visual Studio 2013. Mystery solved 🙂

But I wondered how we may have solved this issue if we hadn’t got a quick reply from Tomasz …

Replicate the issue

First I needed an environment to replicate the issue. I really didn’t feel like uninstalling Visual Studio 2013 from my machine so I created a Windows 8.1 VM on Microsoft Azure. A Windows 8.1 image is now available to MSDN subscribers. It does not have Visual Studio 2013 installed so was perfect.

After installing scriptcs and the ScriptCs.Edge script pack I found that I was getting the same error as Morten. This was expected. So now the question was – how could I figure out what was going wrong?

From the error one could deduce that something was not being loaded. Given that this worked on a machine with Visual Studio 2013 but not on one without it, it seemed likely that we were looking for a missing file.

Running Process Monitor on the machine while testing the script pack showed that a specific file could not be found (msvcr120.dll) just after the edge.node module had been successfully loaded. This matched what we were seeing in the error message. So we had found the culprit.

There is a firehose of information that Process Monitor will display, so I restricted it via filters. I displayed only file activities via the Show File System Activity button on the toolbar, and further filtered the entries to only those produced by the scriptcs process by applying a filter as shown below.

Resolve the issue

To test that having the msvcr120.dll assembly would resolve the issue I copied it from my local machine (that had Visual Studio 2013 installed) and placed it in the same folder as the edge.node module on the Windows 8.1 VM in Azure. This was one of the folders searched so I assumed the assembly would be picked up from here.

Success!

You can see the Node js v0.10.28 welcomes .NET message in the console below. The msvcr120.dll assembly is also clearly loaded as can be seen in the Process Monitor screen.

It was great to see that I could resolve this issue by troubleshooting the process myself. I now have another tool that I can add to my troubleshooting belt.

And soon Tomasz will be including this assembly in the Edge NuGet package. So no need to copy around assemblies.

The last few months have been an incredible journey for me. And that journey has resulted in my first Pluralsight course, Introduction to scriptcs, being published on 2 May 2014.

The scriptcs project was started by Glenn Block and was heavily inspired by node.js. It aims to introduce a low friction experience to the world of C# and even better bring that experience to you across Windows, Mac OS X and Linux. If you haven’t looked at it yet, download it and start playing.

In this talk I introduced scriptcs and the Windows Azure Management Library, before showing how to combine these two awesome resources to script the management of your Windows Azure assets with the full power of C#.

Autoscaling has finally been built into Windows Azure via Microsoft’s acquisition of MetricsHub. The autoscaling functionality from MetricsHub has been rolled directly into the Windows Azure platform, along with other MetricsHub features such as Availability Monitoring and Alerting.