
Problem

If you are trying to deploy an ARM template using Visual Studio and you get the error below…

AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name '_artifactsLocationSasToken'

… do not spend an hour trying to figure out why the parameter isn't found. Just read the solution below.

Solution

Check your ARM template(s) for correct formatting, even when the deployment doesn't report any validation errors (because the deployment hasn't reached that step yet).

In my case I had one } too many, so the next section was interpreted as a parameter of the main resource. The compiler and syntax highlighting didn't complain, but the deployment gave the '_artifactsLocationSasToken' error.
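To illustrate (a simplified sketch, not the actual template): a stray closing brace after the parameters section closes the template object too early, so the cmdlet no longer sees `_artifactsLocationSasToken` as a declared parameter.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "_artifactsLocation": { "type": "string" },
    "_artifactsLocationSasToken": { "type": "securestring" }
    }
  },
  "resources": []
}
```

The extra `}` before the comma is easy to miss at a glance, which is why the syntax highlighting stayed quiet.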

I am currently investigating how to publish a solution to the Azure Marketplace. There are two ways to do this: a Virtual Machine offer or a Solution template offer.

A Virtual Machine offer is a sysprepped VM image that contains all your pre-installed software and is simply deployed as a new virtual machine. So you have to prepare a VM, sysprep/generalize it, and upload it to Azure.

A Solution template offer is somewhat more advanced. Here you don't configure a sysprepped VM image; instead you use an ARM template to roll out a brand-new VM and then use a script extension resource to deploy your artifacts (i.e. the software to install). The cool part is that you can also customize the UI shown in the Azure portal when configuring the deployment. This is done with the createUiDefinition.json file, which has to be part of the solution zip file you upload to Azure.
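A minimal createUiDefinition.json could look roughly like this (a sketch; the basics element and output name are just examples, not taken from my actual solution):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/0.1.2-preview/CreateUIDefinition.MultiVm.json#",
  "handler": "Microsoft.Compute.MultiVm",
  "version": "0.1.2-preview",
  "parameters": {
    "basics": [
      {
        "name": "adminUsername",
        "type": "Microsoft.Compute.UserNameTextBox",
        "label": "Admin username",
        "osPlatform": "Windows"
      }
    ],
    "steps": [],
    "outputs": {
      "adminUsername": "[basics('adminUsername')]"
    }
  }
}
```

The outputs section maps the values collected in the UI to the parameters of your ARM template.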

So when you have created all your ARM templates and put them into a solution zip file (i.e. just zip all the ARM templates into one file), you upload it via the https://publish.windowsazure.com portal and do a staged publish (i.e. a test rollout/publish).

Now you have to wait a couple of hours before it is ready to test. And this is the annoying part: there seems to be no way to test the custom UI without doing a staged publish and waiting a couple of hours again…

Or is there…?

Yes there is! Thanks to this link I was able to test my custom deployment UI.

All you have to do is:

Create a new public container (i.e. set the public access level) using Azure Storage Explorer, for example named "test"

A short post to share something cool I tried out today. I think about a year ago Microsoft dropped the Network Monitor tool and replaced it with the Microsoft Message Analyzer tool.

With this tool you can trace not only network traffic, as you could with Network Monitor, but also many other trace data sources. One of them is OMS. Yes, you heard it right: you can now analyse your OMS queries using the Message Analyzer tool!

In OMS, when you write search queries, you can use the BY command to group. When you specify multiple group columns and use INTERVAL to generate a graph, you also get a nice extra feature.
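For example, a query like this one (a sketch in the legacy OMS search syntax; the performance counter and interval are placeholders, and I'm assuming the comma-separated group list here, so check it against your own workspace):

```
Type=Perf ObjectName=Processor CounterName="% Processor Time" | measure avg(CounterValue) by Computer, InstanceName INTERVAL 15MINUTES
```

This plots one line per Computer/InstanceName combination over 15-minute buckets.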

In the legend you can now select which lines you want to see, per group. This can be very handy.

See picture below:

Drawback

One drawback when using multiple groups: if you also use this query in a custom view, you lose the legend. But that legend is useless anyway, since the view area is too small to make it readable.

Just a short post to warn you about a nasty situation when designing your fantastic OMS dashboards with the brand-new View Designer (public preview).

When you add a tile you will find a feature called "Data-flow verification". This feature lets you show a message on the tile when no data records are found in the OMS system.

This is a handy feature, because you don't want to show an empty dashboard… But it can also cause an issue at design time.

Because… what happens when you have set up "Data-flow verification" to check the past x days for data, but you made a typo or the data isn't flowing in any more? Yes, of course the dashboard shows the message you specified, but you get more (for free)…

You CANNOT open your custom view (dashboard) any more to edit it! So you are somewhat stuck here… ;-(

Be warned!

So here are the steps to reproduce it:

Open the View Designer

Add the tile and enable Data-flow verification

Now look at the tile when you add the query: it shows an error when it doesn't get any data back. This indicates that you are going to run into this issue…

Problem:

Oh no, I forgot my SCOM account passwords! I don't know the passwords of the Data Access and Data Reader/Writer accounts any more. Resetting them in AD would force me to do a lot of tweaking to correct the accounts in SCOM.

Don't worry, we will find them for you.

Analysis:

SCOM stores the account passwords in the "Run As Configuration -> Accounts" section. This account information is linked to a "Run As profile". A Run As profile can be assigned to a SCOM workflow (rule/monitor/task…) so that the workflow runs under that account's security context.

Nice, but we still can't see the passwords on the accounts.

Solution:

But we can also do other things with a Run As profile. We can simply pass it as a parameter to, for example, a script. In the script we can read out the account information and find our lost password.

In SCOM we can use the secure script provider (VBScript), aka "Microsoft.Windows.ScriptWriteAction". The secure script provider streams the Run As information to the VBScript as an input stream. If you read this input stream at the top of your script, you get the account information. This can be tricky sometimes.

Using the SecureInput parameter we can provide the Run As account information. For the user name we use:

$RunAs[Name="RUNAS_PROFILE_1"]/UserName$

And for the password we use:

$RunAs[Name="RUNAS_PROFILE_1"]/Password$

RUNAS_PROFILE_1 is the internal name of the Run As profile in SCOM. You can use the PowerShell cmdlet Get-SCOMRunAsProfile to get the internal names.

I hear you thinking: this is way too old, this is VBScript, we WANT PowerShell! And I agree completely.

So for PowerShell we can use the normal PowerShell script provider, aka "Microsoft.Windows.PowerShellProbe". We don't have to use a SecureInput parameter; we just supply the Run As reference as a normal parameter. And that does the trick.
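A sketch of what the probe action configuration could look like inside a management pack (the script name and parameter names are made up for illustration; the $RunAs$ substitution and the Microsoft.Windows.PowerShellProbe module come from the Microsoft.Windows library):

```xml
<ProbeAction ID="PA" TypeID="Windows!Microsoft.Windows.PowerShellProbe">
  <ScriptName>GetRunAsInfo.ps1</ScriptName>
  <ScriptBody><![CDATA[
    param($UserName, $Password)
    # Echo the Run As credentials into the task output (emergencies only!)
    Write-Output "UserName: $UserName"
    Write-Output "Password: $Password"
  ]]></ScriptBody>
  <Parameters>
    <Parameter>
      <Name>UserName</Name>
      <Value>$RunAs[Name="RUNAS_PROFILE_1"]/UserName$</Value>
    </Parameter>
    <Parameter>
      <Name>Password</Name>
      <Value>$RunAs[Name="RUNAS_PROFILE_1"]/Password$</Value>
    </Parameter>
  </Parameters>
  <TimeoutSeconds>60</TimeoutSeconds>
</ProbeAction>
```

SCOM resolves the $RunAs$ references at runtime and hands the plain values to the script as ordinary parameters.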

NOTICE: Please remember that the task output is stored in the SCOM databases, so it can be traced back; not very secure, I think. Use this only in emergencies, or change the PowerShell script to write the output to a file!