If you have not used OMS or Log Analytics, it is well worth spending some time investigating. There are paid, trial and free tiers, and a whole range of interesting preconfigured solution packs to play with.

Where Log Analytics gets interesting is when you start to increase the amount of information you are gathering, then use custom queries to dig for information, provide proactive notifications and automated actions, and train models that surface insights into your environment. Just imagine a machine learning model applied to data from your syslog server to map out network activity and threats.

For this article I am assuming that you already have OMS enabled and are collecting data, but may never have looked into Log Analytics. You’ve probably clicked the Advanced Analytics button a few times and made some progress, or gone “Whoa dude, strange things are afoot at the Circle K!” (That last bit might just be me :-))

Let’s get cracking:

Head to the OMS workspace that hosts the Log Analytics service for the VMs you want to monitor. At this stage it’s worth noting that there are a number of architectural options when considering your OMS workspace design. This article does not go into the patterns you can adopt, but as long as you have some on-premises VMs being monitored and data being collected, you’ll be able to follow along.

Select Log Search and then open up Advanced Analytics and “Hold On!”

When the Advanced Analytics page has loaded, open a new tab and paste in the query you need for the results you are after. To test it, select Run.

The query you are looking to run draws from the Update data type, so this needs to be your first input. From there you extract data and narrow down what you are looking for. Once narrowed down, you decide how you want the data displayed; this is your summary. Finally, we render all this information as a table.

If this is the very first time you have tried a free-form query, try the topmost line on its own first. It’s likely you will get a lot of records, but you will see all the data and can then narrow it down to what you are after.
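As a minimal sketch of the pattern just described — start from the Update data, filter, summarise, then render — something along these lines should work. (`UpdateState` and `Classification` are standard columns of the Update data type, but the filter values and time range here are illustrative; adjust them to your workspace.)

```
Update
| where TimeGenerated > ago(1d)
| where UpdateState == "Needed"
| summarize MissingUpdates = count() by Computer, Classification
| render table
```

Running just the first line, `Update`, on its own is the “see everything first” step: it returns every record of that type, which you can then whittle down clause by clause.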

I have copied the query below for you to use. Like everything, if you know a better way of doing things please share; I’d certainly be interested!

Having read Paulo Marques’ article Working with Azure Storage Tables from PowerShell, I decided to edit my auditing scripts to push their results into Azure Tables, acting as a repository I can keep and one that gives me more options. Moving forward, we can update or pull this information out on demand, or use it as the basis of a comparison. I find it quite useful to have an independent record of the starting and end states of an environment, pre and post any work undertaken.

There are a number of ways to audit an Azure environment. With most of my customers I have implemented OMS, often using a combination of paid and free tiers to achieve the reporting they need to meet their requirements and standards.

I’m a big fan of OMS; this script represents only one way to gather information, and a chance to try something new in PowerShell.

To get started you’ll need to follow the instructions in Paulo’s article to install the correct module, and from there I suggest following his guide, as this will give you a good understanding of how the commands operate. Once completed, it is a straightforward process to integrate this into any auditing script you currently have. The example below assumes the table has already been created.
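As a minimal sketch of that integration — assuming the AzTable module (the successor to Paulo’s original AzureRmStorageTable module) and an existing storage account and table; the resource group, account, table and property names here are purely illustrative:

```powershell
# Requires the Az.Storage and AzTable modules and an authenticated Az session.
Import-Module AzTable

# Get a reference to the existing table via the storage account context.
$ctx   = (Get-AzStorageAccount -ResourceGroupName "rg-audit" -Name "stauditlogs").Context
$table = (Get-AzStorageTable -Name "VmAudit" -Context $ctx).CloudTable

# Push one audit result in as a row; -Property carries the audited values.
Add-AzTableRow -Table $table `
    -PartitionKey "PreChange" `
    -RowKey ([guid]::NewGuid().ToString()) `
    -Property @{ Computer = "vm01"; PowerState = "Running"; Captured = (Get-Date).ToString("o") }
```

Each audited item becomes a row keyed by partition and row key, so a later `Get-AzTableRow` can pull the pre-change state back out for comparison against the post-change run.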
