Latest Blog Posts

In an ideal world, organizations should avoid creating custom images with their own special agents and configurations. Custom images mean a lot of image management: each time an agent is updated the image has to be updated, in addition to the normal patching of OS instances. The Azure Marketplace has a large number of OS images that are kept up to date; these should be used where possible, with any customization performed on top.

I recently had a Proof of Concept where a number of agents needed to be deployed post VM deployment, along with other configurations. Items such as domain join can be handled with the domain join extension, but for the other agent installs we decided to use the Custom Script Extension (CSE) to call a bootstrap script that would do nothing other than pull down all content from a certain container using azcopy.exe and then launch a master script. The master script would be part of the downloaded content and would then perform all the silent installations and customizations required.

A storage account is utilized with two containers:

Artifacts – This contains the master script, all the agent installers, and so on. A zip file could be used here so a folder structure for the various agents is maintained, with the master script unzipping it at the start

Bootstrap – This contains azcopy.exe (in my case version 10) and the bootstrap.ps1 file, which does nothing other than call azcopy to copy everything from the artifacts container to the C:\ root and then launch the master script from the local copy

Below is my example bootstrap.ps1 file. Notice it has one parameter: the URI of the container, which will include the shared access signature enabling access.
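A minimal sketch along these lines (the parameter name, the artifacts container name, and the master script name are illustrative; adjust them to your own layout):

```powershell
param(
    # Full URI of the artifacts container, including the SAS token
    [Parameter(Mandatory = $true)]
    [string]$ArtifactsURI
)

# azcopy.exe is downloaded alongside this script by the Custom Script Extension,
# so invoke it from the script's own folder
$azcopy = Join-Path -Path $PSScriptRoot -ChildPath 'azcopy.exe'

# Copy the entire artifacts container to the root of C:
# (azcopy v10 creates a folder named after the container, e.g. C:\artifacts)
& $azcopy copy $ArtifactsURI 'C:\' --recursive

# Launch the master script from the local copy to perform the silent installs
& 'C:\artifacts\master.ps1'
```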

All the installers and the master script were uploaded to the artifacts container. For this container I wanted a shared access signature (SAS) granting read and list rights. The idea was that some automation would generate a new SAS each week and write it to a secret in Key Vault that only the people who should deploy had access to. The SAS would have a lifetime of 2 weeks to give an overlap with the newly generated one. In addition to generating and storing the complete SAS, I needed a second version that was escaped for cmd.exe. This is because the SAS contains & characters, which were being interpreted during my testing and breaking its use. I tried the stop-parsing token (--%) but this did not work since the command was being called by cmd.exe; the escape is therefore to use ^&. The script below generates the SAS and the escaped SAS and writes both versions as secrets to Key Vault.
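Something along these lines (a sketch using the Az modules; the storage account, resource group, Key Vault, and secret names are placeholders):

```powershell
# Placeholder names; substitute your own
$rgName      = 'RG-Deploy'
$accountName = 'deploystore'
$vaultName   = 'DeployVault'

# Build a storage context from the account key
$key = (Get-AzStorageAccountKey -ResourceGroupName $rgName -Name $accountName)[0].Value
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $key

# Read and list rights only, with a two-week lifetime to overlap the next weekly SAS
$sas = New-AzStorageContainerSASToken -Name 'artifacts' -Permission rl `
    -ExpiryTime (Get-Date).AddDays(14) -Context $ctx

# Depending on the module version the token may or may not include the leading ?
if (-not $sas.StartsWith('?')) { $sas = "?$sas" }

$sasURI = "https://$accountName.blob.core.windows.net/artifacts$sas"

# Escape & as ^& so cmd.exe (which ultimately runs the CSE command) doesn't interpret it
$sasURIEscaped = $sasURI -replace '&', '^&'

# Write both versions to Key Vault secrets
Set-AzKeyVaultSecret -VaultName $vaultName -Name 'ArtifactsSAS' `
    -SecretValue (ConvertTo-SecureString $sasURI -AsPlainText -Force)
Set-AzKeyVaultSecret -VaultName $vaultName -Name 'ArtifactsSASEscaped' `
    -SecretValue (ConvertTo-SecureString $sasURIEscaped -AsPlainText -Force)
```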

The actual template (note that in the CSE extension at the end I need single quotes around the URI or it once again tries to interpret it, so within the template expression you have to use two, i.e. '', to get one ' when it actually executes):
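A sketch of the CSE resource from the template (the parameter names, API version, and file names are illustrative, not the exact template):

```json
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('vmName'),'/CustomScriptExtension')]",
  "apiVersion": "2018-06-01",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Compute/virtualMachines', parameters('vmName'))]"
  ],
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.9",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "fileUris": [
        "[parameters('bootstrapScriptURI')]",
        "[parameters('azcopyURI')]"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "[concat('powershell.exe -ExecutionPolicy Bypass -File bootstrap.ps1 -ArtifactsURI ''', parameters('artifactsSASURI'), '''')]"
    }
  }
}
```

It is the escaped version of the SAS URI (with ^&) that gets passed in here, since the CSE ultimately runs the command via cmd.exe.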

I’ve finally got round to starting recording on my PowerShell Master Class. This will be a long course that will be free on YouTube. I’ll be adding modules over the next month and updating as needed. I’ll tweet (@NTFAQGuy) when I add new things.

It’s a quiet Sunday morning and what else is there to do but create a 40 minute video walking through what Infrastructure as Code and DevOps really means for an IT admin. I walk through Azure examples and cover key concepts and tools including Visual Studio Code, Git and Azure DevOps. Enjoy 🙂

Over the past two months I’ve been busy on some Data in Azure courses for Microsoft and Pluralsight. These are free; you just need to sign up for a free account on Pluralsight. They will shortly be available via Azure as well but are available now through Pluralsight.

I may have seemed very quiet over the past few months, but that’s because I’ve been working pretty much every night and weekend on 11 new courses for azure.com that will shortly be available via the site but are immediately available for free via Pluralsight. If you don’t have an account, simply sign up for a free one and you can then access my (and other people’s) tracks.

Recorded two new videos this week. The first covers how tokens work with Azure AD, and the second looks at conditional access (which can control the access to get those tokens for various scenarios).

A word of caution: I talk about terms of use in the second video. If you just enable this for ALL users it will break things that can’t accept it, for example the account Azure AD Connect uses to sync to Azure AD, so make sure you exclude accounts that can’t accept!

There are two models for where the authentication takes place:

Cloud authentication, where the authentication takes place against Azure AD

Federated authentication, where the authentication takes place against the federated service, for example using ADFS against Active Directory Domain Services

When using cloud authentication there are two ways to validate the password:

A hash of the password hash from AD is replicated to Azure AD and used for the cloud-based authentication. (No matter which authentication option is used, replicating the hash is recommended: it enables Azure AD to help detect leaked credentials and gives a “break the glass” fallback authentication option if your primary configuration fails.)

The password validation is done against Active Directory Domain Services using Pass-through Authentication (PTA). This works by writing the username/password (in an encrypted form for each PTA agent configured) to a service bus instance. The entries are then read by PTA agents deployed to Windows OS instances, which decrypt them, authenticate against ADDS, and respond with the result to complete the authentication request.

There are therefore three options for the authentication configuration:

Password hash

PTA

Federation

The order I have them in is generally my order of preference, but there are pros and cons to each (in addition to a few considerations) and I wanted to outline them briefly here.

Password Hash

Pro – Cloud scale/resilience since this is all native Azure AD with no other reliance during authentication

Pro – Provides breach replay protection and reports of leaked credentials, since the stored hash can be compared against credentials found on the dark web (visibility varies depending on Azure AD license; P2 provides the best insight). It also enables the ability to block banned passwords during password change. This benefit applies to any configuration provided the password hash is replicated; it does not have to be used for the authentication

Pro – As above, even if not using password hash for authentication, if it’s stored and the primary method (e.g. PTA or federation) fails (such as loss of connectivity to infrastructure), you can quickly switch to password hash-based authentication

Con – If the ADDS account has been locked out, has restricted logon hours set, or has an expired password, this will not impact the ability to log on via Azure AD

There is a delay for new accounts or changes to be reflected from AD to Azure AD: typically a 30-minute replication window (except for passwords, which replicate every 2 minutes). Plan for this delay when creating or changing accounts

You may hear it raised as a con that the authentication does not occur against on-premises DCs; however, because of the way tokens and specifically refresh tokens work, only the first authentication would hit AD anyway. After that, future access in the same session would not re-authenticate via PTA/federation, as the refresh token would be used to acquire additional access tokens. I will cover this in a separate video.

Pass-through Authentication (PTA)

Pro – If storing password hashes in Azure AD is a concern, with this method you don’t have to (however this is a risk-vs-reward discussion, and IMO the benefit of having the hash greatly outweighs any downside)

Pro – This is lighter than using federation and establishes an outbound 443 connection to Azure AD, requiring no inbound port exceptions

Con – Legacy authentication (pre-2013 Office clients) may not work with PTA

This is lighter than federation, and it is easy to deploy multiple PTA agents on-premises for scale and resiliency, but it does still require deployments

When users authenticate, their password is sent to Azure AD (encrypted via HTTPS) and then sent via PTA for authentication

Federation

Pro – 3rd-party MFA, Azure MFA Server, and custom policies/claim rules (outside of the Azure AD 3rd-party MFA integration like Duo). It is also possible to create a multi-site ADFS farm; coupled with some type of geo-DNS solution, you can then authenticate a user against their closest ADFS “presence”

Pro – Certificate-based authentication

Single sign-on if on an AD-joined machine on the corporate network. Password hash and PTA can match this with Seamless Single Sign-On enabled

The password never hits the cloud; it is sent to the federation server. With both of the other options the password is sent to the cloud

Con – A large amount of infrastructure is required (proxies, ADFS servers), especially once the other federations have moved to Azure AD. The OpEx cost is also a major consideration: think about the maintenance (managing servers, trusts, and certificates) and the staff to operate it all.

Note that for all scenarios I can still use features like Conditional Access. I try to start at the top of the options and work down only if needed. I really consider federation a legacy option that most organizations are moving away from, since Azure AD would be used for the actual application federations moving forward.

PowerShell 5.1 marks the last major update to the PowerShell built into Windows that we are likely to see. The future of PowerShell has gone the open-source path, with PowerShell 6 available via GitHub not just for Windows but also for multiple Linux distributions and macOS. This is made possible because PowerShell 6, or rather PowerShell Core 6.0, is built on .NET Core (which is cross-platform) instead of the Windows-exclusive .NET Framework.

The good news is PowerShell 6 can be installed alongside the PowerShell that is part of Windows/WMF. Download and install it from https://github.com/PowerShell/PowerShell/releases. Once installed, you can launch it by running pwsh.exe. If you look at $PSVersionTable you will see you have the Core PSEdition instead of the standard Desktop.

I recommend installing this and running it alongside the regular PowerShell to get used to it. The good news is most regular PowerShell will run, and if you execute Get-Module -ListAvailable you will see the built-in modules. For non-built-in modules you will need to check whether they are supported on PowerShell Core.
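For example, from a pwsh session:

```powershell
# Confirm you are running PowerShell Core rather than Windows PowerShell
$PSVersionTable.PSEdition      # Core (Windows PowerShell 5.1 shows Desktop)

# List the modules available to this PowerShell Core installation
Get-Module -ListAvailable
```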

Tools like Visual Studio Code can be used with both PowerShell 5.1 and PowerShell Core 6.0. Simply change the settings for Visual Studio Code to point at pwsh, e.g. add the following to your user settings (File – Preferences – Settings), changing the path to your specific PowerShell version. I added this just under the existing user setting (the comma goes after the existing line in the file).
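Something like this, using the PowerShell extension’s powershell.powerShellExePath setting (the first line is a stand-in for whatever setting you already have; adjust the version folder to match your install):

```json
{
    "editor.renderWhitespace": "all",
    "powershell.powerShellExePath": "C:\\Program Files\\PowerShell\\6.0.0\\pwsh.exe"
}
```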