Monday, February 18, 2019

SCIM is a protocol to, basically, sync users and groups (and other resources): you keep your users in an identity provider such as Azure AD, OneLogin or Okta, and it provisions new users, updates existing ones, and deprovisions the ones who have left your company to your third-party applications.

The Problems
Since Microsoft has the most prominent identity provider (Active Directory), they implemented the protocol slightly differently from the specification.
The first issue was that they were sending a list of values on PATCH requests for everything, including the scalar values. For example, let's say someone's displayName has changed: per the spec the new value is a plain string, but Azure AD sent it wrapped in a list, and the NuGet package was designed to receive exactly that list.

Meaning that nobody other than Azure AD could have called your application with PATCH if they had implemented the spec correctly!
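For reference, this is what a spec-compliant PatchOp body looks like per RFC 7644, next to the list-wrapped shape that (to the best of my recollection) Azure AD used to send and the package expects:

```json
{
  "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
  "Operations": [
    { "op": "replace", "path": "displayName", "value": "New Name" }
  ]
}
```

The non-compliant variant wraps the scalar in a list of objects:

```json
{ "op": "Replace", "path": "displayName", "value": [{ "value": "New Name" }] }
```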

When Azure AD calls our endpoint, it breaks without even reaching the part that is in our hands. The error is pretty simple:

Failed to update User 'xxx@yyy.zzz' in customappsso; Error: StatusCode: BadRequest Message: Processing of the HTTP request resulted in an exception. Please see the HTTP response returned by the 'Response' property of this exception for details. Web Response: {"Message":"The request is invalid."}. We will retry this operation on the next synchronization attempt.

It got worse!

In December 2018 the problem got worse, because they actually corrected Azure AD's behaviour (the change was publicly advertised) but didn't fix the NuGet package. So your code would just break, without you knowing what is wrong and why.

I created a support case with Microsoft and basically wrote everywhere possible, but after a month there was still no update to the package, so I decided to write a small fix for it.

The Solution

The code for the package is not open source, so there is no right way to fix this other than Microsoft changing their code to support the correct behaviour. A simple workaround, however, is to intercept every request using a middleware and rewrite it to the format the package expects (if it is not already in that shape).
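In my case the middleware was ASP.NET/OWIN code in C#; since the package isn't open source I won't reproduce it here, but the transformation itself is small enough to sketch in Python (the list-wrapped target shape is an assumption based on the old Azure AD behaviour described above):

```python
import json

def normalize_scim_patch(body: str) -> str:
    """Rewrite a spec-compliant SCIM PatchOp body into the list-wrapped
    shape the broken package still expects. The exact wrapped shape is
    an assumption based on the old Azure AD behaviour."""
    doc = json.loads(body)
    for op in doc.get("Operations", []):
        value = op.get("value")
        # Only wrap plain scalars; lists and complex objects pass through.
        if isinstance(value, (str, int, float, bool)):
            op["value"] = [{"value": value}]
    return json.dumps(doc)
```

Any scalar `value` gets wrapped back into the legacy list-of-objects form, while payloads that are already in that form are left untouched, so both old and new Azure AD behaviours end up in the single shape the package can parse.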

Introduction

The old way of accessing resources like Blob storage was a connection string. Later, services like Key Vault were added to Azure to store keys, secrets and certificates, letting developers do their job while protecting the production environments from the rogue ones :).

That being said, there are newer ways to give platforms access to each other. For example, from Azure Data Lake Analytics you might want to access Azure Data Lake Store, Blob storage and Data Factory, and to do that you need to grant it access to each of them.

Where to find values that we need?

Data Factory

For the Data Factory, if we were to follow the same approach we would need a service principal id and key created in Active Directory. It turns out we don't need to, because Azure has a newer concept called "managed identity", which was already created when we created the Data Factory through the portal. We just need to make sure that the Data Factory's identity has access to the Data Lake Store.
So the only parameters we need to set in the release pipeline are the subscription and the resource group.

* We could also use managed identities to connect to Blob storage on Azure.

Data Lake Analytics

In any case, if we do need a service principal (as for Data Lake Analytics), we can just create an app registration in Active Directory and make sure that app has access to the resources we need. Then we can read the id and key as below:
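The original screenshots for this step are gone, but conceptually the application (client) id and key end up as secret variables that the release pipeline injects and the deployment scripts read back. A minimal sketch in Python, with variable names that are purely my own invention:

```python
import os

def read_service_principal(env=os.environ):
    """Read the service principal id and key. The variable names here are
    hypothetical -- use whatever your release pipeline actually injects."""
    sp_id = env.get("ADLA_SERVICE_PRINCIPAL_ID")
    sp_key = env.get("ADLA_SERVICE_PRINCIPAL_KEY")
    if not sp_id or not sp_key:
        raise RuntimeError("Service principal credentials are not configured")
    return sp_id, sp_key
```

Failing fast with a clear error when the variables are missing saves you from a much more cryptic authentication failure further down the pipeline.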

Thursday, April 5, 2018

Introduction

It is going to be a short post :)

In Azure we usually use Application Insights or something similar to keep track of logs. If for some reason you also like to keep everything in log files, for example in App_data, you will have a lot of files in no time. Usually the size doesn't matter that much, since you have 100 GB on your disk, but it is annoying to have to look through thousands of files (usually one file per day per scaled instance).

Where are my files

You probably know this already, since you are trying to delete them :) but for people who searched for something else: let's say your app service is at test.azurewebsites.net/. You can access the disk using test.scm.azurewebsites.net/

Then you can click on "Debug console" in the top menu and click on CMD to see all folders and files. To get to the place where your files are hosted, just go to Site > wwwroot.

The simple solution

One simple solution is to delete the old files that you know you are not going to need. For example, you might have a policy saying you don't care about anything that happened more than 30 days ago.

How?

Well, it is basically Windows, so you can simply use Command Prompt commands to do stuff.

Use this command to see the list of files older than 30 days:

forfiles -s -m *.* -d -30 -c "cmd /c echo @file"

If you are happy with the results, change echo to del to delete those files:

forfiles -s -m *.* -d -30 -c "cmd /c del @file"
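forfiles is cmd-only; if you would rather run the same 30-day policy from a script (say, a WebJob), a rough Python equivalent looks like this (function names are my own):

```python
import time
from pathlib import Path

def files_older_than(root, days):
    """Yield files under `root` whose last-modified time is more than
    `days` days ago -- the same filter forfiles -d applies."""
    cutoff = time.time() - days * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path

def delete_older_than(root, days):
    """Delete every file matched by files_older_than."""
    for path in files_older_than(root, days):
        path.unlink()
```

Listing first and deleting second mirrors the echo-then-del workflow above: run `files_older_than` to review, then `delete_older_than` once you are sure.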

Wednesday, November 22, 2017

Introduction
All CMS systems log data versions and the person who changed them. Umbraco is of course not an exception in this matter :)
But how can we check and see who has made a change?

There are probably some add-ins for that, but what if you don't want to let other people see old data? What if you don't have time, or are not allowed to install a package on the production server?
Right?

It is pretty simple :). You just need to query the database.

Wrong text back in history

Let's say your client sends you a bug report, and when you check it you see that a text field is not correct. For the sake of the example, let's say you have several email subjects in your CMS, and you have a bug report saying someone received "Subject 2" instead of "Subject 1". What would you do?

Of course the first action is to check the CMS and then the code, but what if everything is right? Isn't it possible that for a period of time an editor changed it to the wrong value and corrected it after a while? How would you check that?

With this query you will find all property names in the history that "%Subject 2%" has been assigned to, together with the date and the person responsible. You can also filter on other fields like nodeId (page id), and you can check the other data type columns too (dataInt, dataDecimal, dataDate, dataNvarchar, dataNtext).
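The query itself didn't survive in this post, but it would be along these lines. To show it end to end, here it is run against a tiny in-memory SQLite mock; a real Umbraco install uses SQL Server and you would run only the SELECT, and the column names are assumptions based on the Umbraco 7 schema:

```python
import sqlite3

# Minimal mock of the Umbraco 7 tables involved in version history.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cmsPropertyType (id INTEGER, Name TEXT);
CREATE TABLE cmsPropertyData (propertytypeid INTEGER, versionId TEXT, dataNvarchar TEXT);
CREATE TABLE cmsDocument (versionId TEXT, updateDate TEXT, documentUser INTEGER);
CREATE TABLE umbracoUser (id INTEGER, userName TEXT);

INSERT INTO cmsPropertyType VALUES (1, 'Email subject');
INSERT INTO cmsPropertyData VALUES (1, 'v1', 'Subject 2');
INSERT INTO cmsDocument VALUES ('v1', '2017-10-01', 7);
INSERT INTO umbracoUser VALUES (7, 'some editor');
""")

# Find every historical value matching the wrong text, with the date
# of the change and the user who made it.
history = conn.execute("""
SELECT pt.Name, pd.dataNvarchar, d.updateDate, u.userName
FROM cmsPropertyData pd
JOIN cmsPropertyType pt ON pt.id = pd.propertytypeid
JOIN cmsDocument d ON d.versionId = pd.versionId
JOIN umbracoUser u ON u.id = d.documentUser
WHERE pd.dataNvarchar LIKE '%Subject 2%'
ORDER BY d.updateDate DESC
""").fetchall()
```

Each row gives you the property name, the offending value, when it was saved, and who saved it; swap `dataNvarchar` for `dataNtext` (or the other data columns) to search the other value types.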