http://www.rhizohm.net/ | Ghost 0.9 | Sat, 15 Dec 2018 06:57:39 GMT
http://www.rhizohm.net/adding-a-godaddy-ssl-certificate-to-an-azure-app-service/b6464fcd-e564-482e-b537-dbc494df6af5
Thu, 20 Jul 2017 05:23:39 GMT

This is confusing enough that it warrants a post so you don't waste a bunch of time, because the documentation is unclear. If you've hit the evil grayed-out "Export to PFX" experience, this post will save you.

The most important thing is to use IIS for everything.

Okay, so let's get started. First, you create your certificate order in GoDaddy. Then you need to generate a CSR and provide it to GoDaddy. You do that in IIS Manager: hit WIN+R and type inetmgr.

Next, click "Server Certificates" and select "Create Certificate Request" in the Actions pane. When filling this out, be sure that the common name you choose matches your domain. Also be sure to change the bit-length dropdown to 2048.

Once you generate the CSR, you copy/paste its contents into GoDaddy. They'll then generate a .zip file for you to download for IIS.

Here's the crux. Once you download the .zip file, it will contain a .crt file and a .p7b file. You go back to IIS Manager and choose "Complete Certificate Request."

(Do not try to directly import the files returned by GoDaddy. If you do, you won't be able to export the private key later, and you'll enter a world of pain and suffering until you realize your mistake. But you're reading this blog post, so either you already went awry or I just saved you.)

When you click "Complete Certificate Request", it will be looking for a .cer file, but you can change the dropdown and give it your .crt file instead. Ha! Once you do that, you finish the import and you'll see your cert in IIS Manager. At this point, you can right-click it, hit "Export", and it will let you export a .pfx with a password. Yeah!

At that point, you upload the .pfx to Azure and bind it to your service. And, if you want to redirect all http to https, you'll need to modify your web.config as follows, adding this as the first rule in your web.config:
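The rule itself didn't survive the migration of this post. For reference, the standard IIS URL Rewrite rule for forcing HTTPS looks something like this (the rule name is arbitrary; this is a sketch, not the post's original snippet):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect any http request to https; keep this as the first rule -->
      <rule name="Redirect to https" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="^OFF$" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```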

http://www.rhizohm.net/another-way-to-deny-verbs-in-iis/20b02caf-a3fd-462d-9ae7-deae1c0aba33
Thu, 18 May 2017 20:44:28 GMT

I recently needed to deny the TRACE verb for all incoming requests to IIS on a website running as a classic cloud service in Azure. First I tried following this post and adding the following:
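The snippets from this post were lost in migration. For reference, the usual web.config approach to denying a verb in IIS is request filtering, along these lines (a sketch, not the post's original content):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <verbs allowUnlisted="true">
        <!-- Explicitly deny TRACE for all requests -->
        <add verb="TRACE" allowed="false" />
      </verbs>
    </requestFiltering>
  </security>
</system.webServer>
```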

Once I knew that I could port all the content, I deployed Ghost on Azure by creating a new app service and walking through the UI steps to create a new Ghost deployment. Behind the scenes, what was happening was that a GitHub repo (https://github.com/AzureWebApps/Ghost-Azure) was being deployed.

Once it was deployed, I realized it was running on a pretty beefy instance, not the free one. The first thing I did was move it to the free instance.

I then forked the Azure Ghost repo to my own repo here: https://github.com/rhizohm/Ghost-Azure so that I could make mods and deploy. I then updated the deployment options in the Azure portal to point to my forked repo:

And, with that, every time I update the repo, a deployment kicks off to Azure. I grabbed another theme called Beautiful Ghost, which I also forked and put in the themes directory so I could mod it and update my deployment.

Of course, I had to change the CNAME to point to the new Azure website. I then had to change the pricing model to one that allowed custom domain names.

http://www.rhizohm.net/ghost_node_version_check-and-windows/5041537d-ca94-4a7a-bdcf-f46bf12f2aa0
Fri, 21 Apr 2017 03:40:00 GMT

I was trying to get Ghost running on my local Windows box and everything was going great until I got the following error:

ERROR: Unsupported version of Node
Ghost supports LTS Node versions: ^4.2.0 || ^6.9.0
You are currently using version: 7.7.2
This check can be overridden, see http://support.ghost.org/supported-node-versions/ for more info

So, basically, I'm running a version of Node newer than what Ghost supports. If you read their docs, you can override this by setting GHOST_NODE_VERSION_CHECK to false. But how do you do that in PowerShell? Here's how:

$env:GHOST_NODE_VERSION_CHECK = "false"
npm install --production

Boom! Good to go running Ghost locally!

http://www.rhizohm.net/syntax-for-copying-an-entire-directory-to-a-new-directory-in-azure-blog-storage-using-azcopy/324348e4-4ee0-4bee-a7cb-09e1c16c74ad
Wed, 20 Jul 2016 21:56:30 GMT

I wanted to copy every file from one directory to another using AzCopy. The crux is using the /S switch, which acts recursively. Here's the syntax:
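The command itself was lost in migration, but with the AzCopy of that era (the 7.x /Source:/Dest: syntax), a recursive blob-to-blob copy took roughly this shape; the account, container, directory, and key values below are placeholders, not the post's originals:

```
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/sourcedir ^
       /Dest:https://myaccount.blob.core.windows.net/mycontainer/destdir ^
       /SourceKey:<storage-key> /DestKey:<storage-key> /S
```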

http://www.rhizohm.net/passing-an-azure-staging-slot-uri-to-a-test-runner-in-visual-studio-team-services-release-management/6d2fc7ed-62a4-4db3-afea-8c92a5762023
Tue, 12 Jul 2016 23:34:55 GMT

I wanted to run our coded UI tests using Selenium against the staging URI during the continuous integration workflow for our cloud service web role. But the problem arose that I didn't know the dynamically generated URI that Azure creates when deploying to the staging slot.

I chatted with DevOps guru Thiago Almeida about this and he had the following suggestion:

Add a PowerShell script between the publish step and the test step. Here's what the PowerShell looked like:
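The script itself didn't survive the migration of this post. A sketch of what it likely did, using the classic Azure Service Management cmdlets (service_name is a placeholder, and the variable name matches the one consumed by the test task below):

```powershell
# Look up the deployment currently in the Staging slot of the cloud service
$service_name = "your-cloud-service-name"
$deployment = Get-AzureDeployment -ServiceName $service_name -Slot Staging

# Publish the staging URL as a release variable for later tasks to consume
$stagingUrl = $deployment.Url.AbsoluteUri
Write-Host "##vso[task.setvariable variable=StagingWebsiteURL]$stagingUrl"
```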

To use this, you’ll need to change the service_name to your Azure service name.

Then, set the Test Assemblies task's "Override TestRun Parameters" value to the following, so the base URL picks up the variable set by the PowerShell task above:

webAppUrl=$(StagingWebsiteURL)

Nice! It totally worked! Tip of the hat to Thiago Almeida for suggesting this!

http://www.rhizohm.net/why-did-net-core-build-a-dll-and-not-an-exe-/d1cb90a3-dc46-4663-8b4c-fd605795e5c6
Thu, 16 Jun 2016 21:06:36 GMT

Started playing with .NET Core and, typical developer, didn't read any docs first. Installed the tools, created a new project, and typed Console.WriteLine("Hello world."). And what did I get when I compiled? A .dll? Weird. There was a public static void Main(). What did I do wrong?

Well, my app didn't compile! Why? Because I was running on Windows Server 2012, which requires the win8-x64 runtime. I added that, so my runtimes node looked like:

"runtimes": {
  "win8-x64": {},
  "win10-x64": {},
  "osx.10.11-x64": {}
}

And, voilà, my code compiled and a directory was generated with an .exe. Hello world!

http://www.rhizohm.net/using-visual-studio-2015-project-templates-and-net-core/ffb0cf06-5e81-4b9f-a219-78aa439074c0
Thu, 16 Jun 2016 03:57:40 GMT

Okay, this wasn't entirely obvious: to get the nifty project templates for .NET Core applications, you have to install both Visual Studio 2015 Update 2 (https://www.visualstudio.com/en-us/news/vs2015-update2-vs.aspx) and the .NET Core SDK for Windows (https://www.microsoft.com/net/core). Then you'll see the templates for .NET Core when creating a new project.

http://www.rhizohm.net/resolving-the-msdeploypublish-error-when-publishing-azure-web-jobs-from-visual-studio-2015/52240eaa-f6e8-4a88-9183-0e0fedf911de
Fri, 22 Apr 2016 20:03:00 GMT

I had a rather old Azure web job that I always deployed by creating a .zip myself and uploading it to the portal manually. I decided to use the nice feature inside Visual Studio 2015 that publishes the project for you. But when I walked through the wizard and hit "publish" I received the following error:

Error MSB4057: The target "MSDeployPublish" does not exist in the project.

I realized a couple of things. First, you have to add the Microsoft.Web.WebJobs.Publish.1.0.11 package from NuGet. Even though VS has the publish-to-Web-Job action, it doesn't automatically add that package to your project.

Once you add it, you're still stuck: VS doesn't automatically modify your .csproj file to use this package. To do so, you need to manually edit your .csproj file. The easiest way is to unload the project in VS and add the import line right after the CSharp targets:
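The line itself was dropped during migration. Assuming the standard NuGet packages folder layout of that era, the import looks something like this, placed immediately after the Microsoft.CSharp.targets import (adjust the relative path to your packages folder):

```xml
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<!-- Add this line so the MSDeployPublish/WebJobs publish targets exist -->
<Import Project="..\packages\Microsoft.Web.WebJobs.Publish.1.0.11\tools\webjobs.targets"
        Condition="Exists('..\packages\Microsoft.Web.WebJobs.Publish.1.0.11\tools\webjobs.targets')" />
```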

Note that I got into this whole scenario because, for some mysterious reason, I could no longer successfully upload projects through either the legacy portal or the new one. So if you hit this issue, my recommendation is to publish through VS, which is much cleaner anyway.

http://www.rhizohm.net/using-ssl-in-the-azure-compute-emulator-with-a-cloud-service-web-role/ad4272e0-e09e-49ee-b0c3-ae1a025b1026
Tue, 08 Dec 2015 21:40:00 GMT

Here's what I had to do to get SSL working in VS2015 with a web role in a cloud service:

1. Make sure that the URL itself is 127.0.0.1 when launching IIS Express. If it is localhost, it won't work.

2. Make sure that you bind to the Azure dev fabric thumbprint that gets installed with the SDK. If you are using the 2.7 SDK, it is F8ACE24A36F93B006BFAF495F6C14FB827AC61A3.

3. Make sure that the dev fabric cert is in the right place. E.g., if you say it is in "MY", then make sure it is under Personal when you look in Certificate Manager. Or, if you say it is in Root, make sure you copy it there.

I explicitly remove the HTTP endpoint as well.
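For context, the HTTPS endpoint and certificate binding live in the service definition and configuration files. A minimal sketch (the endpoint and certificate names are mine; the thumbprint is the dev fabric one from step 2):

```xml
<!-- ServiceDefinition.csdef (inside the WebRole element) -->
<Endpoints>
  <InputEndpoint name="HttpsIn" protocol="https" port="443" certificate="DevFabricSSL" />
</Endpoints>
<Certificates>
  <Certificate name="DevFabricSSL" storeLocation="LocalMachine" storeName="My" />
</Certificates>

<!-- ServiceConfiguration.cscfg (inside the Role element) -->
<Certificates>
  <Certificate name="DevFabricSSL"
               thumbprint="F8ACE24A36F93B006BFAF495F6C14FB827AC61A3"
               thumbprintAlgorithm="sha1" />
</Certificates>
```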

http://www.rhizohm.net/using-nlog-for-diagnostic-logging-in-windows-azure-cloud-services-and-writing-the-logs-to-azure-table-storage/73085d83-b1a7-40fb-accc-276ae6a7dffa
Fri, 06 Nov 2015 22:57:00 GMT

I've never liked the default logging mechanisms in Azure for application event logging. Parsing the WADDiagnosticInfrastructureLogsTable is always such a hassle. Events I've written are mixed in with all the other events that Azure fires all the time, and everything in my event is jammed into a single field.

http://www.rhizohm.net/encrypting-and-decrypting-a-string-sent-as-a-querystring-parameter-using-c-/db2c740c-ecd8-4aa6-8b69-f37e88915844
Wed, 24 Jun 2015 14:39:34 GMT

I recently needed to encrypt/decrypt strings sent as querystring parameters over the wire. The use case happens to be allowing people to unsubscribe from a newsletter by clicking a hyperlink in their email. The server receives the email address as a querystring parameter. Obviously, I don't want to expose a public service that takes an unencrypted email. So I encrypt the email as part of the newsletter template, then decrypt on the web server.

Not rocket science, but worth going over how I did it, as there were a few gotchas.

First, I generated an RSA crypto key as XML, using the code found here:

public class MyCrypto
{
    RSACryptoServiceProvider rsa = null;
    string publicPrivateKeyXML;
    string publicOnlyKeyXML;

    public void AssignNewKey()
    {
        const int PROVIDER_RSA_FULL = 1;
        const string CONTAINER_NAME = "KeyContainer";

        CspParameters cspParams;
        cspParams = new CspParameters(PROVIDER_RSA_FULL);
        cspParams.KeyContainerName = CONTAINER_NAME;
        cspParams.Flags = CspProviderFlags.UseMachineKeyStore;
        cspParams.ProviderName = "Microsoft Strong Cryptographic Provider";
        rsa = new RSACryptoServiceProvider(cspParams);

        // Pair of public and private keys as an XML string.
        // Do not share this with other parties.
        publicPrivateKeyXML = rsa.ToXmlString(true);

        // Public key only, as an XML string. This is the string you can share.
        publicOnlyKeyXML = rsa.ToXmlString(false);
    }
}

Then, to encrypt the string in such a way that it could be passed as a querystring, I had to make some changes. First, I changed the encoding to UTF8 instead of ASCII. Second, I base64-encode the encrypted bytes. Third, I URL-encode the result:

public string EncryptAndEncode(string text)
{
    string encryptedText;
    using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(Resources.Resources.publicKeyXML);
        var bytes = rsa.Encrypt(Encoding.UTF8.GetBytes(text), true);
        encryptedText = Convert.ToBase64String(bytes);
    }
    return HttpUtility.UrlEncode(encryptedText);
}

On the decrypt side, because I'm receiving the string as a parameter on my controller method, the URL decoding is handled for me. So the decrypt looks like this:

private string Decrypt(string text)
{
    string decryptedText;
    using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(Resources.Resources.publicPrivateKeyXML);
        var bytes = Convert.FromBase64String(text);
        decryptedText = Encoding.UTF8.GetString(rsa.Decrypt(bytes, true));
    }
    return decryptedText;
}

http://www.rhizohm.net/installing-blog-engine-3-1-as-a-virtual-application-under-an-mvc-website-with-sql-server/b7b48515-5214-4ea7-b761-8e6cd9c39e76
Thu, 14 May 2015 15:42:31 GMT

Hit a bunch of gotchas doing this; figured I'd share with the world how I got it working.

1. RUN THE UPDATED SQL SCRIPT

There’s a piece of SQL script that isn’t in the setup script and can only be found if you download the source … or copy/paste from below :)

http://www.rhizohm.net/blogengine-net-provider-migration/7c7f8290-3fdb-47fb-b591-486832fdd908
Thu, 11 Dec 2014 22:14:05 GMT

I recently migrated this blog (which runs on BlogEngine.NET 2.8) from the XmlProvider to the DbProvider. I followed the instructions here: http://www.nyveldt.com/blog/page/blogenginenet-provider-migration which almost worked, but I had to make a couple of changes:

BlogService.Provider.FillCategories and BlogService.Provider.LoadSettings require that you pass the current blog, so those lines just got changed to:

BlogService.Provider.FillCategories(Blog.CurrentInstance)

BlogService.Provider.LoadSettings(Blog.CurrentInstance)

Then I had to manually update the GUID of the blog itself in SQL. Basically, after you run the DB create script and run the migration page, you grab the id of your old blog and update the be_blogs table:
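The UPDATE statement itself was lost in migration; against BlogEngine.NET's schema it would be along these lines (both GUIDs are placeholders, not the post's original values):

```sql
-- Point the migrated blog row back at the old blog's id so that
-- existing content keys line up. Replace both GUIDs with your values.
UPDATE be_Blogs
SET BlogID = 'old-blog-guid'
WHERE BlogID = 'new-blog-guid';
```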

http://www.rhizohm.net/simple-unit-tests-for-hdinsight-c-sdk/18f7538d-64c9-4b3e-b821-8d04fb7f6839
Fri, 03 Oct 2014 21:40:11 GMT

I have been working on a project using the .NET SDK for Hadoop. I wanted to add some unit tests to the project, so I ended up writing fakes for HDInsightClient, JobSubmissionClientFactory, and JobSubmissionClient. I was hoping I might be able to reuse some fakes from the SDK git repo, but it seems their unit tests actually stand up an instance of Hadoop. I didn't want to do that; I'm treating Hadoop as a black box, and I'm more interested in getting code coverage on all the C# code around the calls to Hadoop.

For my fake of IHDInsightClient, I only implemented CreateCluster() and DeleteCluster(), nothing fancy.

I had to make my own interface and wrapper to have a factory that would make a JobSubmissionClient (which is the same thing that the SDK did for its cmdlets):
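My wrapper code isn't shown in the migrated post, but its shape was roughly this. The interface and class names are mine; the sketch assumes the SDK's static JobSubmissionClientFactory.Connect call:

```csharp
// Hypothetical reconstruction: put the SDK's static factory behind an
// interface so a FakeJobSubmissionClient can be injected in unit tests.
public interface IJobSubmissionClientFactory
{
    IJobSubmissionClient Create(IJobSubmissionClientCredential credentials);
}

public class JobSubmissionClientFactoryWrapper : IJobSubmissionClientFactory
{
    public IJobSubmissionClient Create(IJobSubmissionClientCredential credentials)
    {
        // Delegate to the real SDK factory in production code;
        // tests register a fake implementation of the interface instead.
        return JobSubmissionClientFactory.Connect(credentials);
    }
}
```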

Finally, for my FakeJobSubmissionClient, I do need to fake the work that the job does in Hadoop. In this case, it writes a file to blob storage as a result of the Hive query it runs. So, since my fixture has a static reference to a fake blobClient, I was able to fake the work that Hadoop would do in my implementation of CreateHiveJob(HiveJobCreateParameters hiveJobCreateParameters).

With all these fakes, I then wired up dependency injection in my UnityContainer and I was good to go. And now I have much more confidence that future changes to this codebase won’t cause regressions.