I needed a console app that reads some inputs from an online Excel workbook, does some processing and then writes the results back to a different worksheet. Because I enjoy pain I decided to use the thinly documented new Microsoft.Graph client library. The sample code below assumes that you have a work or education Office 365 subscription.

// the excel unit tests seem to be the most useful documentation right now:
// https://github.com/microsoftgraph/msgraph-sdk-dotnet/blob/dev/tests/Microsoft.Graph.Test/Requests/Functional/ExcelTests.cs
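// What follows is a minimal sketch of the program rather than the full app:
// it assumes the Microsoft.Graph 1.x SDK, Book.xlsx and the worksheet and
// range names are placeholders, and GetAccessToken() stands in for whatever
// ADAL/MSAL code you use to get a delegated token.
//
// NuGet: Install-Package Microsoft.Graph
// (plus an auth library, e.g. Microsoft.IdentityModel.Clients.ActiveDirectory)

using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.Graph;

class Program
{
    static void Main(string[] args)
    {
        RunAsync().GetAwaiter().GetResult();
    }

    static async Task RunAsync()
    {
        string accessToken = await GetAccessToken();

        GraphServiceClient graph = new GraphServiceClient(
            new DelegateAuthenticationProvider(request =>
            {
                request.Headers.Authorization =
                    new AuthenticationHeaderValue("Bearer", accessToken);
                return Task.CompletedTask;
            }));

        // read the inputs from one worksheet...
        WorkbookRange input = await graph.Me.Drive.Root
            .ItemWithPath("Book.xlsx")
            .Workbook.Worksheets["Input"]
            .Range("A1:B10")
            .Request()
            .GetAsync();

        // ...do some processing on input.Values (a Json.NET JToken)...

        // ...then write the results back to a different worksheet
        await graph.Me.Drive.Root
            .ItemWithPath("Book.xlsx")
            .Workbook.Worksheets["Output"]
            .Range("A1:B10")
            .Request()
            .PatchAsync(new WorkbookRange { Values = input.Values });
    }

    static Task<string> GetAccessToken()
    {
        // hypothetical helper: acquire a delegated token for
        // https://graph.microsoft.com using the app registration below
        throw new System.NotImplementedException();
    }
}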

Paste the code into a new console project and then follow the instructions at the top to add the necessary NuGet packages. You'll also need to register an application at https://portal.azure.com/. You want a Native application and you'll need the Application ID and the redirect URL (just make up some non-routable URL for this). Under Required Permissions for the app you should add read and write files delegated permissions for the Microsoft Graph API.

Hope this saves you a few hours. Comment below if you need a more detailed explanation for any of the above.

I'm working on page speed and Google PageSpeed Insights is telling me that my PNGs are just way too large. Sadly .NET does not provide any way to optimize PNG images so there is no easy fix - just unmanaged libraries and command line tools.

I have an allergy to manual processes so I've lashed up some code to automatically find and optimize PNGs in my App_Data folder using PNGCRUSH. I can call CrushAllImages() to fix up everything or CrushImage() when I need to fix up a specific PNG. Code below:
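It boils down to shelling out to pngcrush and keeping the output only when it's actually smaller than the original. Here's a sketch of the two methods - it assumes pngcrush.exe sits next to the application, and error handling is omitted:

using System.Diagnostics;
using System.IO;

static class PngCrusher
{
    public static void CrushAllImages(string folder)
    {
        // find every PNG under the folder (App_Data in my case) and crush it
        foreach (string png in Directory.GetFiles(folder, "*.png", SearchOption.AllDirectories))
        {
            CrushImage(png);
        }
    }

    public static void CrushImage(string path)
    {
        string temp = path + ".crushed";

        ProcessStartInfo psi = new ProcessStartInfo
        {
            FileName = "pngcrush.exe",
            Arguments = string.Format("-reduce \"{0}\" \"{1}\"", path, temp),
            UseShellExecute = false,
            CreateNoWindow = true
        };
        using (Process process = Process.Start(psi))
        {
            process.WaitForExit();
        }

        // only keep the crushed file if it's smaller than the original
        if (File.Exists(temp) && new FileInfo(temp).Length < new FileInfo(path).Length)
        {
            File.Delete(path);
            File.Move(temp, path);
        }
        else if (File.Exists(temp))
        {
            File.Delete(temp);
        }
    }
}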

ASP.NET has a CssMinify class (and a JavaScript variant as well) designed for use in the bundling pipeline. But what if you want to have your CSS minified and inline? Here is an action that is working for me (rendered into a style tag on my _Layout.cshtml using @Html.Action("InlineCss", "Home")).
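Here's the shape of the action - a sketch that assumes a single stylesheet at ~/Content/site.css (swap in your own path) and manufactures the BundleContext and BundleResponse that CssMinify wants:

using System.Collections.Generic;
using System.Web;
using System.Web.Mvc;
using System.Web.Optimization;

public class HomeController : Controller
{
    [ChildActionOnly]
    public ActionResult InlineCss()
    {
        // load the raw stylesheet (single-file path is an assumption)
        string css = System.IO.File.ReadAllText(Server.MapPath("~/Content/site.css"));

        // run it through the same transform the bundling pipeline uses
        BundleContext context = new BundleContext(
            new HttpContextWrapper(System.Web.HttpContext.Current),
            BundleTable.Bundles,
            "~/Content/css");
        BundleResponse response = new BundleResponse(css, new List<BundleFile>());
        new CssMinify().Process(context, response);

        // the result is written into a style tag by _Layout.cshtml
        return Content(response.Content);
    }
}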

Note that I'm using this to inline CSS for this blog. The pages are cached so I'm not worried about how well this action performs. My blog is also basically all landing pages so I'm not worried about caching a non-inline version for later use - I just drop all the CSS on every page.

Did you know that Windows still has a vestigial finger command with just about nothing left to talk to? One of my New Year's resolutions is to bring finger back and unlike the stalled webfinger project I need to make some progress. Here's some C# to run your own personal finger daemon... you just need to create a .plan file in your home directory (haven't done that for a while):
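Here's a minimal sketch of the daemon - one hard-coded user, no error handling, and note that listening on port 79 needs an elevated prompt on Windows:

using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;

class FingerDaemon
{
    static void Main(string[] args)
    {
        // finger is TCP port 79
        TcpListener listener = new TcpListener(IPAddress.Any, 79);
        listener.Start();
        Console.WriteLine("Finger daemon listening on port 79...");

        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            using (StreamReader reader = new StreamReader(stream, Encoding.ASCII))
            using (StreamWriter writer = new StreamWriter(stream, Encoding.ASCII))
            {
                // the request is a single line: a user name, or blank for
                // everyone - ignored here because this daemon has one user
                reader.ReadLine();

                string plan = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.UserProfile),
                    ".plan");
                writer.Write(File.Exists(plan)
                    ? File.ReadAllText(plan)
                    : "No Plan.\r\n");
                writer.Flush();
            }
        }
    }
}

Test it with finger @localhost from another prompt.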

I've used the ZoneInfo (PublicDomain.ZoneInfo) project from CodePlex for quite a few years, especially in Catfood Earth. The project had rusted a little so I emailed the author (Mark Rodrigues) and he was kind enough to add me as a developer. I've just updated ZoneInfo with some of the local changes I'd made and a variety of patches from the CodePlex community. It now works with the latest IANA tzdata file, at least for the test cases I can run. Let me know if I missed something (and thanks Mark for letting me contribute back to this very helpful project).

I had a play with the Google Spreadsheets API recently to feed in some data from a C# application. The getting started guide is great and I was authenticated and adding dummy data in no time. But as soon as I started to work with real data I got an unhelpful server error telling me to wait a bit and try again.

The original sample code still worked so it didn't seem like the sort of temporary glitch the message suggested. After much hair tearing it turns out I was getting this error because I had used the literal column names from my spreadsheet. The API expects them to be lower case with spaces removed. If no columns match you get the unhelpful error above; if at least one column matches you get a successful insert with some missing data.
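For example, with the Google.GData.Spreadsheets client, a column whose header reads "First Name" has to be addressed as firstname. A sketch (service and listFeed come from the getting started guide's setup code, and the values are placeholders):

using Google.GData.Spreadsheets;

static void InsertRow(SpreadsheetsService service, ListFeed listFeed)
{
    ListEntry row = new ListEntry();
    // the header in the sheet reads "First Name" - the API wants "firstname"
    row.Elements.Add(new ListEntry.Custom() { LocalName = "firstname", Value = "Ada" });
    service.Insert(listFeed, row);
}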

Error messages are one of the hardest parts of an API to get right. If you're not very detailed then what seems obvious to you can leave your developers stumped.

Search on enter has been broken for a while in BlogEngine.NET (I'm running the latest 2.8.0.1 version). Finally got a chance to look at this today and there is a simple patch to the JavaScript to fix it. See the issue I just filed on CodePlex for details.

I've been using the Facebook Comments Box on this blog since I parted ways with Disqus. One issue with the Facebook system is that you won't get SEO credit for comments displayed in an iframe. They have an API to retrieve comments but the documentation is pretty light and so here are three critical tips to get it working.

The first thing to know is that comments can be nested. Once you've got a list of comments to enumerate through you need to check each comment to see if it has its own list of comments and so on. This is easy to handle with a small recursive function (see the sketch below).

Last but not least you want to include the comments in a way that can be indexed by search engines but not visible to regular site visitors. I've found that including the SEO list in a <noscript> tag does the trick, as in the sketch below.

Below is the source for an ASP.NET user control along the lines of what I'm using on the blog. You can see an example of the output on any page with Facebook comments. The code uses Json.NET.
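A simplified sketch of the control's code-behind - the graph.facebook.com/comments endpoint and the JSON field names (data, message, comments) are how the API behaves for me today, so check them against the current documentation:

using System.Net;
using System.Text;
using System.Web;
using System.Web.UI;
using Newtonsoft.Json.Linq;

public partial class FacebookCommentsSeo : UserControl
{
    protected override void Render(HtmlTextWriter writer)
    {
        string url = Request.Url.AbsoluteUri;
        StringBuilder seo = new StringBuilder();

        using (WebClient client = new WebClient())
        {
            // fetch the comments for the current page
            string json = client.DownloadString(
                "https://graph.facebook.com/comments/?ids=" + Server.UrlEncode(url));
            JObject root = JObject.Parse(json);
            if (root[url] != null)
            {
                AppendComments(root[url]["comments"], seo);
            }
        }

        // noscript keeps the comment text indexable without showing it
        // to regular visitors
        writer.Write("<noscript>");
        writer.Write(seo.ToString());
        writer.Write("</noscript>");
    }

    private static void AppendComments(JToken comments, StringBuilder seo)
    {
        if (comments == null || comments["data"] == null) { return; }
        foreach (JToken comment in comments["data"])
        {
            seo.Append("<p>");
            seo.Append(HttpUtility.HtmlEncode((string)comment["message"]));
            seo.Append("</p>");
            // comments can be nested - recurse into any replies
            AppendComments(comment["comments"], seo);
        }
    }
}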

I messed up the first upgrade attempt because the updater utility updates the source folder (containing the newly downloaded 2.8 code) instead of the destination folder (containing the current version of your blog). This is a little odd and the result was that I uploaded an unchanged instance and then embarrassingly complained that the Facebook bug hadn't been fixed. It had been fixed, just not in the folder I was expecting. I probably didn't pay enough attention to the instructional video.

Having got that out of the way I discovered that new posts were appearing with a bad link (to /.aspx instead of /blog-title.aspx). I rarely post using the editor as I have a home-grown post by email service running. After a bit of digging it turns out that prior to 2.8 you could leave the slug empty when creating a post, but now this results in the bad link. Luckily there isn't much effort required to fix this, you just need to set the slug before saving the new post:
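Something like this - a sketch that assumes the Utils.RemoveIllegalCharacters helper from BlogEngine.Core (the same one used elsewhere to build slugs), with placeholder title and content:

using BlogEngine.Core;

static void CreatePost(string title, string content)
{
    Post post = new Post();
    post.Title = title;
    post.Content = content;
    // 2.8 no longer generates a slug for you - an empty slug gives /.aspx
    post.Slug = Utils.RemoveIllegalCharacters(post.Title);
    post.Save();
}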

In the middle of playing with this my live site died and started returning a 500 error. No amount of uploading the working local copy would fix this. Happily Server Intellect have outstanding support and restored a working backup for me in the middle of the night. Thanks chaps!