Series Overview

Custom Error Handling Attribute

Registered in FilterConfig.RegisterGlobalFilters and bound in DiagnosticModule, AppErrorHandlerAttribute reports the unhandled exception and sets the error view displayed to the end user. An enableErrorPages appSetting controls whether any of this is done; turning it off can be desirable for local debugging or on a dev web server.
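The filter's code isn't reproduced here; the following is a hypothetical sketch of its shape. The member names (Report, SetErrorViewResult's internals) are assumptions, not the post's actual implementation.

```csharp
using System;
using System.Configuration;
using System.Web.Mvc;

public class AppErrorHandlerAttribute : FilterAttribute, IExceptionFilter
{
    public void OnException(ExceptionContext filterContext)
    {
        // Respect the appSetting so local debugging can see the raw error.
        bool enabled;
        bool.TryParse(ConfigurationManager.AppSettings["enableErrorPages"], out enabled);
        if (!enabled || filterContext.ExceptionHandled) return;

        // Report (log + email) the unhandled exception, then show the error view.
        // ErrorReporter is covered in the next section; Report is an assumed name.
        new ErrorReporter().Report(filterContext.Exception);
        SetErrorViewResult(filterContext);
        filterContext.ExceptionHandled = true;
    }

    private static void SetErrorViewResult(ExceptionContext filterContext)
    {
        filterContext.HttpContext.Response.StatusCode = 500;
        filterContext.Result = new ViewResult { ViewName = "Error" };
    }
}
```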

Logging and Reporting the Error

In the same namespace, the ErrorReporter class generates an error report, then logs and emails it.
The overload with a customActivityMessage parameter is generally used with handled exceptions that are still worth reporting.
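A sketch of what that class surface might look like; the method signatures and helper names here are assumptions rather than the post's actual code.

```csharp
using System;

public class ErrorReporter
{
    // Unhandled exceptions: build the report, then log and email it.
    public void Report(Exception ex)
    {
        Report(ex, null);
    }

    // Handled exceptions still worth reporting can add context about
    // the activity that was under way when the exception occurred.
    public void Report(Exception ex, string customActivityMessage)
    {
        string report = BuildReport(ex, customActivityMessage);
        LogReport(report);
        EmailReport(report);
    }

    private string BuildReport(Exception ex, string customActivityMessage)
    {
        // Would delegate to the diagnostic info classes described below.
        return (customActivityMessage ?? "Unhandled exception") + ": " + ex;
    }

    private void LogReport(string report) { /* write plain text version to the DB */ }
    private void EmailReport(string report) { /* send the HTML version */ }
}
```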

Building the Error Report

Various diagnostic info classes are responsible for building different diagnostic "sub-reports".
Each inherits from DiagnosticInfoBase, which is a glorified StringBuilder with functionality to build both a plain text version of the report (logged to the DB) and an HTML-formatted version (used for emails).

The base class has safe appending and formatting functionality and ensures that any error in generating a part of the report doesn't stop the whole process.
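The safe-append idea can be sketched as follows: a failure producing one value must not abort the rest of the report. Member names here are assumptions, not the post's actual base class.

```csharp
using System;
using System.Text;

public abstract class DiagnosticInfoBase
{
    protected readonly StringBuilder Text = new StringBuilder();

    protected void SafeAppendLine(string label, Func<string> valueProducer)
    {
        try
        {
            Text.AppendFormat("{0}: {1}", label, valueProducer()).AppendLine();
        }
        catch (Exception ex)
        {
            // Record the failure in place of the value and carry on.
            Text.AppendFormat("{0}: <error: {1}>", label, ex.Message).AppendLine();
        }
    }

    public string Build()
    {
        Text.Length = 0;
        AppendSections();
        return Text.ToString();
    }

    protected abstract void AppendSections();
}

// Example sub-report: one value throws, the other still makes it into the report.
public class SampleInfo : DiagnosticInfoBase
{
    protected override void AppendSections()
    {
        SafeAppendLine("Machine", () => Environment.MachineName);
        SafeAppendLine("Broken", () => { throw new InvalidOperationException("boom"); });
    }
}
```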

The Error View

At the end of AppErrorHandlerAttribute, SetErrorViewResult renders the Views\Shared\Error.cshtml page. Content such as text, images and styles varies according to the HTTP status code. Separate error pages might have more advantages if there were a larger number of differences; a single page was less work here.

That code makes me itch a bit and could be done better, but there is a reason for it. If you're wondering why not just use custom error pages, the answer is that doing so for a 404 ends up producing a 301 redirect that then 404s on the custom not-found page. For SEO purposes that is usually considered bad practice. If you don't have a public site or as much SEO concern, that may be acceptable, but in this case it wasn't.

Error Controller

The error controller sets and logs the response status codes and returns the error view. For the Gone action, usually there'd be routes defined for legacy URLs that would direct to that action.
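A minimal sketch of such a controller, assuming action names and a shared Error view; the logging call is only indicated in a comment.

```csharp
using System.Web.Mvc;

public class ErrorController : Controller
{
    public ActionResult NotFound()
    {
        return ErrorResult(404);
    }

    // Routes for legacy URLs would direct here to return 410 Gone.
    public ActionResult Gone()
    {
        return ErrorResult(410);
    }

    private ActionResult ErrorResult(int statusCode)
    {
        Response.StatusCode = statusCode;
        Response.TrySkipIisCustomErrors = true; // serve our view, not the IIS error page
        // Log the status code and requested URL here (e.g. via the app's Log class).
        return View("Error");
    }
}
```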

It was the eleventh hour before a web app was to go live and there was nothing in place for error handling or logging. You're shocked I'm sure (that's my sarcastic voice). Normally with web apps in the past I've used ELMAH for unhandled error logging and NLog with text file targets for other logging.

ELMAH works great but in this case I was told it was pulled for some reason. With no time to find out the story there, I threw together something to capture basic error details and send an email. Later that grew and was customized, and ELMAH never returned, though perhaps it should have. Still, there are benefits to doing this yourself, including more customization, less bloat, and less hunting through sometimes-lacking documentation.

On the logging front I still wanted to use NLog but text file logging was no longer a fit. This app had a few instances behind a load balancer and central logging was needed plus it needed a higher volume of logging in some troublesome workflows, more so than the occasional errors or warnings here and there. I had not used NLog with SQL Server before but this seemed like the perfect time to do so.


Setup and Configuration

I began by installing the NLog NuGet package, along with the NLog.Extended package for ASP.NET-specific information, into the ASP.NET MVC 4 project. The NLog.Config package is useful for a standalone NLog config file, but for web apps with existing config transforms I find embedding the configuration in Web.config easier.

In web.config under configuration/configSections, the NLog section is defined:

<section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />

Under <configuration> my initial NLog config looked like the below, using the async attribute on targets to make each target work asynchronously:
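The original config isn't reproduced here; the following is a reconstructed sketch of what such a Web.config-embedded NLog section might look like. The target names, connection string name (ChinookLogs), sproc name (InsertLogEvent), and layout renderer names are assumptions and should be verified against the NLog/NLog.Extended documentation.

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <!-- async="true" wraps every target in an AsyncWrapper -->
  <targets async="true">
    <target xsi:type="Database" name="db"
            connectionStringName="ChinookLogs"
            commandText="exec dbo.InsertLogEvent @level, @logger, @message, @code, @url, @userName">
      <parameter name="@level" layout="${level}" />
      <parameter name="@logger" layout="${logger}" />
      <parameter name="@message" layout="${message}" />
      <parameter name="@code" layout="${event-context:item=code}" />
      <parameter name="@url" layout="${aspnet-request:serverVariable=URL}" />
      <parameter name="@userName" layout="${aspnet-user-identity}" />
    </target>
    <target xsi:type="Debugger" name="debugger" layout="${level}: ${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="db,debugger" />
  </rules>
</nlog>
```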

The Database

Our DBA set up a separate database for logging data so DB properties could be set differently; log data and transactional app data are different beasts. The sample app for this post is written against the Chinook sample database, so I named the log database Chinook.Logs. The primary table script follows, minus the indexes, SET statements and the like.
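The original table script isn't shown here; the following is a reconstructed sketch with the columns the post discusses. The exact column list, types and the partition scheme are assumptions.

```sql
-- Sketch of the primary log table (indexes, SET statements, and the
-- partition scheme/function omitted, as in the post).
CREATE TABLE dbo.LogEvent
(
    LogId        bigint IDENTITY(1,1) NOT NULL,
    PartitionKey int           NOT NULL,  -- day of week; used for SWITCH-based archiving
    EventDate    datetime      NOT NULL,
    EventLevel   nvarchar(20)  NOT NULL,  -- nvarchar: NLog sends Unicode parameters
    Logger       nvarchar(200)     NULL,
    [Message]    nvarchar(max)     NULL,
    Code         nvarchar(50)      NULL,  -- custom error codes / "tagging"
    Url          nvarchar(2000)    NULL,
    UserName     nvarchar(100)     NULL
);
```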

The table is pretty straightforward, with perhaps a couple of exceptions. The Code field was to be used for custom error codes (HTTP or otherwise) or for "tagging" log statements. PartitionKey was added by our DBA mainly for archiving log data but also for efficient querying by day; more on this to come. Finally, I originally had the string columns as varchar, but we found that NLog sent nvarchar parameters, so a conversion was happening with each insert that had a slight performance impact.

The insert sproc grabs the day of the week and inserts that as the partition key and the rest of the data just passes through.
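The scripts themselves aren't reproduced; below is a sketch of both the insert procedure and a daily partition-switch procedure consistent with the description that follows. Names, the parameter list, and the switch mechanics are assumptions.

```sql
-- Insert sproc: derive the partition key, pass the rest through.
CREATE PROCEDURE dbo.InsertLogEvent
    @EventLevel nvarchar(20),
    @Logger     nvarchar(200),
    @Message    nvarchar(max),
    @Code       nvarchar(50),
    @Url        nvarchar(2000),
    @UserName   nvarchar(100)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT dbo.LogEvent (PartitionKey, EventDate, EventLevel, Logger, [Message], Code, Url, UserName)
    VALUES (DATEPART(weekday, GETDATE()), GETDATE(),
            @EventLevel, @Logger, @Message, @Code, @Url, @UserName);
END
GO

-- Daily archiving job: switch tomorrow's (i.e. the oldest) day out to an
-- identically-structured archive table. Assumes the target partition on
-- LogEvent_Switched was already emptied.
CREATE PROCEDURE dbo.SwitchLogPartition
AS
BEGIN
    DECLARE @p int = DATEPART(weekday, GETDATE()) + 1;
    IF @p > 7 SET @p = 1;
    ALTER TABLE dbo.LogEvent SWITCH PARTITION @p
        TO dbo.LogEvent_Switched PARTITION @p;
END
```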

The partition-switch procedure was scheduled to run daily. So if today is Friday, datePart(weekday, getdate()) returns 6; +1 is Saturday, so all of last Saturday's log records get automagically switched over to the LogEvent_Switched table. This leaves the last 6 days of log records in LogEvent and the 7th day in LogEvent_Switched. If you don't have the Enterprise edition and can't use partitioning, regular deletes may work, but they can be problematic when deleting a large number of rows during frequent inserts or selects.

Finally permission to execute the sproc was granted and then it was back to .net land.

The functionality of the Log class is provided mostly by inheriting from NLog's Logger class.
One custom method is defined to allow passing custom log properties via an anonymous object, much as you would elsewhere in ASP.NET MVC (new { code = "404" }).

In some cases we can't easily do constructor injection - for example, in the application class of Global.asax.cs or in a filter attribute.
For these exception cases AppLogFactory is used to create an instance of ILog.
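A sketch of how these pieces might fit together; the member names, the ILog interface shape, and the anonymous-object property handling are assumptions layered over NLog's Logger and LogManager APIs.

```csharp
using NLog;

public interface ILog { /* logging members elided */ }

public class Log : Logger, ILog
{
    // Attach custom properties, e.g. log.WithProperties(LogLevel.Warn,
    // "missing page", new { code = "404" }); properties flow to
    // ${event-context:item=...} layout renderers.
    public void WithProperties(LogLevel level, string message, object properties)
    {
        var logEvent = new LogEventInfo(level, Name, message);
        if (properties != null)
        {
            foreach (var prop in properties.GetType().GetProperties())
                logEvent.Properties[prop.Name] = prop.GetValue(properties, null);
        }
        Log(logEvent);
    }
}

// For spots where constructor injection isn't practical
// (Global.asax, filter attributes).
public static class AppLogFactory
{
    public static ILog Create(string name)
    {
        return (ILog)LogManager.GetLogger(name, typeof(Log));
    }
}
```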

Troubleshooting Failed Logging

When setting up logging initially or making changes later, logging might not work.
Usually I'd first fire up SQL Server Profiler and watch for calls to the log insert sproc.
If I didn't see them, I'd know it was probably a configuration issue. In the nlog tag in the
config file I set throwExceptions="true" internalLogLevel="Debug" internalLogFile="${basedir}/NlogInternal.log",
but in practice that rarely seemed to do much; temporarily turning off async can also help when troubleshooting.

First I'd check, when creating the logger, whether the log levels were enabled as expected from the config.

Next I'd check the configuration in code with a small class, also useful for changing config at runtime.
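A small helper along those lines might look like the below; the class and method names are assumptions, but the NLog configuration APIs used are standard.

```csharp
using System.Linq;
using System.Text;
using NLog;
using NLog.Config;

public static class NLogConfigInspector
{
    // Dump the active targets and rules for inspection in the debugger or a page.
    public static string Describe()
    {
        LoggingConfiguration config = LogManager.Configuration;
        if (config == null) return "No NLog configuration loaded.";

        var sb = new StringBuilder();
        foreach (var target in config.AllTargets)
            sb.AppendLine("Target: " + target.Name + " (" + target.GetType().Name + ")");
        foreach (var rule in config.LoggingRules)
            sb.AppendLine("Rule: " + rule.LoggerNamePattern + " => "
                + string.Join(", ", rule.Levels.Select(l => l.Name)));
        return sb.ToString();
    }

    // Example runtime change: raise verbosity without touching config files.
    public static void EnableTrace()
    {
        foreach (var rule in LogManager.Configuration.LoggingRules)
            rule.EnableLoggingForLevel(LogLevel.Trace);
        LogManager.ReconfigExistingLoggers();
    }
}
```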

Evaluating Async Logging

Because I'm paranoid, OCD, detailed and curious, I wanted to verify the async behavior and compare it to non-async.
On the home page I added a link to generate batch log records, corresponding to this controller action.
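The action could be sketched like this (action and parameter names are assumptions). With async targets the elapsed time should stay small even for large counts, since writes are queued and flushed on a background thread.

```csharp
using System.Diagnostics;
using System.Web.Mvc;
using NLog;

public class HomeController : Controller
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    // e.g. /Home/GenerateLogRecords?count=1000
    public ActionResult GenerateLogRecords(int count = 100)
    {
        var watch = Stopwatch.StartNew();
        for (int i = 1; i <= count; i++)
        {
            Log.Info("Batch log test record {0} of {1}", i, count);
        }
        watch.Stop();
        return Content(string.Format("Logged {0} records in {1} ms",
            count, watch.ElapsedMilliseconds));
    }
}
```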

Evaluating Log Queries

Usually when inspecting log records I'd select just those columns I'm interested in and order by LogId DESC.

Once enough log data was there it was helpful to query on the columns that would typically be filtered on and evaluate the execution plan and response times.
From there indexes were added as needed for columns such as EventLevel, URL, UserName, etc. and execution times compared afterwards.
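As a sketch, with column names assumed from the table described earlier:

```sql
-- Typical inspection query: recent events, selected columns only.
SELECT TOP 100 LogId, EventDate, EventLevel, [Message], Url, UserName
FROM dbo.LogEvent
WHERE EventLevel = 'Error'
ORDER BY LogId DESC;

-- Example index added after reviewing the execution plan.
CREATE NONCLUSTERED INDEX IX_LogEvent_EventLevel
    ON dbo.LogEvent (EventLevel)
    INCLUDE (EventDate, Url, UserName);
```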

Preparing for Deployment

In Web.Release.config, NLog config transformations change the log level from Trace to Info, remove debugger output, and tone down NLog internal issue reporting.

SlowCheetah transform preview (not shown).
Note that removing the nlog xsd attribute in the root web.config (and all transforms) was needed for the transformation to work correctly. Also at one point I tried a replace transform to replace the entire nlog tag but found it didn't work correctly and it created too much duplicate content between configurations.
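A sketch of what such a transform might look like, using standard XDT syntax and assuming the target and rule names from earlier; this also relies on the xmlns having been removed from the nlog element as just noted, since XDT cannot otherwise match it.

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <nlog internalLogLevel="Error" throwExceptions="false"
        xdt:Transform="SetAttributes(internalLogLevel,throwExceptions)">
    <targets>
      <!-- drop debugger output in Release -->
      <target name="debugger" xdt:Transform="Remove" xdt:Locator="Match(name)" />
    </targets>
    <rules>
      <!-- raise the minimum level from Trace to Info -->
      <logger name="*" minlevel="Info"
              xdt:Transform="SetAttributes(minlevel)" xdt:Locator="Match(name)" />
    </rules>
  </nlog>
</configuration>
```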

After creating an ASP.NET MVC 4 web app I noticed in the Visual Studio Package Manager that there were several NuGet packages that had updates available.
My first thought was just to update them from the dialog while I was there. That has a few problems though:

You have to update packages one at a time

You cannot specify any of the options from the dialog that you could from the console

Updates will need to be performed again later, and why repeat a manual process?
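By contrast, the console form supports options the dialog doesn't; for example (the package names here are just illustrative):

```powershell
# Update a single package while ignoring its dependencies
Update-Package jQuery.Validation -IgnoreDependencies

# "Safe" update: only patch-level updates within the same major.minor version
Update-Package jQuery -Safe
```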

Problems Updating Packages

Whether updating NuGet packages from the dialog or the Package Manager Console, there were a couple of issues I ran into:

Modified files - I received some warnings such as "Skipping 'Scripts\jquery.validate.js' because it was modified" so I was not getting full updates.
I did not actually modify any of the files NuGet indicated were modified but I guess Visual Studio touched them in some manner.

No -reinstall or -force options - This relates to the modified files problem but at times it is necessary to uninstall and reinstall NuGet packages to
update them correctly. I've requested a -reinstall option before and that is
tracked as an issue to address. Additionally a -force option is coming to Update-Package
in NuGet 1.9 so that could help later.

Source control - We use TFS source control (insert look of pain here) and it freaked out at times when updating packages. This was worked around by breaking up some check-ins.

Package dependency conflicts - Sometimes one package cannot be updated because another package depends on a different version of the same package.
I ran into this problem trying to upgrade Json.Net but could not because a RavenDB package depended on an older version. This isn't something that can be automated per se but something
to be aware of.

Package Awareness

There are a good number of packages used by default in an ASP.NET MVC 4 web app. I wanted to ensure I understood the packages in use
and the dependencies between them. One tool that I had forgotten about is the NuGet Package Visualizer which
came in handy for visualizing the relationships.

Creating a script

To address these items I created a PowerShell script to be included in the web project with functions that can be called from the Package Manager Console to automate some of the
NuGet work.

The script below is a first-pass attempt that I'm sure will be revised later, but it seems to be working so far. On CodePlex I previously saw a more generic PowerShell script that more forcibly updates all packages listed in packages.config. I can't seem to find it at the moment, but it's certainly a valid option. For now I want to be very explicit about each package to update.
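The original script isn't reproduced here; the following is a hypothetical sketch of its shape, intended to run in the Package Manager Console. The function names, package list, and project name are all assumptions.

```powershell
function Update-WebPackages {
    param([string]$ProjectName = "MyWebApp")  # hypothetical project name

    # Be explicit about each package rather than blindly updating everything.
    $packages = "jQuery", "jQuery.Validation", "jQuery.UI.Combined", "Modernizr", "NLog"

    foreach ($package in $packages) {
        Write-Host "Updating $package..."
        Update-Package -Id $package -ProjectName $ProjectName
    }
}

function Reinstall-WebPackage {
    param([string]$Id, [string]$ProjectName = "MyWebApp")

    # Workaround for "skipping modified file" issues: uninstall then install.
    Uninstall-Package -Id $Id -ProjectName $ProjectName -Force
    Install-Package -Id $Id -ProjectName $ProjectName
}
```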