Tuesday, August 19, 2008

This is my second blog post today - a first for me! I just noticed that Gabriel Schenker did a really nice write-up on Tarantino Database Migrations. As with my blogging, I kind of assumed Headspring would be the only user of Tarantino when I open-sourced the project on Google Code. It's very flattering to see someone like Gabriel take the time out of his day to document the project.

As Jeffrey Palermo mentioned, we're always looking for contributors. In particular, an Oracle guru that could add Oracle support would be very welcome :). However, if we don't get a contributor, I plan to add Oracle support before the end of the year to support my current project.

If you're like me, you're lazy. And last night, while on my couch watching the Olympics, I attempted to do a simple thing - compile a solution containing a setup project (.vdproj) from NAnt. After all, when you're distributing a Windows Forms application to a non-technical audience, it's natural to want an installer to bootstrap the .NET Framework, copy the application to the Program Files directory, and create icons on the desktop and Start menu. To enhance efficiency through automation, it's also natural to want to build the software on a build server instead of manually on a developer workstation. Let's just say it took longer than a few heats of the 100m butterfly to complete this simple task. Hopefully I can decrease that learning curve for you, my fellow developer.

My first thought was, "I'll use msbuild to compile the solution". It's completely logical, but unfortunately impossible: .vdproj files can only be compiled by Visual Studio. After about an hour of scouring the Internet in denial and throwing up in my bathroom, I accepted this *enormous* limitation. Apparently this issue has been known to Microsoft since at least 2004, but they haven't had time to address it. Most likely this time was instead spent pursuing the Microsoft dream. By the way, in the Microsoft dream a non-technical website administrator drags their entire SQL Server database onto the ASP.NET "design surface" and selects the publish option from the build menu.

The next problem that arises is: how do I call Visual Studio from NAnt? What executable and command-line arguments do I use? How do I dynamically determine where Visual Studio is installed? Below is some NAnt code that answers these questions:
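Something along these lines does the trick (a sketch - MyApp.sln is a placeholder for your own solution file, and the registry key is Visual Studio 2008's install key):

```xml
<!-- Read the Visual Studio 2008 install directory out of the registry.
     The InstallDir value already ends with a trailing backslash. -->
<readregistry
    property="devenv.dir"
    key="SOFTWARE\Microsoft\VisualStudio\9.0\InstallDir"
    hive="LocalMachine" />

<!-- devenv.com is the console-friendly flavor of devenv.exe; it writes
     build output to stdout, so it plays nicely with a build server. -->
<exec program="${devenv.dir}devenv.com">
    <arg value="MyApp.sln" />
    <arg value="/Build" />
    <arg value="Release" />
</exec>
```

For Visual Studio 2005, swap the 9.0 in the key for 8.0.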

There are a couple of ways of skinning this cat, but reading the Visual Studio 2008 path out of the registry actually turns out to be the simplest solution. I hope this helps somebody else out some day.

Thursday, August 14, 2008

I saw a TV commercial recently that reminded me of what it's like for developers learning advanced object-oriented development techniques. The commercial showed a golfer standing in front of a golf ball, club in hand. Before he swings, he thinks about all of the tips his friends have given him about golf:

He's thinking about so many things that instead of hitting the ball, he releases the club in his backswing and nails his caddy - after all, he forgot to grip the club. I couldn't find the clip on YouTube, but I found something very similar to give you an idea:

I had a similar experience when learning to water-ski on one ski. I didn't get on top of the water until I realized the only thing I needed to remember was to push hard with my back foot. That was it - when the boat engine roared, I pushed with my back foot. Voila!

When I began working for Jeffrey Palermo, I struggled to find a common theme to guide my development decision making. The list of rules to follow when writing maintainable, object-oriented software is immense:

For developers dedicated to excellence (but unfamiliar with advanced OOP), the list above can basically prevent them from ever writing any code. In my case, what will Jeffrey think of the code I commit? How do I *know* I'm not writing the legacy code that will torture my team in the future? Is there a simple rule I could follow (like pushing with my back leg) that I can focus on to be successful?

Fortunately, I found just such a rule to address all of the concerns listed above. It is... drum roll... 100% Unit Test Coverage. Specifically:

1. Unit test every meaningful input combination
2. In Onion Architecture terms, test the behavior of only one domain object or service at a time
3. Test all delegation from one class to another
4. Keep your tests small and readable

Voila!

By testing all delegation logic, you're almost forced to define an interface for each of your services so you can mock them when unit testing the classes that depend on them.
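To make that concrete, here's a hypothetical sketch (in practice you'd likely reach for a mocking library like Rhino Mocks, but a hand-rolled mock shows the mechanics - all of the type names here are made up for illustration):

```csharp
using System;

// The service being delegated to gets an interface...
public interface IEmailSender
{
    void Send(string to, string subject);
}

// ...so the class that depends on it can be tested in isolation.
public class OrderService
{
    private readonly IEmailSender _emailSender;

    public OrderService(IEmailSender emailSender)
    {
        _emailSender = emailSender;
    }

    public void PlaceOrder(string customerEmail)
    {
        // ...domain logic would go here...
        _emailSender.Send(customerEmail, "Order Confirmation");
    }
}

// A hand-rolled mock records the delegation so a test can verify it.
public class MockEmailSender : IEmailSender
{
    public string LastTo;
    public string LastSubject;

    public void Send(string to, string subject)
    {
        LastTo = to;
        LastSubject = subject;
    }
}
```

A unit test then constructs OrderService with the mock, calls PlaceOrder, and asserts that Send was called with the expected arguments - the delegation itself is the behavior under test, without any real email ever going out.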

By testing every meaningful input combination, you're forced to break up code into small, simple classes. Methods with high cyclomatic complexity will create too many input combinations. A class with 16 input combinations can often be broken up into two classes with four input combinations each. You just cut the number of test cases in half (4 + 4 = 8, versus 4 * 4 = 16).

By unit testing your classes, you prevent any dependencies on infrastructure. For example, you can't include code like DateTime.Now directly in your classes. Otherwise, you'll need to reset your system clock to produce a predictable unit test result - yuck.
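The usual cure is to hide the system clock behind an interface and inject it. A hypothetical sketch (the type names are mine, not from any particular library):

```csharp
using System;

// Abstract away the system clock...
public interface IClock
{
    DateTime Now { get; }
}

// ...production code uses the real one...
public class SystemClock : IClock
{
    public DateTime Now { get { return DateTime.Now; } }
}

public class GreetingService
{
    private readonly IClock _clock;

    public GreetingService(IClock clock)
    {
        _clock = clock;
    }

    public string Greet()
    {
        // No direct DateTime.Now call - the time comes through the seam.
        return _clock.Now.Hour < 12 ? "Good morning" : "Good afternoon";
    }
}

// ...and unit tests substitute a fixed clock for a predictable result.
public class FixedClock : IClock
{
    private readonly DateTime _now;
    public FixedClock(DateTime now) { _now = now; }
    public DateTime Now { get { return _now; } }
}
```

A test can now pin the clock to 9:00 AM and assert Greet() returns "Good morning" every time, no system-clock fiddling required.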

The mistakes you do make will be small and easy to correct. After all, your unit test suite will catch any refactoring mistakes you make.

The next time you're feeling overwhelmed keeping up with the Palermos, Millers, and the like, challenge yourself to achieve 100% Unit Test Coverage. If you're like me, you'll find it rarely leads you astray.

Friday, August 1, 2008

Since the .NET 1.0 Beta was released in 2001, I've been firmly in the .NET camp. However, there is (at least) one thing that the Ruby-on-Rails guys really nailed - Convention over Configuration.

How should I organize my solution? What should I name my views? What should I name my product category table this time (tblProductCategory, ProductCategory, or product_category)? How should I test my data access layer? In my opinion, the best answer to these and the thousands of other trivial decisions we make as developers every day is "You tell me" or "I'll tell you", but let's be consistent.

During my time as a developer, I have made two observations about (non-Headspring :) ) developers. First, most developers feel entitled to weigh in heavily on these kinds of decisions, or to make them outright for themselves (and thus break consistency with the rest of the team). Second, in pursuing their agendas, they feel justified in spending hours arguing the relative merits of their trivially different positions.

On the surface, this may seem like a plus. "Isn't it great that our developers are so passionate about pursuing perfection?" In practice, though, all this philosophizing about casing, underscores, solution organization, and other small issues comes at a price. That time is not spent thinking about software licensing, golf-club fitting, apartment locating, and Sarbanes-Oxley compliance (my last several projects). In short, it's not spent thinking about how to address the difficult business problems we were hired to alleviate through technology.

This is where clearly defined and rigorously practiced technical leadership is worth its weight in gold. As in any discipline, a good leader relies on his or her own experience and seeks the input of other team members. The leader then aggregates all of this information, makes a decision, communicates the decision, and ensures compliance (through carrots and sticks). While developers tend to fear strong technical leadership, good leadership makes everybody more comfortable in their role by providing clear expectations. It also greatly increases team effectiveness (by eliminating wasteful discussions) and the likelihood of delivering software that truly meets the needs of the business.

This technical leadership is particularly important in the land of .NET. Let's face it - the development techniques Microsoft promotes in its sales demos, certified training curricula, and Patterns and Practices website do not generally constitute solid software engineering practice. For example, with the exception of Scott Guthrie and Scott Hanselman, I have yet to find anybody at Microsoft even talking about test-driven development.

To this end, Jeffrey Palermo has been working hard to harden the definition of the Onion Architecture. For at least our Headspring team, this architecture will provide guidance to better enable our technical leaders to move quickly and deliver excellence.