Meta

I have used the trial version of NDepend on several occasions now. Generally I use it to get a quick overview of large code bases I am trying to grasp, or as a first step in code reviews. It is an amazing tool.

The latest version 4 includes default rules for validating architectures:

This kind of rule can also be validated by the Visual Studio layered diagram. But not everybody owns Visual Studio Premium or Ultimate: you need Ultimate to create the diagrams, though you can validate using Premium. At roughly $11,000 for Ultimate, $9,000 for Premium and $400 for NDepend, NDepend is a cheaper alternative for validating your code against this kind of rule.

What I like about the layered diagram is:

It is easy to set up

It is also a nice way to visualize the layers in your application and their dependencies.

With a hack you can validate your code against your architecture every time you compile

It is easily included as a build step in TFS build.

So let’s look at how NDepend v4.0 compares on these points.

Easy to set up.

I already had a project in which I used the layered diagram. For setting up NDepend, I started by adding a new NDepend project to this solution:

After clicking another OK I was up and running with NDepend. That was pretty easy. You get a lot of extras when doing this, such as another dialog in which you can choose to visualize the code or to explore the validation rules. By default the rules are validated on several triggers.

Visualizing application architecture.

The Layered Diagram gives me the following overview, which I like a lot. It not only gives me a certain understanding but also a way of communicating with team members and other technical stakeholders.

The layered diagram also gives the ability to group a number of projects and draw a single dependency for all projects inside that group. For example, in the previous diagram the LayeredSample.Services.PortalUpdatBatch group contains four projects, each of which has a dependency on the LayeredSample.Core group; instead of drawing eight dependencies, this is shown by a single arrow between the two groups. This is a good thing because it keeps the diagram readable, but also a bad thing because observers might miss the fact that there are many more dependencies.

NDepend gives a number of useful diagrams out of the box. The Dependency Graph comes closest to what I like.

The extra info you get when hovering over the projects, like file location, lines of code and lines of comments, is great. Also the fact that you can make the size of the boxes and lines depend on code metrics like lines of code is pretty useful.

What I don’t like is that you can’t save the layout of a diagram. Yes, you can save it as a picture, but I really would like to be able to save which assemblies I have selected and the positions they are placed at. As it is now, changing what the box size depends on redraws my diagram and brings back all the assemblies that I threw away. That is really annoying. (Patrick Smacchia told me there are plans to fix this in the near future.)

Validate at compile time

To do any of the architectural validation at compile time we want, we have to write custom CQLinq queries. I had not written any CQLinq queries before, but with the online default rules as samples, the IntelliSense and the design-time compilation, it was pretty easy to set up the rules I wanted. For example, I wanted only the Core.Configuration project to reference the System.Configuration assembly:

// <Name>System.Configuration should only be referenced by Core.Configuration</Name>
warnif count > 0

from a in Application.Assemblies.Where(x => x.Name != "LayeredSample.Core.Configuration" && x.AssembliesUsed.Any(y => y.Name == "System.Configuration"))

select new { a, a.Name, a.AssembliesUsed }

I also wanted none of the Core assemblies to reference any of the service assemblies.

// <Name>Core assemblies should not reference any of the service assemblies</Name>
warnif count > 0

from a in Application.Assemblies.Where(x => x.Name.Contains("LayeredSample.Core") && x.AssembliesUsed.Any(y => y.Name.Contains("LayeredSample.Services")))

select new { a, a.Name, a.AssembliesUsed }

Rules for single dependencies you want to ensure or forbid are created very easily by opening the dependency matrix, right-clicking any dependency and selecting “Generate a code rule from this”.

Marking a rule as critical makes the NDepend indicator turn red when it is violated during compilation. It is just too bad that Visual Studio still reports that the build has succeeded even after breaking critical rules; I could find no way to make a local build fail.

The rules I have built here for NDepend, as well as the layered diagram, are very solution specific. For every project they would have to be created again. NDepend does give you the ability to create more generic rules that you can use across multiple solutions; for details read “Validating Architecture through LINQ query”. Also the out-of-the-box generic rules like “UI should not use DAL” and “Avoid namespaces dependency cycles” are very useful and don’t require any custom code at all.

Including NDepend in the TFS Build.

My first step in integrating NDepend into TFS was a bit disappointing. The NDepend documentation only showed rather sparse steps for integration with TFS 2008 and TFS 2010, while I wanted to use TFS 2012 or TFS Service. Also, the recommended way of integrating with TFS 2010 is a CodePlex project at version 0.5.0 beta 1, which is not very promising.

But since the documentation for TFS 2010 states that adding a single build step that executes NDepend.Console.exe with the proper parameters should do the trick, it should be possible to do the same for TFS Service or TFS 2012. I tried it for both.

I started by creating a team project and adding my sources to it. The next step was to create a new build definition with a new process template based on the default process template.

The next step was adding a folder called lib\build to my project, placing the complete NDepend installation in that folder and adding it to source control.

I added the NDepend .ndproj file to source control.

After that I added an InvokeProcess activity to my new build template in the …. section, based on this post and this post. Calling NDepend.Console.exe with the parameters described in the NDepend documentation, I got it to work with a lot of trial and error.

NDepend is an amazing tool, and in this blog post I don’t do justice to all it can do by just comparing it to the layered diagram. If you want to quickly get a feeling for an existing code base, do code reviews, or keep an ongoing eye on the quality of a code base, NDepend is a tool you definitely want in your tool belt.

If I get to a project that has Visual Studio Premium or Ultimate, I will be using the layered diagram for this kind of validation; it is just too easy to use, and the visualization and validations are adequate for the job. This does not mean there is no room for NDepend, as it still offers enough other added value.

In a project without Visual Studio Premium or Ultimate but with an NDepend license, I will use NDepend for this kind of architectural validation.

For a single project without Visual Studio Premium or Ultimate and without an NDepend license, I don’t think the architectural validation on its own is enough reason to buy NDepend. I do think that any organization should have a few licenses that can be used across multiple projects to keep code quality up to standards.

A while back I started reading the Microsoft Patterns and Practices eBook “Developing applications for the cloud, on the Microsoft Windows Azure Platform”. Now I have finally gotten around to writing down some of my thoughts on this book.

When I started reading the book my first thought was “strange”; when I got deeper into the book, my thought was still “strange”; when I finished the book, my thought was “strange but informative”.

The initial “strange” came from the fact that it is an eBook you can buy, but one that you can also read completely free online. Since even the PDF can be downloaded for free, I don’t really see why you would buy the eBook.

The second “strange” was because I could not find a clear audience for the book. It goes from high-level functional to low-level development to low-level infrastructure to business case. Every audience can get value out of the book and gain an initial understanding, but every audience will also skip parts simply because they won’t find them interesting.

In the end the missing clear audience and the free online content remained, but I also gained a lot of knowledge from the book. With a fun writing style combined with low-level technical detail (which might need some changes since Azure is evolving rapidly), it gave me some new and unique insights on how to think and reason about cloud applications.

In the end I highly recommend reading the free online version of this book. With the technical details for which you have to wonder whether they still hold true, and the parts that you will probably want to skip, I think it is just not worth the money.

This review is done as part of the O’Reilly Blogger Review Program. O’Reilly provided the book but does not make any judgment on the content of the review.

I just finished reading “Head First Python” and had great fun doing so.

I think that this is a great book for the beginning Python developer. But unlike a lot of other books that aim at beginners, good practices aren’t ignored for the sake of simplicity. The practices are explained and used.

The Head First series can take some getting used to. Whenever I talk to someone about the Head First books I advise them to borrow a copy before buying, or to take a look at the Google Books version. The style just isn’t for everybody. But what I have seen so far is that the style works. The pretty pictures, the repetition and the way context is continuously given make you grok and remember what you read easily.

If you want to decide whether the book is for you, the author categorizes the book as follows:

This book is for you if you can answer “yes” to all of these:

Do you already know how to program in another programming language?

Do you wish you had the know-how to program Python, add it to your list of tools, and make it do new things?

Do you prefer actually doing things and applying the stuff you learn over listening to someone in a lecture rattle on for hours on end?

This book is not for you if you can answer “yes” to any of these:

Are you looking for a reference book to Python, one that covers all the details in excruciating detail?

Do you already know most of what you need to know to program with Python?

Would you rather have your toenails pulled out by 15 screaming monkeys than learn something new? Do you believe a Python book should cover everything and if it bores the reader to tears in the process then so much the better?

And I agree with these rules. It’s not a reference book, it isn’t the Python bible. It is hands-on and a good start for learning Python. But if you are an experienced developer, be prepared for a quick read. I read it in three evenings, and sometimes things were explained that shouldn’t need explaining to someone who already knows how to program in another language. But that is a thin line.

Many books for beginners leave out good practices because they might confuse the reader and take their mind off the things the book is trying to teach. “Head First Python” is a welcome change to this. As an example, I loved quotes like these:

As a general rule, Python programmers look for ways to reduce the amount of code they need to write and worry about, which leads to better code robustness, fewer errors, and a good night’s sleep.

And on page 31:

Recursion to the rescue!
The use of a recursive function has allowed you to reduce 14 lines of messy,
hard-to-understand, brain-hurting code into a six-line function. Unlike the
earlier code that needs to be amended to support additional nested lists
(should the movie buff require them), the recursive function does not need to
change to process any depth of nested lists properly.

This review is done as part of the O’Reilly Blogger Review Program. O’Reilly provided the book but does not make any judgment on the content of the review.

I took and passed the BizTalk 2006 R2 exam, also known as 70-241, on April 28th 2010. When you look at the preparation materials on the official site you can get a bit discouraged: there is nothing there. The skills measured are more interesting; they can give a framework for learning. This post describes the resources I used.

Sources used.

I read two books: Professional BizTalk Server 2006 and Foundations of BizTalk Server 2006. The Foundations book is a nice weekend read that gives a general overview and will convince a manager you have BizTalk knowledge; it is by far not enough for taking the exam. The Professional book is a great book with a lot of depth and great practical content, but it feels more like a guide for working in the field than an exam preparation guide. I found it rather difficult to read.

Last year I took a five-day training course called BizTalk 2006 Deep Dive from QuickLearn. This was an amazing class with in-depth hands-on training. After the training I didn’t do anything with BizTalk for a year, so a lot of the knowledge was pretty much gone. But I still had the course material and used it for preparing now.

This year I did an in-house training at my employer (Sogeti). It was led by an experienced BizTalk developer, and the discussions, questions and answers were a great learning aid. But since the course was based on the Foundations book, I felt it also wasn’t enough preparation for the exam. Most of the links below were supplied by the trainer. Thanks Robin!

I used the following sites for gathering information (watch out: BizTalk 2004, 2006, 2006 R2 and 2009 are mixed):

A nice poster can be downloaded here; it shows all the components of BizTalk in quite some detail. Below is my simplified graphical representation of the major BizTalk components.

BizTalk host: A logical set of runtime processes in which you can deploy services, pipelines, etc. It represents a collection of physical runtime instances that contain deployed items.

BizTalk host instances: The physical installation of a host on a server. A single host can have many host instances, and a single server can run instances of multiple hosts.

Message: A message can be any kind of data: XML files, CSV files, MSMQ messages, parameters from a service call.

MessageBox: The heart of BizTalk, a highly performant SQL Server database. All messages are kept in the MessageBox. Receive ports and orchestrations publish messages to the MessageBox; two-way receive ports, send ports and orchestrations can subscribe to these publications.

Receive Port: A collection of one or more receive locations.

Receive location: A single location where messages can be received (think: subdirectory, MSMQ queue, web service, database table). The location contains configuration for the adapter and the pipeline.

Pipeline: Prepares messages before they are put into the MessageBox or before they are sent to a send port. A receive pipeline has four stages: Decode, Disassemble, Validate and Resolve Party. A send pipeline has three stages: Pre-Assemble, Assemble and Encode.

Optional components: Within the different pipeline stages, extra components can be placed. These can be components provided with BizTalk, by third-party vendors, or your own custom components.

Map: A translation from one message type to another. Uses XSLT for the translations.

Orchestration: BizTalk’s representation of a business process. Within an orchestration, messages can be copied, evaluated, sent to a person by email, used to invoke other services, etc.

Send port: The opposite of a receive port and receive location combined. It contains the configuration for an adapter and a send pipeline.

Send port group: A collection of send ports.

SSO database: A database that makes single sign-on possible. It stores an encrypted mapping between a Windows user ID and the associated credentials for one or more affiliated applications. BizTalk also uses this database to keep other secure, encrypted information, like usernames and passwords for connecting to services.

Management/Configuration database: The database in which all BizTalk configuration is kept.

Tracking database: Keeps track of all messages that have been processed by BizTalk.

RFID platform: This part of BizTalk makes it possible to develop Radio Frequency Identification (RFID) solutions.

Business Rule Engine: Lets you declaratively define business rules that can be used from within orchestrations.

Business Activity Monitoring: BAM enables users to view aggregate data related to thousands of instances of a business process, view a single instance of a business activity, search for instances of a specific business activity, filter activities at a given stage of completion, and define charts, reports, KPIs and alerts.

Business activity services: As part of running a business process, a business analyst may need to create a relationship with a new trading partner, for example, that defines the partner’s role, the business agreement between the two firms, and other aspects of this new association. Maybe a purchasing manager needs tools that can wrap together and distribute the artefacts required to let a partner quickly implement and begin participating in a business process. In BizTalk Server 2006, these functions are provided by Business Activity Services (BAS).

There were fifty questions that needed to be answered within two hours. After all this preparation, I still found the exam pretty hard. The part I missed most in my preparation was exception handling in orchestrations.

To anybody who is also trying to prepare for this exam: I hope this helps, and good luck! Don’t hesitate to ask any questions.

On Saturday November the 17th 2009, Devnology held their first Devnology Community Day: a great day with amazing content for developers. This post gives my impressions of the day.

For people who don’t like to read: just watch the slideshow.

General Impression

When you put a group of people who all share the same interests in a room, it always creates a breeding ground for discussions and knowledge sharing. If you then put those same people in a nice building, feed them, and in general make sure they have nothing to complain about, it only gets better. I could feel this from the first minute I arrived at the community day, and it lasted for the rest of the day. The only thing I can possibly complain about was the lack of an internet connection. This was an unforeseen problem and will be double-checked next time; Pieter Joost promised me, so it must be true.

Sessions

The sessions were diverse in content and in setting; there were workshops, presentations, Chalk ’n Talks and lightning talks. The topics you could choose from are in the table below.

I learned some Haskell (and even won a book), got an introduction to Ruby and saw some amazing Java code for software transactional memory.

Devnology

Devnology is a great organization that has already organized a lot of amazing meetings. A panel discussion on model-driven development and a session with Greg Young about domain-driven design are just a few of the highlights. Devnology has a mission that says it all:

Devnology aims to provide the Dutch software development community with opportunities to exchange knowledge and experience. We aim to bridge the gap between theory and practice of software development.

Most tweets about Devnology use #devnology, so you can see what people are saying over here.

Contacting Devnology and asking questions is easy; just follow this English contact page. They are all very involved and more than willing to answer questions.

Devnology depends on sponsors, and so far they have had some great ones. The community day was held at VXCompany in Baarn. VXCompany hosted the event very well and free of charge.

Conclusion

I had a very well spent Saturday. Everything was taken care of, but there was also a lot of freedom to find your own way. The speakers were very involved, very easy to talk to and, as far as I can assess, very knowledgeable. You just have to love the organizers, the speakers and the attendees for giving up their free Saturday and making this an event to remember.

I am planning to explore how to architect a good MVC application. During this exploration I found some interesting web resources. I might use this post as a reference in possible future blog posts.

Sample apps

I really think that looking at other people’s code can give you great ideas about what to do and what not to do. So here are some sample apps to look at. You could also look at SharpArchitecture as a sample app, but I thought that one deserved a paragraph of its own.

I don’t know Rob Connery personally, but you gotta love what he does. StoreFront is his sample app, and you can find a lot about it here.

Read the separate Oxite paragraph for more details before downloading! Oxite got a lot of bad press, but combined with the criticism you can find, it still has value in learning how to architect an MVC app, or how not to.

SharpArchitecture already has an architecture, and exploring it can be very inspiring. As far as I can tell, a lot of thought went into this architecture, and for now I see it as a starting and reference point for my explorations.

Most screencasts are not at an architecture level, but I do believe that the more details you know, the better you can architect the application. Also be aware that these screencasts might be about different versions of the MVC framework.

There once was a woodcutter working hard, cutting down a tree. Suddenly a leprechaun appeared and started yelling “stop, stop”.

The woodcutter looked at the leprechaun and started smiling: “Are you going to offer me a pot of gold if I quit cutting down your house?”. The leprechaun frowned and replied: “Of course not, silly, leprechauns don’t live in trees! I live in a beach house in Miami, but I do want to make you an offer.”

The leprechaun made the following offer:

If you promise you will give half of your winnings to charity, I will let you win twenty million.

The woodcutter’s first thought was that he was not going to give away ten million. But thinking a bit harder, he came to the conclusion that he had just been offered ten million and should accept the offer as fast as he could. So he started picturing what his wife would say when they won the ten million. Hmm, wait a minute: I won’t be able to convince my wife to give away ten million. So he asked the leprechaun if he was allowed to tell his wife about the deal. The leprechaun agreed.

If you and your wife will give half of your winnings to charity, I will let you win twenty million.

So the woodcutter started picturing again how he and his wife would celebrate the ten million tomorrow, or the day after. Hmm, the leprechaun did not say anything about a date. “So leprechaun, when is this going to happen?” “Oh sorry,” said the leprechaun, “I cannot commit to a date; it will be in your lifetime.” The woodcutter thought hard and did not find this very acceptable. Suppose he was already going to win twenty million anyway, was the leprechaun going to cheat him out of ten million? The woodcutter strongly suggested that the leprechaun give a date, and the leprechaun finally agreed.

If you and your wife will give half of your winnings to charity, I will let you win twenty million before the end of this year.

The woodcutter started imagining again spending all those euros. Hmm, the leprechaun did not say anything about euros. “So leprechaun, what will the currency be?” Oh, the leprechaun could not really tell; it was a matter of which lottery he could manipulate and of negotiations with other leprechauns. The leprechaun and the woodcutter discussed this a bit and came to a new agreement.

If you and your wife give half of your winnings to charity, I will let you win about twenty million euros, remaining after being converted from the original currency, before the end of this year.

Having learned from the discussion so far, the woodcutter did not start imagining anything; instead he started analyzing the agreement, and the word ‘about’ started to feel a bit funny. What would happen if it was nineteen million nine hundred thousand, would he still be required to give half? And what charity? He could start his own foundation for the needy woodcutter. After three days of negotiating they came to an agreement.

If you and your wife give exactly half (rounded down) of your winnings to a charity of the leprechaun’s choice, the leprechaun will let the woodcutter win at least twenty million euros, remaining after being converted from the original currency, before the thirty-first of December this year at midnight. The wife will not get her own amount but shares with the woodcutter, and if anything significant changes in the agreement, the woodcutter and the leprechaun will renegotiate it.

The woodcutter went home a happy man. When he got home, his wife was mad at him for being three days late. After some explaining, she did not like the deal: she wanted half of the ten million. The wife left the woodcutter, and the woodcutter went back to the forest to renegotiate.

The wife lived happily ever after.

When making agreements, consider your organization, time, cultural differences, measurability of requirements, changes in the environment, personal viewpoints and as much as you can think of. Make the choices flexible enough to last, and make them fast enough so you won’t be caught by design creep. Don’t exclude any major stakeholders.

Today I went to DevDays 2009 (www.devdays.nl). Apart from the terrible traffic it was a great day: a lot of good content, and it was hard to choose which session to go to.

During the sessions I took notes on my new MSI netbook. It worked great and only shut down suddenly in the last session because the battery was low. I am posting my notes as is, so if they read a bit clunky it is because they were written in a hurry. Feel free to ask questions or add content in the comments.

A small introduction to these posts can be found here. Today was the first day of www.devdays.nl. The day started great with a broken TomTom and me taking a wrong turn in The Hague, so I missed the keynote.

We program in a very imperative style, which creates a lot of noise; finding out what the code does can be hard. A lot of the detail goes into the how instead of the what. Krishnan uses LINQ to show that a LINQ query already has less noise: his non-technical girlfriend can understand one.

Krishnan says that there is room for both static and dynamic languages.

When you look at hardware trends, more and more multiprocessor machines are becoming mainstream. C# 4.0 will give more possibilities for parallel programming. Krishnan demos this with a LINQ ray tracer program that is part of the Parallel Extensions. To make it parallel, he only adds the .AsParallel() call to a LINQ query.
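To give an idea of what that looks like in practice, here is a minimal sketch of my own (not the ray tracer from the demo) showing the same query sequentially and with AsParallel():

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Plain LINQ: runs sequentially on one core.
        long sequential = Enumerable.Range(1, 1000)
            .Select(n => (long)n * n)
            .Sum();

        // Same query with AsParallel(): PLINQ partitions the range
        // across the available cores and merges the results.
        long parallel = Enumerable.Range(1, 1000)
            .AsParallel()
            .Select(n => (long)n * n)
            .Sum();

        Console.WriteLine(sequential == parallel); // prints True
    }
}
```

The attraction is exactly what the talk showed: one extra call turns a declarative query parallel, without rewriting it around threads.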

The themes for C# 4.0 are:

Dynamically typed objects

Optional and Named Parameters

Improved COM interoperability

Co- and Contra-variance

For some parts of the programs we write, statically typed objects can get in your way. For those parts you can use the dynamically typed features of the DLR. The DLR provides expression trees, dynamic dispatch and call-site caching. As an example, Krishnan shows ugly C# reflection code that determines a type and invokes a method, and compares it to JavaScript and to C# using the dynamic keyword. The dynamic version is a much cleaner piece of code. In the demo he calls an IronPython calculator from C# using the dynamic keyword. The add function works correctly with any argument that exposes the + operator; any argument compiles.

The next demo shows writing a DynamicBag, a child implementation of DynamicObject. In this demo he uses a dictionary that holds the properties you can call on the object; these properties can then be added at run time from the calling code. So DynamicBag.MyCustomProp = 1 creates an entry in the dictionary holding MyCustomProp with value 1, and with some minor overriding of methods this code works even though DynamicBag doesn’t have a property MyCustomProp.
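A minimal sketch of what such a DynamicBag could look like; this is my own reconstruction from the description, not the code shown in the demo:

```csharp
using System;
using System.Collections.Generic;
using System.Dynamic;

// A property bag built on DynamicObject: member reads and writes
// are routed into a dictionary instead of compiled properties.
class DynamicBag : DynamicObject
{
    private readonly Dictionary<string, object> values =
        new Dictionary<string, object>();

    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        // Called for reads like bag.MyCustomProp.
        return values.TryGetValue(binder.Name, out result);
    }

    public override bool TrySetMember(SetMemberBinder binder, object value)
    {
        // Called for writes like bag.MyCustomProp = 1.
        values[binder.Name] = value;
        return true;
    }
}

class Program
{
    static void Main()
    {
        dynamic bag = new DynamicBag();
        bag.MyCustomProp = 1;                 // creates the dictionary entry at run time
        Console.WriteLine(bag.MyCustomProp);  // prints 1
    }
}
```

Overriding just these two members is the “minor overriding of methods” mentioned above; the compiler never sees a MyCustomProp property, the DLR resolves it at run time.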

The current way to offer methods with fewer or more parameters is to use overloads. In C# 4.0 you have named and optional parameters. You use the optional parameter feature by setting a default value on the parameter in the method declaration. When calling the method you can use named parameters, which makes the code more readable.
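A small sketch of both features together (the Mailer class and its parameters are made up for illustration):

```csharp
using System;

class Mailer
{
    // subject and highPriority are optional: they get a default
    // value in the declaration, so no overloads are needed.
    public static void Send(string to,
                            string subject = "(no subject)",
                            bool highPriority = false)
    {
        Console.WriteLine("To: {0}, Subject: {1}, Priority: {2}",
                          to, subject, highPriority);
    }
}

class Program
{
    static void Main()
    {
        // All optional parameters omitted.
        Mailer.Send("alice@example.com");

        // Skip 'subject' but set 'highPriority' by naming it.
        Mailer.Send("bob@example.com", highPriority: true);
    }
}
```

The second call is where named parameters pay off: you can skip an earlier optional parameter and the call site documents itself.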

For COM interop there are improvements:

Dynamic mapping

Optional and named parameters

Indexed properties

Optional ref modifier

Interop type embedding

To explain covariance, you can look at an array of strings and putting a Button in it: arrays are covariant, but that is not safe. A List<string> and an IEnumerable<object> could not be cast to each other before. In C# 4.0 generic interfaces can declare variance: out T for covariance (as in IEnumerable<out T>) and in T for contravariance (as in IComparer<in T>).
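A short sketch of both directions; this is my own example, not code from the session:

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Covariance: IEnumerable<out T> lets a sequence of a more
        // derived type stand in for a sequence of a base type.
        IEnumerable<string> strings = new List<string> { "b", "a" };
        IEnumerable<object> objects = strings; // legal in C# 4.0

        // Contravariance: IComparer<in T> goes the other way; a
        // comparer of objects can be used where a comparer of
        // strings is expected.
        IComparer<object> compareObjects = Comparer<object>.Default;
        IComparer<string> compareStrings = compareObjects; // legal in C# 4.0

        Console.WriteLine(compareStrings.Compare("a", "b") < 0); // prints True
    }
}
```

Unlike the array case, both conversions are statically safe: you can only read T out of an IEnumerable<out T> and only pass T into an IComparer<in T>, so no unsound write can ever happen.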

In the future it might be possible to use the compiler as a service; your program would be able to change compile behaviour. Krishnan shows a demo of code that will compile in the C# version that comes after C# 4.0. In it he uses a CSharpEvaluator class with which he builds a program from strings. Using this, he writes a command-line C# interpreter. Kewl.