16 months on, the blog has finally clocked up 50,000 hits! My original motivation for blogging was to write something less formal that took less time to write and format than the articles I was putting out on the Xense website. At the time there were very few Documentum blogs (Johnny Gee’s was probably the only one I noticed regularly) and I was keen to do something deeply technical in a similar vein to Jonathan Lewis (Oracle) or Mark Russinovich (Windows).

One of the things that has surprised me is just how much effort it takes to keep finding the time and inspiration to write. Back in January 2007 I managed more than one article a week; these days I think I’m doing well if I can manage a couple a month. Partly this is because back then I had a number of small pieces of research that were already done and simply needed to be turned into words. These days I still have loads of ideas but so little time to follow up and do the research.

The focus has changed a little bit too. When I started I had been spending a lot of time knee-deep in object replication. Frankly I don’t like the technology and would be very hesitant to recommend it on a project. Part of the problem is the obscurity of the implementation. The serious guts of the workings are embedded deep in the Content Server C/C++ code, so it’s not easy to work out what is going on when it fails unless you want to dive into an assembly-level debugger (which I have resorted to on occasion).

The other problem is the dump and load process that underlies it. Dump and load is simply too flaky for a reliable replication solution. I managed to find various ways of crashing the Content Server (which is catastrophic on the thread-based Windows implementation), which I wrote up for a client in a document called ‘Killing the Content Server’. I sent it to Documentum support too.

Here’s to the next 50,000!

BTW I’ve added a blogroll entry for Andrew Binstock – an excellent blog covering all sorts of things related to Java development. Very honest, very open.