Tuesday, 26 November 2013

Unity and discussion: need for friendly criticism

One of the themes we heard repeatedly at the Radical Independence Conference this weekend was the call for nationalisation: nationalisation of the banks, of Grangemouth, of the oil industry. This makes me very cautious. Of course, conference speeches are not places for nuance, for detail. It's possible that those who urged nationalisation did not mean the statist, centralising nationalisation of 1945. So I'm cautious rather than hostile.

My intention in this essay is to set out the reasons that I'm cautious. This isn't to criticise anyone; it isn't to be hostile to anyone. As Dennis Canavan said, we must keep our eye on the ball; we must achieve independence, and to do that we must work together as a broad front. We don't need schisms, splits. I'm not seeking to promote those. I'm seeking to start a discussion.

Nationalisation provides new targets for elite capture

In this essay I use 'elite capture' as shorthand for the propensity of well-connected influential elites to establish themselves in positions of power and benefit in institutions and schemes set up for the public good, or for the good of specific minorities. For example, what is known as 'the quangocracy' or 'the great and the good' are well connected elite groups who establish themselves in positions of profit in many public bodies within Scotland and the UK generally.

Nationalisation - the concentration of the whole of an industry within a nation into a single unit owned by that nation - provides a target for elite capture. Western experience of industrial organisation is to have decision making power concentrated at the top, in 'the board'. Elites have many excellent and well-developed strategies for the capture of key concentrations of power. Creating large new top-down structures within society, with control over key economic assets, is just inviting elite capture.

Left elites and right elites

It's natural human behaviour - normal, obvious, we all do it - to advance people we know, people we trust, people like us. When this happens in recognised elites - the soi-disant aristocracy, or the 'old school tie' of the public schools - we recognise it easily. But the same mechanisms operate on the left; and there are what I would describe as 'right elites' as well as 'left elites' in the British Labour movement.

Labour shadow cabinet composition

There are twenty-seven members of the current Westminster shadow cabinet, and actually, when you look at their records, they're a pretty impressive group of people. Jon Trickett is a plumber to trade, a long time peace campaigner, an anti-fascist, and came up through the trades union movement to become a councillor in his home town of Leeds before being elected to parliament. Steve Bassam, a social worker, founded a squatters union and campaigned for the rights of squatters and the homeless, before serving on his own local council and then in parliament. Ivan Lewis set up a learning difficulties support charity at the age of seventeen. He, too, served on his local council before being elected to parliament.

Many of them are clearly exceptionally bright. Mary Creagh, Andy Burnham and the twins Maria and Angela Eagle were all working class kids who went to Oxbridge. Rachel Reeves may also be - I don't have information on her background, but she was certainly educated at New College, Oxford.

Indeed, nine of the twenty-seven - one third - went to Oxford or Cambridge, and there's the first part of the rub. Four of them - through no fault of their own - had parents who were already members of the British elite. Six of them went to fee-paying schools. Thirteen of them - almost half - have never worked outside politics and the Westminster village. Taking the union of those sets (Oxbridge, or elite parents, or fee-paying school, or never worked outside politics), nineteen - two thirds - can be classified as elite. Admittedly, that's a crude score; admittedly, as I've said before, many of these - most of these - are pretty impressive people.

But what they are not is the labour movement. What they are not is 'workers'. True, the nature of work has changed over the past fifty years. Fifty years ago, the Labour front bench contained miners, steelworkers, shipbuilders. We can't expect to see such trades now, as industry has vanished from the landscape. But we have one plumber - one! One social worker, an administrative worker, a school teacher, a television journalist, a radio producer. Those, we can all accept, are real jobs, and more, real work. Workers' jobs; labour, if you like. Six of them. For the rest: one economist; four academics; only five lawyers. All the rest are wonks.

I would argue that the Labour front bench are for the most part a 'right elite'. They have, like their Conservative opposite numbers, succeeded at least partly because they are members of old elite structures - inherited privilege (Hilary Benn, Ed Miliband, Harriet Harman, Yvette Cooper); Oxbridge or private school; direct entry into politics.

Yes, they are polished, impressive people: elite education does that for you. Put them on a panel of potential parliamentary candidates alongside an engineering worker just off back-shift and of course they'll shine. But there's more to it. Those old elite structures have had hundreds of years to develop the - unwritten, unthought, even unconscious - practices of elite capture.

Flowers affair

Joyce McMillan is, of course, perfectly right to argue that the Flowers affair has been blown out of proportion by the right in order to attack the left. However, the Flowers affair has recently highlighted a different sort of elite structure, one which is more clearly a matter of the left. Paul Flowers rose through the ranks of the Labour and Co-operative movements despite the fact that he was frequently discovered to be either useless or a liability. Like Buggins, he was simply shuffled sideways into other posts until he ended up in one in which he could do real damage. Paul Flowers represents a different sort of elite, a left elite, a consequence of the organic development of the left in Britain.

Democratic deficits on the left

What this reveals is the systematic democratic deficit in old British left structures. The left, although it claims to be (and, to be fair, largely aspires to be) democratic, grew up in the Victorian period when telecommunications either didn't exist or else were out of the economic reach of working people. It was natural in the Victorian period to develop a hierarchical system of organisation, with local chapels or branches at the bottom, sending delegates to regional committees which in turn sent delegates to the (national) executive committee. Not every trades union member, of course, makes it to branch meetings - when I was an apprentice printer and a member of the National Graphical Association, we were fined if we failed to attend chapel meetings (and the fines, like our union dues, were deducted from our pay before we got it), but we were not told when or where chapel meetings would take place. The only way to find out was either to be at the preceding meeting, or be told by a friend who had been.

But even ignoring such obvious abuses, the reasons why members may not attend branch meetings are not always just apathy. Union branch meetings tend to run to a formula, and are commonly pretty turgid affairs which take up a lot of an evening. They are often not designed to be inclusive, to be welcoming to the rank and file membership. They tend to select 'in groups'.

But it tends to be branch meetings, not the membership as a whole, who elect delegates to area committees. It tends to be only those delegates who have much contact with the delegates from other areas, so even in those unions where delegates do constitutionally take instruction from their branch on whom to vote for in elections to national committees, the opinion of the branch delegate is likely to be very influential in the branch's choice.

And so it goes. Most trades unions are not participatory democracies. They're not even representative democracies. They're multi-tiered representative democracies, and at each tier the electoral college gets smaller and more self selecting. Of course, in the Victorian period when these structures were established, most trades unions were small, with a few thousand members at most; the process of consolidation and amalgamation over the past hundred and fifty years has further concentrated power, further increased the separation between the people with power - the national executives and general secretaries - and the ordinary membership.

In an electronic age it doesn't have to be like this and increasingly unions do hold direct elections; but the very size of modern unions means that the candidates for office cannot be known to a significant proportion of the membership, so elections - like elections to parliament - have to be on the basis of leaflets of a few thousand words, and such exposure as the candidates can manage to get themselves in trade journals and in the national media.

Egos and personality cults

One of the things which has also badly affected democracy on the left in Scotland has been egos and personality cults. I haven't been directly involved in any of these, and I don't really understand their dynamics, so I won't attempt to analyse the problem; but I think we can all accept that it has existed, and that it has tended to act in anti-democratic ways.

Decentralisation and democratic control

The EU has a concept - called subsidiarity - that decisions ought to be taken at the most local practical level of democratic control. Smaller, more local, is inherently more democratic. Of course there are risks of elite capture, petty corruption and cronyism in small local structures just as there are in large, national structures, but such issues cause less damage precisely because they are more local. So what I want to argue is that there are smaller, more local, forms of industrial organisation which democratise control far better than crude old-fashioned nationalisation.

Grangemouth and Govan

The petrochemical installation at Grangemouth, and the shipyards on the Clyde, present particular difficulties for the general solution I propose to the problem of concentration of power and of elite capture, so I'll attempt to characterise those problems before going on to talk about more general issues.

As I understand it, the petrochemical installation, although currently divided into two separate functional units ('refinery' and 'chemicals plant'), is essentially one integrated facility where the parts are largely dependent on the whole and cannot easily be operated or managed separately. Furthermore, it is, as I understand it, key to the mechanisms which drive the oil along the undersea pipelines which bring it ashore. It's a big deal, a big plant, and important to the nation. Moreover, it can't reasonably be expected that its workers can, from their own resources, raise the capital for new investments when they become necessary. So it depends inherently on outside sources of finance.

This being so one cannot realistically reorganise the plant into a collection of human-scale workers co-ops. Even if you divide it into one co-op for refinery operations, one co-op for chemical operations, one co-op for engineering and maintenance (for example), you still require overall co-ordination. And you require relationships with external investors/lenders, whether those investors/lenders be conventional venture capitalists, a national investment bank, or a collection of mutual banks. Whoever the lenders/investors are, they will need an effective input into top-level decision making. So you inevitably end up with something which looks very much like a top-down board of directors. If we are to maximise national income from the oil we choose to extract from the North Sea, we need Grangemouth. Making Grangemouth work is a bullet we have to bite.

The shipyards are similar, if not necessarily such an extreme case. A shipyard, like any other large industrial site, has a penumbra of sub-contractors, and those sub-contractors can in general easily be workers co-operatives. Ships are, these days, largely built of modules, and the group of workers who build a module is not necessarily very big. And, in any case, it's likely that in future we will put the marine engineering skills of the Clyde more into building offshore energy generating plant than into building large warships, so again the units of labour do not necessarily need to be as large.

But so long as we are building very large engineering systems on the Clyde, there does need to be some co-ordination. There also needs to be lending or investment. So the Clyde shipyard may need more organisational structure than simply loose associations of small and medium sized workers co-ops.

However, these are extreme cases, and we should not build our overall industrial strategy on extreme cases. Most industrial enterprises in Scotland have at most a few hundred workers; organising these as independent workers co-ops is not hard to imagine.

Banking

The United Kingdom has a small number of very large banks - banks which are deemed 'too big to fail'. We have had very few mutual banks, of which the largest - the Co-op Bank - has just failed. Germany by contrast has many Volksbanken - co-operative banks - and, additionally, 431 municipal savings banks and eight state-owned Landesbanken. This is in addition to private sector banks.

As several people at the Radical Independence Conference pointed out, the largest banks in Scotland already are publicly owned. They easily could be nationalised. Yes, indeed they could, but they'd still be too big to fail and they would still be targets for elite capture. Rather than centralising that power as national banks, they could be broken up into their individual branches and given to their account-holders as mutuals or to their workers as workers co-ops. Either way, both account holders and workers have clear common interest in ensuring the stability and profitability of the bank they directly own, so have a clear interest in making sure it is well run. And these individual, small banks would not be 'too big to fail'. Banking regulation would still be needed to monitor that not too many of these many small banks were choosing to run the same risks at the same time, but it could be fairly light touch because the consequence of individual banks failing would be manageable.

In particular these small mutual banks must be empowered to invest in Scottish industry, and, in order to make large investments where those are needed, they must be empowered to combine into associations to make particular large loans or investments.

Industry

Workers co-ops are already a well understood concept in Scotland and are supported as a matter of policy by the Scottish Government and more widely by voices on the left. Rather than nationalising industry, I would far rather see the state set up a series of workers co-ops, each of such a size that the members of the co-op can all know one another at least by sight and reputation - so not more than, say, one thousand members. Obviously, as I've suggested above with Grangemouth and Govan, in some key industries it may be necessary to have some co-ordination between groups of co-ops to allow for efficient running of very large industrial assets, but this should be exceptional, not normal. Big may be efficient but it is not always beautiful, and in my opinion there is some trade-off between raw efficiency and democratic control. Less wealth spread more evenly may be better than more wealth captured by elites.

Further, I'm not proposing that private industry should be seized and collectivised overnight. I'm suggesting that key industrial assets in which the state has a strategic interest (e.g. Grangemouth, Govan) should be; and that generally, where the state has a controlling interest in an enterprise (for example Prestwick Airport) there should be a presumption that it will be reorganised as a workers co-operative.

Finally I think it would be a good thing if the state provided some systematic incentive for industries to re-organise themselves as workers co-ops; for example, there could be significantly lower levels of corporation tax for co-operatives.

Summary

I do understand why, under current circumstances, people are calling for nationalisation. Capitalism is out of control and a wholly unreasonable proportion of the common wealth is being captured by a few elite bankers and venture capitalists. But not only is nationalisation not the only possible solution: it in its turn offers targets which elites - very likely the same elites - will capture.

The alternative, which puts power right in the hands of the people most closely involved, is loose federations of small mutuals and workers co-ops; and I believe these would be as easy to create as monolithic nationalised industries.

Sunday, 10 November 2013

I was recently given, as a coding exercise by a potential employer, this problem.

It's an interesting problem, because the set of N-grams (the problem specification suggests N=3, so trigrams, but I'm sufficiently arrogant that I thought it would be more interesting to generalise it) forms, in effect, a two dimensional problem space. We have to extend the growing tip of the generated text, the meristem, as it were; but to do so we have to search sideways among the options available at each point. Finally, if we fail to find a way forward, we need to back up and try again. The problem seemed to me to indicate a depth-first search. What we're searching for is not an 'optimal' solution; there is no 'best' solution. All possible solutions are equally good, so once one solution is found, that's fine.
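As a minimal sketch of that depth-first search - not the actual implementation described below, and with a toy rule table and invented names - the idea looks something like this in Java, for N=3:

```java
import java.util.*;

// Hypothetical sketch of the depth-first search described above, for N=3:
// extend the growing tip one token at a time, trying each candidate
// successor in turn, and back up when we hit a dead end. The rule table
// and names are invented for illustration.
public class ComposeSketch {

    // last two tokens, joined with a space -> candidate next tokens
    static final Map<String, List<String>> RULES = Map.of(
            "I came", List.of("I"),
            "came I", List.of("saw"),
            "I saw", List.of("I"),
            "saw I", List.of("conquered"));

    // Try to extend 'soFar' to 'target' tokens; return any complete
    // solution, or null if this branch is a dead end.
    static List<String> compose(List<String> soFar, int target) {
        if (soFar.size() >= target) return soFar; // any solution will do
        String key = soFar.get(soFar.size() - 2) + " " + soFar.get(soFar.size() - 1);
        for (String next : RULES.getOrDefault(key, List.of())) {
            List<String> extended = new ArrayList<>(soFar);
            extended.add(next);
            List<String> result = compose(extended, target);
            if (result != null) return result; // first solution found wins
        }
        return null; // no rule matched: back up and try a sibling
    }

    public static void main(String[] args) {
        System.out.println(compose(List.of("I", "came"), 6));
        // -> [I, came, I, saw, I, conquered]
    }
}
```

The `return null` is the backing-up: when no candidate leads anywhere, control falls back to the caller, which tries the next sibling option.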

Data design

So the first issue is (as it often is in algorithmics) data design. Obviously the simpleminded solution would be to have an array of tuples, so the text:

I came, I saw, I conquered.

would be encoded as

I came I
came I saw
I saw I
saw I conquered

The first thing to note is these tuples are rules, with the first N-1 tokens acting as the left hand side of the rule, and the last token acting as the right hand side:

I came => I
came I => saw
I saw => I
saw I => conquered

To be interpreted as 'if the last N-1 tokens I emitted match the left hand side of a rule, the right hand side of that rule is a candidate for what to emit next.'

The next thing to note is that if we're seeking to reconstruct natural language text with at least a persuasive verisimilitude of sense, punctuation marks are tokens in their own right:

I came => COMMA
came COMMA => I
COMMA I => saw
I saw => COMMA
saw COMMA => I
COMMA I => conquered
I conquered => PERIOD

Now we notice something interesting. It's perfectly possible and legitimate to have two rules with the same left hand side, in this case {COMMA I}. So we could recast the two {COMMA I} rules as a single rule:

COMMA I => [saw | conquered]

This means that, in our table of rules, each left-hand-side tuple can be made distinct, which makes searching easier. However, a system which searches a table of N-ary tuples for matches isn't especially easy or algorithmically efficient to implement. If we had single tokens, we could easily use maps, which can be efficient. One can see at a glance that two tokens occur repeatedly in the first position of the left hand side of the rules: 'I' and 'COMMA'.

'I' has three possible successors:

I [came | saw | conquered]

However, the right hand side is not the same for 'saw' as it is for 'conquered', so this composite rule becomes:

I => [came => COMMA
      saw => COMMA
      conquered => PERIOD]

And thus, essentially, the rule set becomes a tree that, given a path, we can walk. Matching becomes trivial and efficient.
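Such a tree of nested maps can be sketched in Java - as an illustration only, not the RuleTreeNode class of the actual implementation, and assuming the text has already been tokenised with punctuation as separate tokens:

```java
import java.util.*;

// Illustrative sketch of the rule tree as nested maps: for trigrams, the
// first left-hand-side token selects an inner map, and the second selects
// the set of candidate successors. Not the actual RuleTreeNode class.
public class RuleTreeSketch {

    @SuppressWarnings("unchecked")
    static Map<String, Object> buildTree(String[] tokens, int n) {
        Map<String, Object> root = new HashMap<>();
        for (int i = 0; i + n <= tokens.length; i++) {
            Map<String, Object> node = root;
            // descend/extend one map level per LHS token except the last
            for (int j = 0; j < n - 2; j++)
                node = (Map<String, Object>) node.computeIfAbsent(
                        tokens[i + j], k -> new HashMap<String, Object>());
            // the last LHS token maps to the set of possible successors
            Set<String> successors = (Set<String>) node.computeIfAbsent(
                    tokens[i + n - 2], k -> new HashSet<String>());
            successors.add(tokens[i + n - 1]);
        }
        return root;
    }

    public static void main(String[] args) {
        String[] tokens = {"I", "came", "COMMA", "I", "saw",
                           "COMMA", "I", "conquered", "PERIOD"};
        Map<String, Object> tree = buildTree(tokens, 3);
        // the two {COMMA I} rules collapse into one branch with two leaves
        System.out.println(((Map<?, ?>) tree.get("COMMA")).get("I"));
    }
}
```

Because duplicate left hand sides collapse into a single branch, looking up the candidates for the last N-1 emitted tokens is just two map lookups.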
Thus far we're almost language independent. I say almost, because in Prolog (which would be a very good implementation language for this problem) we'd simply assert all the N-grams as predicates and let the theorem solver sort them out. However, I've not (yet) tackled this problem in Prolog.

Implementation: Java

I started in Java, because that's what I was asked to do. Java (or C#, which is to a very close approximation the same language) is pretty much the state of the art as far as imperative, procedural languages go. Yes, I know it's object oriented, and I know Java methods are in principle functions, not procedures. But it is still an imperative, procedural language. I say so, so it must be true. What I hope makes this essay interesting is that I then went on to reimplement in Clojure, so I can (and shall) compare and contrast the experience. I'm not (yet) an experienced Clojure hacker; I'm an old Lisp hacker, but I'm rusty even in Lisp, and Clojure isn't really very Lisp-like, so my Clojure version is probably sub-optimal.

But let's talk about Java. I made a tactical error early in my Java implementation which makes it less than optimal, too. We have an input file to analyse, and we don't know how big it is. So my first instinct wasn't to slurp it all into memory and then tokenise it there; my first instinct was to tokenise it from the stream, in passing. That should be much more conservative of store. And so I looked in the Java libraries, and there was a library class called StreamTokenizer. Obviously, that's what I should use, yes? Well, as I learned to my cost, no, actually. The class java.io.StreamTokenizer is actually part of the implementation of the Java compiler; it's not a general purpose tokeniser and adapting it to tokenise English wasn't wonderfully successful. That wasted a bit of time, and at the time of writing the Java implementation still depends on StreamTokenizer and consequently doesn't tokenise quite as I would like. If I backported the regex based tokeniser I used in the Clojure version to the Java version (which I easily could) it would be better.
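By way of illustration, a regex-based tokeniser along those lines - words and punctuation marks each becoming tokens in their own right - might look like this; the pattern here is my assumption for the sketch, not the one the Clojure version actually uses:

```java
import java.util.*;
import java.util.regex.*;

// Sketch of a regex-based English tokeniser: words (allowing internal
// apostrophes) and individual punctuation marks each become tokens.
// The pattern is an assumption for illustration only.
public class TokeniserSketch {

    private static final Pattern TOKEN = Pattern.compile("[\\w']+|[.,;:!?]");

    static List<String> tokenise(String text) {
        List<String> tokens = new ArrayList<>();
        Matcher m = TOKEN.matcher(text);
        while (m.find())
            tokens.add(m.group()); // each regex match is one token
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(tokenise("I came, I saw, I conquered."));
        // -> [I, came, ,, I, saw, ,, I, conquered, .]
    }
}
```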

So the first gotcha of Java was that the libraries now contain a lot of accreted crud.

The second point to note about Java is how extraordinarily prolix and bureaucratic it is. My Java implementation runs to almost a thousand lines, of which over 500 lines are actual code (317 comment lines, 107 blank lines, 36 lines of import directives). Now, there are two classes in my solution, Window and WordSequence, which could possibly be refactored into one, saving a little code. But fundamentally it's so large because Java is so prolix.

By contrast, the Clojure reimplementation, which actually does more, is a third the size - 320 lines, of which 47 are blank and 29 are inline comments. I don't yet have a tool which can analyse Clojure documentation comments, but at a guess there's at least fifty lines of those, so the Clojure solution is no more than two fifths of the size of the Java.

The Java implementation comprises eight classes:

Composer essentially the two mutually recursive functions which perform depth first search over the rule set, to compose output

Digester scans a stream of text and composes from it a tree of rules

Milkwood contains the main() method; parses command line arguments

RuleTreeNode a node in the tree of rules

Tokeniser a wrapper around StreamTokenizer, to try to get it to tokenise English; not very successful

Window a fixed length stack of tokens, used as a glance-back window in both scanning and composing

WordSequence a sequence of tokens implemented as a queue

Writer a wrapper around BufferedWriter which performs on-the-fly orthographic tricks to create a verisimilitude of natural English

One might argue that that's excessive decomposition for such a small problem, but actually small classes greatly increase the comprehensibility of the code.

There are things I'm not proud of in the Java implementation and I may at some stage go back and polish it more, but it isn't a bad Java implementation and is fairly representative of the use of Java in practice.

Clojure implementation

Some things to say about the Clojure implementation before I start. First, I implemented it in my own time, not under time pressure. Second, although I'm quite new to Clojure, I'm an old Lisp hacker, and even when I'm writing Java there are elements of Lisp-style in what I write. Thirdly, although I'm trying to write as idiomatic Clojure as I'm able, because that's what I'm trying to learn, I am a Lisp hacker at heart and consequently use cond far more than most Clojure people do - despite the horrible bastardised mess Clojure has made of cond. Finally, it was written after the Java implementation so I was able to avoid some of the mistakes I'd made earlier.

I used LightTable as my working environment. I really like the ideas behind LightTable and suspect that in time it will become my IDE of choice, but I haven't got it working for me yet. Particularly I haven't got its 'documentation at cursor' function working, which, given my current (lack of) familiarity with the Clojure, is a bit of a nuisance.

I tripped badly over one thing. Clojure, to my great surprise, does not allow a function to call another which has not yet been defined, so mutually recursive functions can't be written straightforwardly (although declare can be used to forward-declare a name) - and the algorithm I'd designed depends crucially on mutually recursive functions. However, after a bit of flailing around, I remembered that Clojure does support dispatch in one function on different arities of arguments, and I was able to rewrite my two functions as different arity branches of the same function, which then compiled without difficulty.

The other trip was that map, in Clojure, is lazy. So when I tried to write my output using

(defn write-output
"Write this output, doing little orthographic tricks to make it look superficially
like real English text.
output: a sequence of tokens to write."
[output]
(map write-token output))

nothing at all was printed, and I couldn't understand why not. The solution is that you have to wrap that map in a call to dorun to force it to evaluate.

Aside from that, writing in Clojure was a total joy. Being able to test ideas quickly in a REPL ('Read Eval Print Loop') is a real benefit; and a clean functional language is so simple to write in, and data structures are so easy to build and walk.

Another thing Clojure makes much easier is unit tests. I got bogged down in the mutual recursion part of the Java problem and unit tests would have helped me - but I didn't write them because the bureaucratic superstructure is just so heavy. Writing unit tests should be a matter of a moment, and in Clojure it is.

I broke the Clojure implementation into four files/namespaces:

analyse.clj read in the input and compile it into a rule tree; more or less Tokeniser and Digester in milkwood-java;

core.clj essentially replaces Milkwood in milkwood-java; parses command line arguments and kicks off the process;

synthesise.clj compose and emit the output; broadly equivalent to Composer and Writer in milkwood-java;

utils.clj small utility functions. Among other things, contains the equivalent of Window in milkwood-java.

Additionally there are two test files, one each for analyse and synthesise, containing in total seven tests with eight assertions. Obviously this is not full test coverage; I wrote tests to test specific functions which I was uncertain about.

Conclusion

Obviously, all Java's bureaucracy does buy you something. It's a very strongly typed language; you can't (or at least it's very hard to) just pass things around without committing to exactly what they will be at compile time. That means that many problems will be caught at compile time. By contrast, many of the functions in my Clojure implementation depend on being passed suitable values and will break at run time if the values passed do not conform.

Also, of course, the JVM is optimised for Java. I've blogged quite a bit about optimising the JVM for functional languages; but, in the meantime, my Java implementation executes about seven times as fast as my Clojure implementation (though I'm timing from the shell, and I haven't yet instrumented how long the start-up time is for Java vs Clojure). Also, of course, I'm not an experienced Clojure hacker and some of the things I'm doing are very inefficient; Alioth's Clojure/Java figures suggest much less of a performance deficit. But if performance is what critically matters to you, it seems to me that the performance of Java is probably better, and you at least need to do some further investigation.

On the other hand, at bottom Java is fundamentally an Algol, which is to say it's fundamentally a bunch of hacks constructed around things people wanted to tell computers to do. It's a very developed Algol which has learned a great deal from the programming language experience over fifty years, but essentially it's just engineering. There's no profound underlying idea.

Clojure, on the other hand, is to a large extent pure Lambda calculus. It is much, much more elegant. It handles data much more elegantly. It is for me much more enjoyable to write.

Sunday, 3 November 2013

A stove is the heart of any home, particularly so at this time of year. A stove transmutes wood into heat. But heat comes in a number of forms, and we appreciate it in a number of ways. My stove provides me with toasty warm towels from my heated towel rail, when I step out of the bath. It provides me with the hot water for my bath. It provides me with my hot meals, my well cooked food. It heats my oven and bakes my cakes. And, most important of all, it keeps the whole of my house warm and comfortable. And all this for no fuel bills, save the labour of cutting the wood.

So what is this paragon, I hear you ask; how much, I hear you ask, does such a thing of wonder cost?

Well, for a start, it's not an Aga. Agas are, indeed, wonderful things (although I don't know how well they work on wood) but they're vastly out of my price league; an Aga would cost as much as my house. And, they're enormously heavy. Getting an Aga over the hill to my cabin would have been exceedingly difficult. So no, it's not an Aga. More surprisingly, it's not a Rayburn, either. I've installed second-hand Rayburns in every house I've owned until this one. Rayburns are indeed good, although they are not that good if you burn coal - it's too corrosive, and you end up having to replace the grate and firebricks every year. On wood, which is what I have, Rayburns are fine - a Rayburn would have been good. But at the time I built this house, even a second hand Rayburn was out of my budget.

Also, a Rayburn has a small hotplate - efficient, certainly, but small. A Rayburn oven does not have a window in its door, so you can't see how your cake is rising. A Rayburn's firebox is not adaptable. And, like the Aga, it's very heavy.

No, my stove is a thing called a 'Plamak', or 'Plamark' - it's Bulgarian, and in Bulgaria they use cyrillic script; it doesn't transliterate perfectly. Specifically, it's a Plamak B: B for boiler.

Back in the days of the old Soviet Union one could buy Moskvitch and Lada cars; Ural motorcycles; Zenit cameras. They were sturdy but crude, by Western standards. Simple, but very cheap, and they worked. My first car was a Moskvitch van. The Plamak is a little bit like that: honestly made, a little crude in places, but it works. Unlike an Aga or a Rayburn it's made of pressed steel - very nicely enamelled, but just pressed steel. The handles on the ovens and firebox are made of something like Bakelite. The rail across the front on which one can hang teatowels to dry isn't very sturdy and it's a little too close to the body of the stove for convenience. The hotplate is just a plate of steel sheet, and will probably, over time, corrode and need to be replaced. There's no insulated cover for the hotplate. The oven doesn't have a built-in thermometer (but it does have a window in the door, so you can easily put a thermometer inside). Unlike an Aga or a Rayburn, it doesn't have a lot of thermal mass, so when the fire goes down it cools quickly - if you're cooking something that needs a consistent temperature you need to pay attention, and feed it small logs frequently.

But, it has real good points.

The fire box has an extra, removable grate. In summer you can put this grate in, and it halves the size of the firebox, allowing you to cook more economically. In winter, obviously, you take it out. The hotplate is enormous - it will easily take half a dozen pans. There's a very simple flue control which switches the smoke path from across under the hotplate and up the chimney, to round under the oven, depending on what sort of cooking you want to do. And cleaning out that flue path under the oven is absurdly easy - you just lift out the oven floor.

It also burns exceedingly well. Frankly it's too big a stove for this little house - until I installed the big radiator and my heated towel rail, I couldn't effectively use the oven, because if I ran the stove hot enough to cook in the oven the hot water tank would boil. Now I can control that, by pumping heat out of the hot water circuit through the radiator (at cost, sometimes, of making the house too cosy - it can easily reach thirty degrees in the bedroom), and so I can bake. It does go through wood fairly quickly - two bucketfuls of logs in an evening - but in two hours it will heat enough hot water for two long, deep, hot baths.

All in all I'm enormously pleased with it. So, you ask, what does this paragon of a stove cost? Amazingly, three hundred and eighty pounds. Honestly, if you want a stove that cooks and heats water, get a Plamak B. It's a bargain.

Friday, 1 November 2013

Today's job was to get a continuous integration server set up and integrated with my Redmine project management system. Since I run Debian 6 on my server, and I prefer where possible to install from the official Debian packages, the Redmine version I'm running is 1.1, which is somewhat behind the curve. I had a look around at which continuous integration server to use. I've tentatively picked Jenkins, the more purist-open-source variant of the Hudson/Jenkins project. Reasons include: it's available in the Debian 7 distribution (but sadly not in Debian 6), and it has a plugin for Leiningen, which is my favourite build tool.

So... on to install, and there the fun began.

Installing Jenkins

As I said, Jenkins is not available in the Debian 6 distribution. However, the Jenkins project had set up their own Debian repository, so after adding their key and link to my system I was able to apt-get it. You'd have thought that would be all, but sadly no.
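For the record, the steps were roughly these (a sketch; pkg.jenkins-ci.org was the repository URL the Jenkins project published at the time - check their site for the current one):

```shell
# Add the Jenkins project's signing key and apt repository (sketch)
wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | apt-key add -
echo "deb http://pkg.jenkins-ci.org/debian binary/" > /etc/apt/sources.list.d/jenkins.list

# Then install as normal
apt-get update
apt-get install jenkins
```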

The Jenkins package, as packaged by Jenkins, does not depend on either Tomcat or Jetty. Instead, it assumes you will be serving no other web-apps and tries to install its own servlet engine (I think Jetty, but to be honest I was too annoyed to check before taking it off again). Obviously, I do have other web-apps, so this didn't work for me. However, I copied the WAR file from the Jenkins release into /var/lib/tomcat6/webapps, and, of course, being a web-app, it just worked...

Except it didn't. Jenkins expects to have some space of its own to write to, outside the servlet engine sandbox. That is, in my opinion, bad behaviour. Specifically it expects to be able to create a directory /usr/share/tomcat6/.jenkins, which is bad in two ways: it writes to a directory to which, for security reasons, Tomcat damned well should NOT be able to write, and it creates a hidden directory which a naive administrator might not notice and which consequently might not be backed up.

After some thought I decided to put Jenkins' writable space in /var/local.
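Setting that up amounted to something along these lines (a sketch rather than the exact commands; it assumes Debian's tomcat6 user and group, and passing JENKINS_HOME to Tomcat as a system property):

```shell
# Create a writable home for Jenkins outside the servlet sandbox
mkdir -p /var/local/jenkins
chown tomcat6:tomcat6 /var/local/jenkins

# Point Jenkins at it by setting JENKINS_HOME in Tomcat's environment
echo 'JAVA_OPTS="$JAVA_OPTS -DJENKINS_HOME=/var/local/jenkins"' >> /etc/default/tomcat6
/etc/init.d/tomcat6 restart
```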

Configuring Jenkins for even modest security, however, was a complete bitch.

Jenkins has five different authentication models:

It can have authentication switched off entirely. Anyone can do anything... No. Not going to happen, on an Internet facing server.

It can delegate authentication to the servlet engine. I'm not wonderfully happy about that, because administering Tomcat users is a bit of a pain.

It can use LDAP... if you have an LDAP server, which I don't.

It can delegate authentication to the underlying UN*X system, but only if the servlet engine can read /etc/shadow! There's NO WAY I'm permitting that.

It can run its own internal authentication... you'd think that was the obvious one. But as soon as you've selected that option, you're locked out and cannot proceed further.

Fortunately, you can completely reinitialise Jenkins by deleting everything under its home directory and rebooting Tomcat; it then proceeds to reinstall a default set of files, and you get a new, empty Jenkins.
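In practice the reset boils down to something like this (a sketch, assuming JENKINS_HOME is /var/local/jenkins as set up earlier):

```shell
# Wipe Jenkins' state so it reinitialises with a default configuration
rm -rf /var/local/jenkins/*
/etc/init.d/tomcat6 restart
```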

But, you can't add people to Jenkins until you've configured 'enable security' and chosen one of the security models. So, first, configure 'Security Realm' to 'Jenkins's own user database', and remember to tick 'Allow users to sign up'.

Then, sign up. That bit's easy, it prompts you.

Then, you need an authorisation strategy. There are five to choose from:

Anyone can do anything (aye, right!)

'Legacy mode' (only 'admin' can do anything)

Logged-in users can do anything

Matrix-based security

Project-based Matrix Authorization Strategy

If you tick 'Matrix-based security' or 'Project-based Matrix Authorization Strategy' and click 'Save', you're locked out again and have to go back to deleting everything in the home directory, rebooting and starting again.

After ticking either 'Matrix-based security' or 'Project-based Matrix Authorization Strategy' (which are, frankly, the only authorisation strategies which make sense), you MUST tick the box which allows the group 'Anonymous' to 'Administer' BEFORE you do anything else. Otherwise, you're stuffed.

So then you try to add a security group, and, wait, you can't. You're stuffed. The 'internal' security model does not have groups, so you must add yourself - your own user ID - to the security matrix, give yourself permission to administer, and then save, and then revoke 'anonymous' permission to administer, and save. Otherwise any Johnny hacker out there in Netland can come along and pwn your server.

To be fair, there are plugins available to add a number of additional authentication methods, including OpenID. I haven't tried these.

Integrating with Redmine

Now, integrating Redmine with Jenkins. Recall that Jenkins is a fork of the Hudson project; they're still pretty similar, and although there isn't a Redmine plugin specifically for Jenkins, there is one for Hudson. I installed that, and on initial testing it appears to work. I wanted to do the integration from the Redmine end, because Redmine does work for me as a project management tool, and I don't yet know whether I shall stick to Jenkins. But the alternative would have been to install a Redmine plugin into Jenkins - that exists; and, indeed, I may install it, as well, since it seems to have some useful functionality.

However, all this still left one gaping hole. Both my Redmine installation and my Jenkins installation were running over plain old-fashioned HTTP, which means I was passing passwords in plain text over the wire. That's asking for trouble: a continuous integration server, simply in the nature of the beast, can do pretty extensive things, and would be a wonderful tool for an attacker to control. So I set up HTTPS using a self-signed certificate (I know, but I don't need a better one), configured Tomcat to communicate only locally over AJP, and then configured the Apache2 HTTP daemon to forward appropriate requests received over HTTPS to Tomcat via AJP, using mod_jk.
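The Apache side of that amounts to an AJP worker in workers.properties (type ajp13, host localhost, port 8009) plus a JkMount directive in the HTTPS virtual host. A sketch, with illustrative certificate paths and worker name:

```apache
<VirtualHost *:443>
    SSLEngine on
    # self-signed certificate, as described above (illustrative paths)
    SSLCertificateFile    /etc/ssl/certs/server.pem
    SSLCertificateKeyFile /etc/ssl/private/server.key

    # hand Jenkins requests to Tomcat over AJP via mod_jk
    JkMount /jenkins/* tomcat
</VirtualHost>
```

Redmine itself is served by Apache directly; only the Jenkins paths need forwarding to Tomcat.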

So far so good.

Still to do

I need to integrate Jenkins with Git; I've downloaded the plugins (and downloading and installing plugins for Jenkins is extremely straightforward) but I've yet to configure them.