The dream of inexpensive computing for everyone has been with us since
the first computers. Along the way it has taken some unexpected turns.
This article summarizes key trends and a few of the surprises.

We started off with computers the size of buildings, machines that only a government at war could justify.

Not really. Building-scale computers were quite rare, and they showed up a bit later than the very beginnings: the Z3 and Colossus were somewhere between the size of a wall unit (a few wardrobes) and a small room. The Z3, at least, looks like something a dedicated individual backed by a patron could have built, and indeed, that's essentially how the Z2 was made.

Many other machines from that era were similar; the wardrobe-to-room size seems to have been the more typical case in general, also through the 1950s. I suppose the much-publicized ENIAC really did fit the "building" (and cost) perception, which is how that image became established in the public imagination over the decades.

Overall, history wasn't as "linear" as you paint it in the march toward "embedded": don't forget that the first x86 CPUs (and the first microprocessors in general) were aimed at exactly such scenarios, while, ironically, the first ARMs were designed for desktop machines.

BTW, this reminded me of a Wikipedia article: http://en.wikipedia.org/wiki/Microcomputer_revolution#The_Home_Comp... - we apparently predicted that a central computer would control the home and its appliances. What we failed to anticipate is that the connections and the software would be the really expensive and/or hard part. Meanwhile, computers became so inexpensive that each appliance can have its own, with the bonus of keeping the software single-purpose and simple (though we still haven't really tackled the issue of interoperation).
By my rough estimate, there are around 20 processors in the room I'm sitting in, and only one PC.