Years ago, when I was working in Ireland, I developed the habit of marking anything not yet configured, whether in a configuration file or in code, with the string "here be dragons".
[mail servers]
pop="pop.mycompany.com"
smtp="here be dragons"
This had the simple advantage that one could search all config files for "dragons" to find items that still needed configuring (maybe the SMTP server wasn't known at install time but is now).
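A plain "grep -rn dragons" over the config directory does the job, but the search is also easy to automate. Here is a minimal Python sketch; the directory, file extensions, and marker string are only assumptions to adapt to your own setup:

import os

MARKER = "here be dragons"  # the "not configured yet" marker

def find_dragons(root):
    # Walk the tree and report every config line still marked as unconfigured.
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith((".conf", ".ini", ".cfg")):  # assumed extensions
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="replace") as f:
                for lineno, line in enumerate(f, start=1):
                    if MARKER in line:
                        print(f"{path}:{lineno}: {line.strip()}")

find_dragons("/etc/mycompany")  # hypothetical config directory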
After a while I, and later others, started referring to the state in which an application was installed but not yet fully configured as "having dragons", as in "Module X is installed but some dragons remain".
Now, in Switzerland, another interesting term developed, more or less by accident. Where English uses "textbook" to mean "ideal" (as in "a textbook landing"), German uses "picture book" ("Bilderbuch"). The people who install our application in the test environment have a habit of taking screenshots of anything that doesn't work or confuses them during installation. (That is very useful.) They then send us the screenshots so we can resolve those issues or explain them away.
And now an installation routine that is particularly buggy is referred to as a "picture book installation" (again, think "textbook installation").

Stack Overflow – like most online communities I've studied – naturally trends toward increased strictness over time. It's primarily a defense mechanism, an immune system of the sort a child develops after first entering school or daycare and being exposed to the wide, wide world of everyday sne...

All activities that make one think help with other activities that require thinking.
Some people study languages for fun, just to keep their brains busy. Others paint or learn to play a musical instrument. Learning to code is not only a fun hobby (for those who like it); it also trains the brain and can be useful.
Not everyone who learns French needs to become an interpreter. Not everyone who attends pottery class will try a career in the tableware industry. So why shouldn't people learn to code without it being part of a career as a programmer?

The whole "everyone should learn programming" meme has gotten so out of control that the mayor of New York City actually vowed to learn to code in 2012. A noble gesture to garner the NYC tech community vote, for sure, but if the mayor of New York City actually needs to sling JavaScript co...

Amazon simply decided not to display footnotes correctly. I have complained about that several times, but Amazon ignore complaints.
The Kindle is basically useless for scientific books or any books that rely on footnotes. I don't know why Amazon are trying to destroy the eBook format.
I stopped buying eBooks from Amazon and now use the Kindle for my existing books and for non-Amazon eBooks.
I don't see what Amazon have to gain from destroying the reading experience like that. It's not like making the footnotes difficult to reach makes "piracy" more difficult or anything like that.

I adore words, but let's face it: books suck. More specifically, so many beautiful ideas have been helplessly trapped in physical made-of-atoms books for the last few centuries. How do books suck? Let me count the ways: They are heavy. They take up too much space. They have to be printed. ...

"Jakob Nielsen wrote about 300 DPI displays back in 1997: http://www.useit.com/alertbox/9703b.html"
This is off-topic, but just too interesting. From the article:
"Use hypertext to split up long information into multiple pages"
That is bad. That is really, really bad. He got it terribly wrong, even though he wrote the article at a time when PCs were as badly connected to the net as mobile devices are now.
I absolutely hate loading a Web page for a minute or two (depending on location) and then finding, after a minute of reading, that I have to load _another_ page (and another, and another). The ridiculous habit of pretending that a Web page is like a page in a book or magazine is, imho, one of the worst features of the Web today.
Good thing Jeff's blog is not like that.

What was Microsoft's original mission? In 1975, Gates and Allen form a partnership called Microsoft. Like most startups, Microsoft begins small, but has a huge vision – a computer on every desktop and in every home. The existential crisis facing Microsoft is that they achieved their missi...

I don't see a move from the PC to a "Post PC Era". To me it looks as if the computing world is constantly trying to figure out what the ratio between computers and users should be.
Every ten years or so somebody notices that it is inefficient or somehow not good enough if [only one user uses a given computer|a given computer is used by more than one user] and advocates that instead [many users should use one computer|every user should get his own computer].
In the '60s and '70s we called it the mainframe; now we call it the cloud. The clients are more sophisticated (the iPads and phones and the like, not the users) and the cloud consists of many networked computers rather than one big one, but the effect is the same: many users use one computer (system) again. (In the '90s the expected shift from many computers to one computer system failed because of a mismatch between user expectations and available connectivity technology. But we did get the Web.)
I think that perhaps the entire history of computers can be explained as a constant struggle: to make one computer system support more than one user whenever every user had his own computer, and to give every user his own computer whenever one computer system was shared by many users. This resulted in more computers and more users, because we always added but never subtracted.
