Tuesday, May 4, 2010

Consider these everyday examples: When your daughter goes to school in the morning, she puts in her backpack the things she needs for the day; that’s prefetching and caching. When your son loses his mittens, you suggest he retrace his steps; that’s backtracking. At what point do you stop renting skis and buy yourself a pair? That’s online algorithms. Which line do you stand in at the supermarket? That’s performance modeling for multi-server systems. Why does your telephone still work during a power outage? That’s independence of failure and redundancy in design. How do Completely Automated Public Turing tests to tell Computers and Humans Apart, or CAPTCHAs, authenticate humans? That’s exploiting the difficulty of solving hard AI problems to foil computing agents.
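The ski-rental question above is the textbook introduction to online algorithms, and it has a crisp answer worth sketching. The classic "break-even" strategy rents until the rent paid would reach the purchase price, then buys; whatever the season turns out to be, you never pay more than about twice what a clairvoyant skier would. The prices and function names below are illustrative, not from any particular source:

```python
def online_cost(season_length, rent=1, buy=10):
    """Break-even strategy: rent while total rent paid stays below the
    purchase price; on the day it would reach it, buy instead.
    Decisions are made day by day, without knowing season_length."""
    paid = 0
    for _day in range(season_length):
        if paid + rent >= buy:
            return paid + buy  # buy now and ski free thereafter
        paid += rent
    return paid  # season ended before buying was worthwhile

def offline_cost(season_length, rent=1, buy=10):
    """Clairvoyant optimum: knows the season length in advance."""
    return min(season_length * rent, buy)
```

For a short season (5 days at rent 1) both strategies pay 5; for a long one (20 days) the online skier pays 19 against the optimum's 10, staying within the factor-of-two guarantee.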

Computational thinking will have become ingrained in everyone’s lives when words like algorithm and precondition are part of everyone’s vocabulary; when nondeterminism and garbage collection take on the meanings used by computer scientists; and when trees are drawn upside down.

But wait — didn't you just say we're using "computational thinking" already, even without the fancy vocabulary? And when you express the desire that the language of computational thinking enter the general public’s word-hoard, aren't you forgetting that much of the terminology of computation was itself borrowed from everyday life? Children were “backtracking” for their mittens — and hikers to discover missed forks in their paths — long before there were computers.

3 comments:

The difference between backtracking "then" and backtracking "now" is that we have actual theory with which to talk about backtracking. Gravity didn't start with Newton, nor did reasoning about it, but the connection between a falling ball and the circling moon wasn't formally understood until his work. Perhaps that was gravitational thinking as opposed to mere thinking?

Language is code... social and neural networks function much like computer networks... thinking is computational thinking...

In the past we had a name for such reductionistic equivalencies: "lazy thinking."

Rather than insisting that we are more like computers (read: more inhuman) than we imagine, we should rather marvel at how deeply human the whole phenomenon of computing is. (Code, after all, is a product of human brains.)



About

Commentary on technologies of reading, writing, research, and, generally, knowledge. As these technologies change and develop, what do we lose, what do we gain, what is (fundamentally or trivially) altered? And, not least, what's fun?