… Moreover, the PC revolution has saturated the market at any accessible price point. That is, anyone who needs and can afford a PC has now got one…

Uhhh, no. PCs are not cheap. Not at all. The iPad is cheap [3], but PCs are very expensive.

Yes, you can buy a “PC” for a pittance. It makes a crummy boat anchor though. If you want it to do something useful you need to buy internet service. Where I live that’s about $600 a year – year after year. Unless you bought a Mac, or are geek enough to go without, you need to buy antiviral software. In theory you also need to pay $150 or so for Microsoft Office. And good luck with backup.

But that’s not the real cost.

The real cost is that you need an IQ-equivalent of 110 or higher, and a love of debugging and troubleshooting. For most of the population, that’s absolutely unaffordable.

So Charlie got this one point wrong – but it only strengthens his overall argument. My four month old quad core iMac running 10.6 is an anachronism [2]. Its era is passing. Welcome to the third era of the personal computer.

[1] I thought things would blow up in 2009. Didn’t happen! Microsoft dropped the price of XP to about nothing and crawled back enough control of the netbook to stun the market (same thing they did with Palm in the 90s by the way). It’s still going to happen, but that’s not the first time I’ve been wrong on transition times. I’ve since learned to take my time estimates for technology transitions and triple them.

[2] Charlie also omits the role Digital Rights Management (DRM) plays in driving this transition. DRM is one of the reasons there’s so much good software being produced for the iPhone. Your CDs may be worth money some day.

[3] Not least because of the pay-as-you-go capped data plan. That’s as big a deal as the device. Yes, I know iPads require a PC-as-peripheral, but that will change within the year.

[4] Of course that’s what the original Mac was – the “computer for the rest of us”. Closed architecture. All applications were to be vetted by Apple. Strict UI standards. Heavy investments in usability and design. Single button mouse. It worked too – it really was easy to use. Much easier to use than OS X. Almost as easy to use as the iPad. History doesn’t repeat, but sometimes it spirals.

We now know how crummy the Soviet military infrastructure was before the collapse of the USSR. It’s likely that North Korea’s is in much worse shape. It’s likely their submariners are desperate and ill-trained. It’s a setup for an accident, or for a crazed officer to do something very stupid.

Would the submarine officers confess to having screwed up? In North Korea that would probably be a death sentence – or worse.

My money is on blunder.

Now it’s all about China, which has huge investments in North Korea. It’s all about whether China will decide that North Korea has to end, and, if so, on what terms and timeline.

For a long time my view on the euro has been that it may well have been a mistake, but that bygones were bygones — it could not be undone…

…but what if the bank runs and financial crisis happen anyway? In that case the marginal cost of leaving falls dramatically, and in fact the decision may effectively be taken out of policymakers’ hands…

…if Greece is in effect forced out of the euro, what happens to other shaky members?

Since then I’ve been chipping at the list, looking for the cause of the cause of the cause (etc – go too deep and it’s all entropy). Sure we’ve got above average corruption and economic financialization, but those tendencies have always been with us. This feels like something novel, something that, in modern times, has come along every century or so. (In deep history every 2,000 years or so.)

The Rise of China and India (RCI) has been like strapping a jet engine with a buggy throttle onto a dune buggy. We can go real fast, but we can also get airborne – without wings. Think about the disruption of German unification – and multiply that ten thousand times.

RCI would probably have caused a Great Recession even without any technological transformations.

Except we have had technological transformation – and it’s far from over. I don’t think we can understand what IT has done to our world – we’re too embedded in the change and too much of it is invisible. When the cost of transportation fell dramatically we could see the railroad tracks. When the cost of information generation and communication fell by a thousandfold it was invisible.

... I'd like to point you at this 2005 paper by the Authors' Licensing and Collecting Society, titled "What are Words Worth?", describing the findings of a study organized by the Centre for Intellectual Property Policy & Management (CIPPM), Bournemouth University. ...

... restricting the survey to focus on main-income authors (those who earned over 50% of their income from writing) gave median earnings of £23,000 and mean earnings of £41,186.

... the researchers went on to calculate a Gini coefficient for authors' incomes — a measure of income inequality, where 0.0 means everyone takes an identical slice of the combined cake, and 1.0 indicates that a single individual takes all the cake and everyone else starves. Let me provide a yardstick: the UK had a Gini coefficient of 0.36 in 2009, the widest ever gap between rich and poor — while the USA, at 0.408, had the most unequal income distribution in the entire developed world. The Gini coefficient among writers in the UK in 2004-05 was a whopping great 0.74...
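For the curious, here's what that 0.0-to-1.0 scale actually computes. This is a minimal sketch with made-up income figures (not the CIPPM survey data), using the standard sorted-form of the Gini formula:

```python
def gini(incomes):
    """Gini coefficient: 0.0 = perfect equality, 1.0 = one person takes all.

    Uses the sorted-form identity G = (2 * sum(i * x_i)) / (n * total) - (n + 1) / n,
    where x_i are incomes sorted ascending and i runs from 1 to n.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))       # identical slices of cake -> 0.0
print(gini([0, 0, 0, 100]))     # one person takes all -> 0.75
```

Note the second result: with only four people, "one person takes everything" yields (n - 1) / n = 0.75, approaching 1.0 only as the population grows. A 0.74 among thousands of working writers is therefore a spectacularly lopsided distribution.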

... In addition to being a wildly unstable, lonely occupation with an insane income spread, there are other drawbacks to being a writer. Many American writers are forced to rely on a day job, or a spouse with a day job, for health insurance: health insurance for the self-employed is prohibitively expensive, especially for the self-employed poor. Those who don't have a job that provides healthcare, or a partner with family benefits, are never more than one accident away from bankruptcy. As the median age for publishing a first novel is around 34 because it takes a lot of life experience before you know enough to write something worth publishing, most authors are in the age range 34-70 — old enough that they're likely to develop chronic health conditions or need expensive treatments. (To be fair, it's not just authors who get the short end of this particular shitty stick: I suspect the US health insurance industry is actively suppressive of entrepreneurial start-up ventures by older folks in general.)...

...So here's the truth about the writing lifestyle: it sucks. It is an unstable occupation for self-employed middle-aged entrepreneurs. Average age on entry is around 34, but you can't get health insurance (if you're American).... As a business, it's a dead-end: you can't generally expand by taking on employees, and the number of author start-ups where the founders have IPOd and cashed out can be counted on the fingers of a double-amputee's hands...

I've read Stross for years - in the small science fiction/fantasy world he's a modern giant. He is an extremely smart man and I believe that he works very hard. Although he's a relatively successful science fiction writer, if he wanted money he'd be working for Goldman Sachs.

Clearly Charles Stross has been cursed with the writer's obsession and he deserves our sympathies as well as our thanks. Maybe it was something he did in a past life. I've put "The Revolution Business" on my Amazon cart today. It's the least I can do.

Beyond the dismal reality of the 21st century wordsmith, there are other noteworthy insights in the essay (read the entire work of course). I agree with Charlie that "US health insurance industry is actively suppressive of entrepreneurial start-up ventures by older folks"; I think that's going to change thanks to MY President.

Most significantly for the rest of us, we know fiction writing, like acting and sports, is a "winner take all" form of work. A small fraction of writers take home a vast majority of the earnings. In an interconnected world, where work can flow easily, it's conceivable this will become widespread among all knowledge workers. Really, you do want your babies to grow up to be cowboys.

A few weeks ago I said Farewell Palm. Now HP has paid $1.2 billion in cash to acquire Palm ($5.70 a share).

It's good news for those who bought Palm stock in the past few weeks, but it's no reason to consider buying a PalmOS device. Whatever Palm was yesterday, it's now being digested by a very average large publicly traded company. Palm is now HP.

An average PTC like HP can compete effectively against other clumsy but powerful PTCs like IBM, Dell, RIM, and Microsoft. HP is capable of turning out devices that are every bit as good as Windows Mobile phones of 2008.

Except it's not 2008, and the competition is not RIM or Microsoft or Dell. The competition is Google and Apple.

Google and Apple are also publicly traded companies. They are not typical however. They are very deviant. Google has an underestimated two-tier ownership structure that gives great power to its founders. Apple has Steve Jobs, who in addition to being an insane genius with mind-control powers is also Apple's founder and has cult-like authority over the company and its shareholders. Both Google and Apple behave like privately held companies with public money.

Palm is still dead. I don't know why HP did this deal. Maybe it was all IP, but they paid a lot for IP. I think they hope to stay in the only game in town. It won't work; there's no room for them at the table.

This is about Google and Apple. Microsoft will take 3rd place. RIM will fall by the wayside within three years. HP won't last a year. They can't compete.

Within a year Apple will make MobileMe an iPhone back-end peripheral. When they do that iPhone users will not need a local copy of iTunes (PC or Mac), and they won't need an ISP. After that Apple will make MobileMe into a Facebook competitor.

Next year's iPad will come with a starter MobileMe subscription.

[1] I'm a dreadful prognosticator, but it doesn't stop me. I'm often right in the long run, but usually premature.

.... That meeting concluded with Mitchell saying: “You asked if I think Netanyahu is serious. They ask the same question. You are an expert on Palestinian and Israeli politics. They are the same. But no one in the world knows American politics better than me, and this I will say. There has never been in the White House a president that is so committed on this issue, including Clinton who is a personal friend, and there will never be, at least not in the lifetime of anyone in this room." ...

... The establishment of well-connected walking and bicycling networks is an important component for livable communities, and their design should be a part of Federal-aid project developments. ... transportation agencies should plan, fund, and implement improvements to their walking and bicycling networks, including linkages to transit...

... transportation agencies should give the same priority to walking and bicycling as is given to other transportation modes. Walking and bicycling should not be an afterthought in roadway design

... children should have safe and convenient options for walking or bicycling to school and parks...

... DOT encourages bicycle and pedestrian accommodation on bridge projects including facilities on limited-access bridges with connections to streets or paths...

... Current maintenance provisions require pedestrian facilities built with Federal funds to be maintained in the same manner as other roadway assets. State Agencies have generally established levels of service on various routes especially as related to snow and ice events...

... The Secretary shall not approve any project or take any regulatory action under this title that will result in the severance of an existing major route or have significant adverse impact on the safety for nonmotorized transportation traffic and light motorcycles, unless such project or regulatory action provides for a reasonable alternate route or such a route exists." 23 U.S.C. 109(m)....

There is a difference between Obama and the GOP alternative. A vast, huge, multifaceted, every day in every way difference.

Anyone who thinks otherwise is a willing servant of Rupert Murdoch, owner operator of the Wall Street Journal and Fox news.

Melvyn and Shula do not have the best chemistry during the In Our Time program The Rise and Fall of the Zulu Nation. I can see why Melvyn was peevish, but it's a bit of a shame. I'm sympathetic to Marks' notion that the emergence of Shaka Zulu was more chance than destiny; a contingent result of swirling change and disruption driven, fundamentally, by the technologies of innovative agriculture and consequent rapid population growth and Malthusian collapse.

That, however, was too much subtlety for 15 minutes of Shaka, for there was a lot of ground to cover in one 48 minute program. Even in this quick overview it's clear the history of the consequent fallings and risings of the Boer, Zulu, and British is immensely complex, full of chance and personality and mostly unknown.

So it is with history. Endless stories, of which we know only a tiny number. There must be many more, perhaps more grand and sad than any we know, lost in deep time.

Lost, but, in a sense, not unknown. History is fractal. The stories we know in detail are similar to those we know in outline are similar to those we know in myth, and are very likely similar to those we don't know at all. If we are wise enough to realize that history is fractal, we can study closely the history we know and learn universal truths. If we are foolish enough to believe our stories are unique, we walk the path of willful ignorance.

Monday, April 26, 2010

CrashPlan gets great press and even a Tidbits Take Control recommendation, but when I used it I ran into numerous fundamental flaws. Clearly, I can't rely on reviewers.

From that and similar experiences, here are Gordon's Rules of Engagement for software and services.

Desktop software

Is there obnoxious DRM? (Some DRM is understandable, but it shouldn't be obnoxious.)

If distributed on CD, can the product be used without the CD in the drive?

Look at the installer. Drag and Drop is fine, but if it needs an installer it better be Apple's installer.

Inspect the uninstaller. The best apps don't need one - just delete the app. Failing that, look for an uninstaller built into the app, then for one that downloads with the app. If the app needs an uninstaller and there isn't one, stop immediately.

If it's software, is there a full-feature trial period? Limited-feature trials are worthless. I need at least a month, or, better, 10 days of use (which may take me months).

Who makes the product? What's their support site like? Can you find downloadable fixes?

Cloud services

Is it obvious how to delete your account and all data and services?

Do they want your Google credentials? If so, run and bar the door.

Do they support Oauth? Do they allow you to have multiple Oauth credentials associated with your account? Extra points for each.

Do they require a security question? If so, they're stupid. (Yes, even Google is a bit stupid these days - but they don't REQUIRE it.)

If there are annual renewals, is there an option to request approval prior to renewal?

Desktop or Cloud

Is there a high quality manual and/or help resource? It doesn't matter whether you're going to read it or not. Products with good manuals are almost always good products. It's a very reliable quality measure.

Is there a blog? Are the developers proud of their work?

Notice there's nothing in here about features, reviews, price, performance, etc. They only matter if a product passes the above screening tests. In fact it's rare for a product to pass all of the relevant tests and then fail due to bugs or performance. A vendor who can do the above can usually do the product as well.

In a Greg Egan story I can't currently locate (I'd thought it was Teranesia; h/t Mel Anderson, comments), the protagonist is fighting the ultimate infection. It seems impossibly mutable. Turns out it has evolved to exploit quantum effects, and it's finding the perfect mutation by exploring all the many worlds of variation.

That wasn't the only science fiction story of the past decade to imagine that biological organisms, operating at atomic scales, might exploit quantum effects. Alas, science fiction memes don't last long these days. Protein exploitation of quantum effects has become a mainstream research topic. This Nov 2009 Sci Am news article is a good overview of the underlying physics; note especially the resolution to the old debate about how the quantum/classical transition happens ...

... In the modern view that has gained traction in the past decade, you don’t see quantum effects in everyday life not because you are big, per se, but because those effects are camouflaged by their own sheer complexity. They are there if you know how to look, and physicists have been realizing that they show up in the macroscopic world more than they thought...

... This work suggests that, contrary to conventional wisdom, entanglement can persist in large, warm systems—including living organisms. “This opens the door to the possibility that entanglement could play a role in, or be a resource for, biological systems,” says Mohan Sarovar of the University of California, Berkeley, who recently found that entanglement may aid photosynthesis ... In the magnetism-sensitive molecule that birds may use as compasses, Vedral, Elisabeth Rieper, also at Singapore, and their colleagues discovered that electrons manage to remain entangled 10 to 100 times longer than the standard formulas predict...

... In the average adult human, the brain represents about 2% of the body weight. Remarkably, despite its relatively small size, the brain accounts for about 20% of the oxygen and, hence, calories consumed by the body (1). This high rate of metabolism is remarkably constant despite widely varying mental and motoric activity...

Later articles suggest that while brains use a lot of calories, and are thus a very expensive evolutionary development, they don't use the calories for thinking. Brain calories are primarily consumed in "intrinsic activites" unrelated to environmental stimulus -- presumably maintenance functions of some sort.

So we knowledge workers don't get any caloric credit for thinking, and since we usually think while sitting (very bad) we're really pro blubber.

[1] We also know human brains are smaller and probably more efficient than they used to be. Of course, so are computers.

Friday, April 23, 2010

Something has changed over the past year with our iPhone voice services around our home (Macalester-Groveland, St Paul, Minnesota) and, to a lesser extent, along my commute (St. Paul to Roseville).

Voice services have gone from mediocre to intolerably bad. The “bars” are meaningless; even with 3-4 “bars” connection failures and call drops are ubiquitous. We can’t reliably make or receive calls from our home, which is rather a problem since we don’t have long distance landline service.

Curiously, data services (3G) are doing well. We’ve tried turning off 3G services at home (EDGE only) to see if voice connections improve, but that doesn’t help enough. I suspect local AT&T carrying capacity is the problem, not wireless signal.

We rely on our iPhones for a lot of things, but we need voice services.

I understand that iPhone-class technology, and the absurdity of flat-rate data service pricing, has put a great deal of strain on AT&T’s networks. I understand that our community is resistant to installing new cell towers. That understanding does not translate into sympathy. Our family is paying, I’m chastened to confess, thousands of dollars for services AT&T is not delivering.

There are things AT&T could do. They could end their insane flat-rate pricing, and institute pay-per-use bundles with similar value but user-aligned incentives. They could provide us with a free AT&T 3G MicroCell (aka femtocell) along with discounts for use. They could give us substantial discounts on their mobile services pending a fix.

They’re not doing any of these things. Instead of providing free MicroCells and discounted services, for example, AT&T charges for their femtocell solution.

I miss the days when it was possible to initiate class action lawsuits for failure to deliver contracted services.

If some other company gets iPhones in June our family will switch. We have only one under-contract iPhone and we can sell it and pay the AT&T penalty.

If no other company gets iPhones, I will find out how good Droid really is.

… We see that the coverage around your home is considered to be our best coverage and it includes 3G service. We also see that there is a planned tower about 2 miles from your home at Osceola and Lexington Parkway S. This tower is slated to be operational mid August 2010…

…Please contact Customer Care at 611 from a cell phone or at 1-800-331-0500 from landline phone if the problem persists. Have your wireless device available to allow for proper troubleshooting if problems persist…

So even though our coverage is failing on multiple phones, AT&T considers it to be pretty good. There’s something wrong there.

The planned tower is too far away to help us directly, but it’s close enough to reduce the burden on our proximal towers. On the other hand by August we’ll have the 2010 iPhone and the 3G iPad – so any additional local AT&T capacity will be swamped.

When we think about science, most of us think of dramatic breakthroughs. We think Darwin and Wallace, Einstein and Bohr, Copernicus and Curie and we imagine everything changed overnight.

Most science, however, develops in bits and pieces, twisting and turning, waxing and waning, until, after thirty years, things are new. Even the dramatic shifts, like natural selection, took decades to get from radical to mainstream.

If you’re at all curious about things, you notice this in a single lifespan. Consider deep history; the story of humans from 150K to 3K years ago. In the past 30 years discoveries from genomics, climate research, linguistics, plant research, translation, anthropology and archaeology, combined with the revision of old biases, have dramatically changed our understanding of deep history. In each case, of course, computation has been a fundamental driver. That’s how it works – new instruments make new science.

It’s been growing slowly from all directions, but the sum is a very different world from what some of us learned in the 1970s. The human brain is evolving and changing far more dramatically than we imagined, and that evolution has not slowed with modernity. Our concepts of human speciation are being transformed; there were many “species” of human coexisting into deep history – and, like dogs and wolves, they probably crossed often.

Pre-agricultural humans were far more populous and widespread than we once imagined; the large populations of pre-invasion (early agricultural and hunter-gatherer) North America probably reflect worldwide pre-agricultural patterns.

Even after the development of agriculture and writing we see thousand year intervals of relative stasis in China, Egypt and Mesopotamia. How could this be when our fundamental technologies change in decades? Are the minds of modern Egyptians radically different from the minds of only 6,000 years ago? Why? Why do we see this graph at this time in human history?

Thursday, April 22, 2010

My kids need some typing software. I remember a "Broderbund" "Mavis Beacon" product from eons past, so I started looking into the current state of the product. It took me a while to sort things out; and along the way I was reminded of how much was lost in the tech bubble of the 1990s.

It turns out there's now a Mac-only "Mavis Beacon" product sold by "Software MacKiev" (Ukrainian Mac contract software development) and another product (XP/Mac) sold by "Encore Software". The Encore Mac product is buggy and unsupported; the MacKiev version sounds a bit more promising.

What a mess. How did that happen? For that matter, what happened to all of the pretty good educational software produced by all of those companies? How did it all die, without true replacements (the closest things today are Flash apps on commercial sites that do child marketing)?

Part of the answer is that these companies had a good skillset for computer gaming, and that was a much bigger industry. Another part of the answer comes from the Wikipedia article on Broderbund (1980s history here) [1], (emphases mine, remember those who got cash were the winners) ...

... Softkey ...purchased The Learning Company for $606 million in cash and then adopted its name...Brøderbund was purchased by The Learning Company in 1998 for about US$420 million in stock...

In a move to rationalize costs, The Learning Company promptly terminated 500 employees at Brøderbund the same year,[16] representing 42% of the company's workforce.

Then in 1999 the combined company was bought by Mattel for $3.6 billion ... Jill Barad, the [Mattel] CEO, ended up being forced out in a climate of investor outrage.

Mattel then gave away The Learning Company in September 2000 to Gores Technology Group, a private acquisitions firm, for a share of whatever Gores could obtain by selling the company. In 2001, Gores sold The Learning Company's entertainment holdings to Ubisoft, and most of the other holdings, including the Brøderbund name, to Irish company Riverdeep.[19] Currently, all of Brøderbund's games, such as the Myst series, are published by Ubisoft...

This kind of churn is death to software. Software needs continuity to survive. The cycle of acquisition and 'rationalization' creates zombie software that staggers on, brainless, for years ... then dies.

The tech bubble made a few people rich, and it destroyed a lot of good products. Not to mention costing Mattel's shareholders quite a few pennies.

After the tech bubble burst came 9/11, then the great asset bubble and, not least, the Bush administration. One, two, three, four. No wonder America is reeling.

[1] At one time Mavis Beacon was sold under the Broderbund name, but by that time Broderbund might have been owned by Riverdeep. I include this story as an example of all the things the tech bubble killed.

Update 4/30/10: I bought the MacKiev product. It came with a solid, richly photographed manual with the name "Broderbund" on the front cover. This team is proud of their work. The manual included some interesting background on how Software MacKiev ended up doing their own OS X version:

Mavis Beacon Teaches Typing was created more than twenty years ago, and was first published in 1987. Software MacKiev’s involvement goes back to 1998 when our company developed version 9 for the Macintosh — both the US and UK editions. Then, a decade later, we had the opportunity to get involved with Mavis Beacon again — this time as the developer and publisher of a new generation of Mavis Beacon software for Mac OS X. We are so pleased and proud to be bringing the kind of quality you’ve come to expect from the creative labs of Software MacKiev to this new edition.

Update 6/8/2010: I've received a few more details. The "Encore" versions of Mavis Beacon for OS X are arguably fraudulent. They're not designed for the current OS. Looks like "Encore" bought up some discarded software assets ...

... Software MacKiev develops and publishes only the Mavis Beacon Teaches Typing 2008 and 2009 Deluxe, International, and School Editions for Mac OS X. The previous versions of Mavis Beacon were made for really old Macs with OS 9 by a company called Broderbund. A company called Encore has since taken over the broderbund.com Web site and continue to distribute the outdated software, which — as you point out — doesn’t work as it should...

Tuesday, April 20, 2010

... When researchers affiliated with the Pennington center had volunteers reduce their energy balance for a study last year by either cutting their calorie intakes by 25 percent or increasing their daily exercise by 12.5 percent and cutting their calories by 12.5 percent, everyone involved lost weight. They all lost about the same amount of weight too — about a pound a week. But in the exercising group, the dose of exercise required was nearly an hour a day of moderate-intensity activity, what the federal government currently recommends for weight loss but “a lot more than what many people would be able or willing to do,” Ravussin says.

An hour a day would be wonderful -- if my kids were grown.

The NYT article has the complex details. The effect of exercise on weight varies by age and gender, and between individuals as well. In general, however, it's not a good way to lose weight. Diet is more efficient.

On the other hand exercise seems to be essential to keeping weight stable after a weight loss diet. How and why? Nobody knows for sure.

Sitting turns out to be really, really, bad. We've had hints of that over the past years, but now it's getting pinned down. We don't know why, but sitting promotes obesity.

Incidentally, it's all harder for women. But you knew that.

Elsewhere in the NYT, Olivia Judson deepens the cheer with claims that obesity causes brain damage. She's a bit below her par though; she blurs correlation into causation in the interest of more hits (she well knows the difference, so two demerits to her). It is likely that obesity is associated with early dementia, but it's also associated with lower socioeconomic status, lower IQ, and the anger of the gods.

On the other hand, there's a weird association between exercise and brain function, even though I didn't believe it years ago. Exercise seems to help the health of neurons associated with cognition and memory in various animals -- for no particularly good reason. Since exercise is associated with lower obesity (in both directions) this further murkens the muddies.

Lastly, the idea that brain activity (bridge, crosswords, etc) slows dementia seems to be, at long last, good and dead. It doesn't work. Forget the bridge, forget the crosswords, go for a walk.

To sum it all up, my best guess at how this will all turn out:

Sleep is more important for brain health than we've imagined.

Exercise is more important for brain health than I thought 4-7 years ago. (I like to exercise, so it wasn't a prejudice against activity. It's just weird science.)

Exercise helps both sleep and brain health - so it's a double good. It doesn't lead to weight loss, but it's essential to maintain a stable weight.

We all need to diet all the time, so we need cultural and industry changes to make that very hard activity easier.

Obesity is almost inevitable in a food rich world, especially when we eliminate smoking and increase sitting (at computers). We need a miracle drug, we need cultural changes, we need mobile devices, we need gas to hit $10 a gallon.

Sitting is oddly bad for us. We should all be standing and walking.

Try not to get a concussion (but almost all enjoyable exercise increases head injury risk :-). Don't let your kids play football (which will eventually go the way of boxing).

See also:

BBC NEWS | Health | Creatine 'boosts brain power' 2003: Right. This went nowhere. A good reminder of how worthless most press releases are. We're still hearing about the huge onslaught of cognitive enhancer abuse, but it's mostly media imagination.

a retardation of an effect when the forces acting upon a body are changed (as if from viscosity or internal friction); especially : a lagging in the values of resulting magnetization in a magnetic material (as iron) due to a changing magnetizing force

Some interesting research has come out recently about the processing capacity of brains. For example, that the medial prefrontal cortex can only handle two tasks at once, or that working memory can only handle about 7 items at a time (but what's an item?), or that when people are actively trying to remember something complicated, their impulse control is reduced. In fact, there has been a lot of research showing that exerting the will to make a difficult decision uses a fuel resource (sugar from the blood) that many of these other tasks also need.

What happens when these resources are used up? When we have been thinking too hard, or have been under heavy stress, or haven't had enough to eat or sleep, or are trying to remember too many things, or are trying to drive, or need a fix, we fall back on a simpler part of the brain. We lose the ability to think rationally, to choose future benefit over immediate reward; the ability to choose at all is reduced. We become irritable, forgetful, angry, quick to argue....

I've been disappointed that there have been few studies of how physician cognition adjusts to using automation tools (electronic health records, etc) during patient care. These tools all seem to carry a substantially higher cognitive burden than phone use, yet the impact of phone conversations on driving performance has been studied to death. Do physicians become more irritable and distracted when they try to simultaneously talk with patients, think about the answers, and use current clinical software?

I suspect IBM mainframe printers did something comparable once upon an eon. Moving bits of the OS between client and server.

Myself, I'm looking forward to embedded OS scanners that will let me pick up the scans off the network. That's the way the big office machines work, and I'd love to separate my home scanner from my machines.

Spooky. I am one with the xkcd mind. Except that not only can I wait, I would really prefer that the future spare my children too.

The xkcd story is pure 21st century geek culture. How geek? On Google Reader xkcd gets over 100 "likes" within hours of release. That's probably the fastest and highest "like" rating in the entire geek universe.

... If it's a not-for-profit publication, you need no permission -- just print them with attribution to xkcd.com. If it's a for-profit operation, I will probably give you permission if you email me to let me know. You can post xkcd in your blog (whether ad-supported or not) with no need to get my permission...

... In September 2009 Munroe released a book, entitled xkcd: volume 0, containing selected xkcd comics.[80] The book was published by breadpig, under a Creative Commons license, with all of the publisher's profits donated to Room to Read to promote literacy and education in the developing world. Six months after release, the book has sold over 25,000 copies. The book tour in New York City and Silicon Valley was a fundraiser for Room to Read that raised $32,000 to build a school in Laos....

Update: I corrected the original post. Munroe's publisher has donated the book profits, but Munroe can keep his well deserved share.

Today, however, I came across a post I could read. The sentences were still a bit stilted, but the paragraphs were coherent. Overall it read like a bilingual native Chinese speaker writing in quite good English.

The trick is that the interview was conducted in English, then human translated to Chinese, then Google machine (statistically) translated back to English. The Chinese translation must have preserved quite a bit of the original sentence structure; enough that the reverse translation worked quite well.

I wept at the end. Surprised me, since there was nothing in there that was new. When we moved car seats from the front seat to the back seat we saved many lives, but we made these errors inevitable.

It is very well written.

[1] If I still had infant passengers, I would clip a lead from the car seat to my belt every time I got in the driver's seat. Then I'd have to leave my pants in the car to forget the infant. I only heard of that fix after my kids were mobile.--My Google Reader Shared items (feed)

My daughter left Korea when she was 13 months. She enjoys eating a seaweed snack, and Nori condiments. I wonder if she also carries these novel bacteria ... (Incidentally, the US coverage is misleading. This is not about sushi, it's about the seaweed called "nori".)

... The tools in question are genes that can break down some of the complex carbohydrate molecules in seaweed ... They are wielded by the hordes of bacteria lurking in the guts of every Japanese person ... Some gut bacteria have borrowed the seaweed-digesting genes from other microbes living in the coastal oceans....

... Within each of our bowels live around a hundred trillion microbes, whose cells outnumber our own by ten to one. This ‘gut microbiome’ acts like an extra organ, helping us to digest molecules in our food that we couldn’t break down ourselves. These include the large carbohydrate molecules found in the plants we eat. But marine algae – seaweeds – contain special sulphur-rich carbohydrates that aren’t found on land. Breaking these down is a tough challenge for our partners-in-digestion. The genes and enzymes that they normally use aren’t up to the task.

Fortunately, bacteria aren’t just limited to the genes that they inherit from their ancestors. They can swap genes between individuals as easily as we humans trade money or gifts. This ‘horizontal gene transfer’ means that bacteria have an entire kingdom of genes, ripe for the borrowing. All they need to do is sidle up to the right donor. And in the world’s oceans, one such donor exists – a seagoing bacterium called Zobellia galactanivorans.

Zobellia is a seaweed-eater. It lives on, and digests, several species including those used to make nori. Nori is an extremely common ingredient in Japanese cuisine, used to garnish dishes and wrap sushi. And when hungry diners wolfed down morsels of these algae, some of them also swallowed marine bacteria. Suddenly, this exotic species was thrust among our own gut residents. As the unlikely partners mingled, they traded genes, including those that allow them to break down the carbohydrates of their marine meals. The gut bacteria suddenly gained the ability to exploit an extra source of energy and those that retained their genetic loans prospered...

.... [the human gut bacteria] B.plebeius seems to have a habit of scrounging genes from marine bacteria. Its genome is rife with genes that are more closely related to their counterparts in marine species like Zobellia than to those in other gut microbes. All of these borrowed genes do the same thing – they break down the complex carbohydrates of marine algae...

... To see whether this was a common event, Hehemann screened the gut bacteria of 13 Japanese volunteers for signs of porphyranases. These “gut metagenomes” yielded at least seven potential enzymes that fitted the bill, along with six others from another group with a similar role. On the other hand, Hehemann couldn’t find a single such gene among 18 North Americans....

... People might only gain the genes after eating lots and lots of sushi but Hehemann has some evidence that they could be passed down from parent to child. One of the people he studied was an unweaned baby girl, who had clearly never eaten a mouthful of sushi in her life. And yet, her gut bacteria had a porphyranase gene, just as her mother’s did.

... “Today, sushi is prepared with roasted nori and the chance of making contact with marine bacteria is low,” she said. The project’s other leader, Gurvan Michel, concurs. He notes that of all the gut bacteria from the Japanese volunteers, only B.plebeius has acquired the porphyranase enzymes. “This horizontal gene transfer remains a rare event,” he says....

... Rob Knight, a microbiome researcher from the University of Colorado... “This result reinforces the need to conduct a broad and culturally diverse survey of who harbours what microbes. The key to understanding obesity or IBD might well be in genes or microbes acquired under circumstances very different to those we experience in Western society.”

Because Nori is now cooked, and because persistence seems fragile, the genes will probably disappear from Japanese gut bacteria. It's a fascinating example, however, of the power of the microbiome. The therapeutic implications are obvious. Science fiction writers, incidentally, have long described the use of tailored gut bacteria to enable novel diets. When we run out of beef, cooked grass might be yummy.--My Google Reader Shared items (feed)

... And then there's the famous memo sent by Blankenship in October 2005 to all 'Deep Mine Superintendents.' ...

"SUBJECT: RUNNING COAL

If you have been asked by your group presidents, supervisors, engineers, to do anything else other than to run coal (i.e. - build overcasts, do construction jobs, or whatever) you need to ignore them and run coal. This memo is necessary only because we seem not to understand that the coal pays the bills."

Mine 'overcasts' are critical to proper mine ventilation, and for many miners, Blankenship's memo made it abundantly clear exactly what Massey's 'top priority' was, and is.

...Paddy Power, an online bookmaker, is offering odds of 11 to 10 that dark matter will be found before black holes and 8 to 1 that black holes will be first. Dark energy, a mysterious force thought to drive the expansion of the universe, trails at 12 to 1. And for those who fancy a real outside bet, the firm is also offering 100 to 1 that the machine will discover God....

To make it worthwhile, though, you'd need to bet $100K to earn $1K. A CD would have comparable returns, which probably has something to do with the odds.

... Several years ago we began developing a large scale machine learning system, and have been refining it over time. We gave it the codename “Seti” because it searches for signals in a large space. It scales to massive data sets and has become one of the most broadly used classification systems at Google.

After building a few initial prototypes, we quickly settled on a system with the following properties:

Binary classification (produces a probability estimate of the class label)

Parallelized

Scales to process hundreds of billions of instances and beyond

Scales to billions of features and beyond

Automatically identifies useful combinations of features

Accuracy is competitive with state-of-the-art classifiers

Reacts to new data within minutes...

I can think of several reasons why they named it Seti. For one, HAL was taken.
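The properties listed above (binary classification with probability estimates, massive feature spaces, rapid reaction to new data) describe a large-scale online learner. A toy sketch of the same shape, assuming nothing about Google's actual implementation beyond those listed properties, is online logistic regression with hashed features:

```python
import math
import random

class OnlineLogistic:
    """Toy online binary classifier: probability estimates from logistic
    regression trained one example at a time with SGD. Feature hashing
    lets the feature space grow without a fixed vocabulary."""

    def __init__(self, n_buckets=2**16, lr=0.1):
        self.w = [0.0] * n_buckets  # one weight per hash bucket
        self.n = n_buckets
        self.lr = lr

    def _idx(self, feature):
        return hash(feature) % self.n

    def predict(self, features):
        # probability that the label is 1
        z = sum(self.w[self._idx(f)] for f in features)
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, features, label):
        # one SGD step on log loss; new data takes effect immediately,
        # echoing the "reacts to new data within minutes" property
        err = label - self.predict(features)
        for f in features:
            self.w[self._idx(f)] += self.lr * err

random.seed(0)
clf = OnlineLogistic()
for _ in range(2000):
    if random.random() < 0.5:
        clf.update(["spam_word", "caps"], 1)  # positive examples
    else:
        clf.update(["greeting", "caps"], 0)   # negative examples

print(clf.predict(["spam_word", "caps"]))  # high probability for the positive pattern
print(clf.predict(["greeting", "caps"]))   # low probability for the negative pattern
```

The real system is parallelized across hundreds of billions of instances and automatically crosses features; the sketch only shows the per-example learning loop.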

In this popular account from 1860 [1], the observation that "electricity" traveled through a medium implied that "other" fundamental forces such as light and gravity must also travel through a medium (emphases mine) ...

... The results of the experiments instituted by Sir William Grove are exceedingly curious, and must be regarded as all but proving the truth of the modern theory, which assumes that electricity is not, in any sense, a material substance but only an affection (state) or motion of the particles of ordinary matter.

If electricity is unable to pass over or through a vacuum, it is probable that all the other so-called imponderable forces—light, heat, magnetism, and possibly attraction—obey the same law, and as these agencies freely travel the interplanetary spaces, the supposition of Newton that such spaces may be filled with an ethereal form of matter receives an indirect but powerful support....

There is no ethereal form of matter in the 19th century sense however. Electricity has a relationship to fundamental electromagnetic forces, but it is not the same sort of thing. Once electricity was divided from "light" it was possible to find common models for light and magnetism.

In 2010 some people are trying to deal with the difficulty of quantizing gravity through a similar approach ...

One of the hottest new ideas in physics is that gravity is an emergent phenomenon; that it somehow arises from the complex interaction of simpler things.

A few months ago, Erik Verlinde at the University of Amsterdam put forward one such idea which has taken the world of physics by storm. Verlinde suggested that gravity is merely a manifestation of entropy in the Universe. His idea is based on the second law of thermodynamics, that entropy always increases over time. It suggests that differences in entropy between parts of the Universe generates a force that redistributes matter in a way that maximises entropy. This is the force we call gravity.

What's exciting about the approach is that it dramatically simplifies the theoretical scaffolding that supports modern physics. And while it has its limitations--for example, it generates Newton's laws of gravity rather than Einstein's--it has some advantages too, such as the ability to account for the magnitude of dark energy which conventional theories of gravity struggle with.

But perhaps the most powerful idea to emerge from Verlinde's approach is that gravity is essentially a phenomenon of information.

Today, this idea gets a useful boost from Jae-Weon Lee at Jungwon University in South Korea and a couple of buddies. They use the idea of quantum information to derive a theory of gravity and they do it taking a slightly different tack to Verlinde.

At the heart of their idea is the tricky question of what happens to information when it enters a black hole. Physicists have puzzled over this for decades with little consensus. But one thing they agree on is Landauer's principle: that erasing a bit of quantum information always increases the entropy of the Universe by a certain small amount and requires a specific amount of energy.

Jae-Weon and co assume that this erasure process must occur at the black hole horizon. And if so, spacetime must organise itself in a way that maximises entropy at these horizons. In other words, it generates a gravity-like force.

That's intriguing for several reasons. First, Jae-Weon and co assume the existence of spacetime and its geometry and simply ask what form it must take if information is being erased at horizons in this way.

It also relates gravity to quantum information for the first time. Over recent years many results in quantum mechanics have pointed to the increasingly important role that information appears to play in the Universe.

Some physicists are convinced that the properties of information do not come from the behaviour of information carriers such as photons and electrons but the other way round. They think that information itself is the ghostly bedrock on which our universe is built.

Gravity has always been a fly in this ointment. But the growing realisation that information plays a fundamental role here too, could open the way to the kind of unification between the quantum mechanics and relativity that physicists have dreamed of..

In short, one way to deal with the gravity problem is to make gravity go away. It's merely a confusing epiphenomenon.

Hey, it worked for electricity ...
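For reference, the Landauer bound invoked in the quote is quantitative: erasing one bit raises the entropy of the surroundings by at least a fixed amount, with a corresponding minimum energy cost at temperature $T$:

```latex
\Delta S \ge k_B \ln 2, \qquad E_{\min} = k_B T \ln 2
```

At room temperature that works out to roughly $3 \times 10^{-21}$ joules per erased bit, which is why the principle matters only at the thermodynamic limits of computation (and, in Jae-Weon Lee's argument, at black hole horizons).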

[1] Just prior to the American civil war. In a few years the SciAm "150 year back" article excerpts will be all about military science relevant to the most bloody battles of the 19th century. That should be interesting.

Monday, April 05, 2010

Until I saw this graphic I thought the compact point-and-shoot camera was doing just fine.

Right. Of course. My iPhone 3G isn't good enough to replace a point and shoot, but my June 2010 iPhone will. I haven't been paying attention.

That's tech churn. From nowhere to everywhere to nowhere in no time at all. Reminds me of books on CD.

So is Amit Gupta right that the iPad has made the laptop obsolete? I don't think so. A good laptop is much closer to a desktop than a Point & Shoot camera is to a DSLR; there's room for both in the "pro and enthusiast" market. I do agree that specialty OSs such as ChromeOS and iPhoneOS will dominate the "everyday use" end of the market. The form factor is, however, less critical than the usability, security, business model and low operating costs of these next generation environments.

Remember too that, when the market demands it, an iPhone will support a keyboard and external monitor. Then you have a phone and a laptop ...

... This topic could be and is a full semester course at some business schools. It is a deep and rich topic that I can’t cover in one single blog post. But it is also a relatively narrow skill set at its most developed levels. If you are going to be a public equity analyst, you need to understand this stuff cold and this post will not get you there.

But if you are an entrepreneur being handed financial statements from your bookkeeper or accountant or controller, then you need to be able to understand them and I’d like this post to help you do that. I’d also like this post to help those of you who want to be more confident buying, holding, and selling public stocks. So that’s the perspective I will bring to this topic.

In the past three weeks, we talked about the three main financial statements, the Income Statement, the Balance Sheet, and the Cash Flow Statement. This post is going to attempt to help you figure out how to analyze them, at least at a cursory level...

I've put time on my calendar to read and study these posts. (In my life only scheduled things happen. Your life may vary.)

Sunday, April 04, 2010

It costs a huge amount of resources to develop a functioning human adult. That's a price of being a social animal with a massively overclocked brain.

Not surprisingly, evolution has made us feeble killers. Our teeth, claws, and muscles are pathetic. Most adult humans are traumatized when they kill -- even when it's in self-defense and even after extensive training and conditioning. Only immature children kill easily, which is why child soldiers are valued by the world's most evil men. (See also: ventromedial injuries).

On the other hand, evolution hasn't had time to adjust to murder at a distance. Pilots do not seem to experience the trauma felt by marines, even though they may kill many more people. This has advantages for warfare, but there are problems with making killing too easy. These problems are showing up with drone use ...

... If they have not been so commandeered, attacks on such sites may constitute war crimes. And drone attacks often kill civilians. On June 23rd 2009, for example, an attack on a funeral in South Waziristan killed 80 non-combatants.

Such errors are not only tragic, but also counterproductive. Sympathetic local politicians will be embarrassed and previously neutral non-combatants may take the enemy’s side. Moreover, the operators of drones, often on the other side of the world, are far removed from the sight, sound and smell of the battlefield. They may make decisions to attack that a commander on the ground might not, treating warfare as a video game.

Ronald Arkin of the Georgia Institute of Technology’s School of Interactive Computing has a suggestion that might ease some of these concerns. He proposes involving the drone itself—or, rather, the software that is used to operate it—in the decision to attack. In effect, he plans to give the machine a conscience.

The software conscience that Dr Arkin and his colleagues have developed is called the Ethical Architecture. Its judgment may be better than a human’s because it operates so fast and knows so much. And—like a human but unlike most machines—it can learn.

The drone would initially be programmed to understand the effects of the blast of the weapon it is armed with. It would also be linked to both the Global Positioning System (which tells it where on the Earth’s surface the target is) and the Pentagon’s Global Information Grid, a vast database that contains, among many other things, the locations of buildings in military theatres and what is known about their current use.

After each strike the drone would be updated with information about the actual destruction caused. It would note any damage to nearby buildings and would subsequently receive information from other sources, such as soldiers in the area, fixed cameras on the ground and other aircraft. Using this information, it could compare the level of destruction it expected with what actually happened. If it did more damage than expected—for example, if a nearby cemetery or mosque was harmed by an attack on a suspected terrorist safe house—then it could use this information to restrict its choice of weapon in future engagements. It could also pass the information to other drones.

No commander is going to give a machine a veto, of course, so the Ethical Architecture’s decisions could be overridden. That, however, would take two humans—both the drone’s operator and his commanding officer...

Even if this particular implementation doesn't succeed, it makes a great deal of sense to build in this kind of automated oversight.
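The feedback loop the article describes (predict a weapon's effects, compare prediction with observed damage, restrict future choices when reality is worse than expected) can be sketched in a few lines. Everything here is invented for illustration; none of the names or numbers come from Dr Arkin's actual Ethical Architecture.

```python
# Illustrative sketch of the after-action feedback loop described above.
# All class names, weapon names, and units are hypothetical.

class EthicalOverride(Exception):
    """Raised when no permissible option remains; humans must decide."""

class ConscienceModule:
    def __init__(self, weapons):
        # weapons: name -> expected damage radius (meters, invented scale)
        self.weapons = dict(weapons)
        self.max_radius = max(weapons.values())  # current learned ceiling

    def choose_weapon(self, target_radius):
        """Pick the smallest weapon expected to cover the target,
        subject to the current ceiling on permissible damage."""
        candidates = [(r, name) for name, r in self.weapons.items()
                      if target_radius <= r <= self.max_radius]
        if not candidates:
            raise EthicalOverride("no permissible weapon; defer to humans")
        return min(candidates)[1]

    def report_strike(self, weapon, observed_radius):
        """After-action update: if actual damage exceeded the expected
        radius, tighten the ceiling for future engagements."""
        expected = self.weapons[weapon]
        if observed_radius > expected:
            self.max_radius = min(self.max_radius, expected)

c = ConscienceModule({"small": 10.0, "medium": 50.0, "large": 200.0})
print(c.choose_weapon(40.0))       # smallest weapon covering a 40 m target
c.report_strike("medium", 120.0)   # damage far exceeded the prediction
# The ceiling has tightened: larger weapons are no longer permissible,
# and over-large targets now raise EthicalOverride for human review.
```

The point of the design, as in the article, is that the machine never gets a veto; it only narrows its own options and escalates to humans when nothing permissible remains.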

Obviously it's not only of interest for weapons, though that's where the initial funding will come from. Even if we don't get sentient machines in the next fifty years (if you're the praying type, pray we don't), we will be deploying systems that make many risk/benefit trade-offs in many contexts. We will benefit if they evolve a conscience.

Some humans too would benefit from a prosthetic conscience. It might allow persons with disorders of conscience to function more effectively in the modern world. Our prisons are full of low IQ individuals with a limited capacity to model the impacts of their actions on other persons. A prosthetic conscience might allow them to avoid prison, or to have greater success after prison.

Of course if we do develop non-human sentience, it might be very much to our advantage if they felt qualms about hurting us ...