We’re in the middle of a huge platform shift in computing and most of us don’t even know it. The transition is from desktop to mobile and is as real as earlier transitions from mainframes to minicomputers to personal computers to networked computers with graphical interfaces. And like those previous transitions, this one doesn’t mean the old platforms are going away, just that they are somewhat diminished in significance. All of those previous platforms still exist. And desktops, too, will remain in some form when the mobile conversion is complete, though we are probably no more than five years from seeing the peak global population of desktop computers. We’d be there right now if we’d just figured out the I/O problem of how to stash a big display in a tiny device. But we’re almost there. That’s what this column is largely about.

I’ve been thinking about this topic ever since I wrote a column on an iPhone. It wasn’t easy to do, but I researched and wrote the column, loaded it to WordPress and added graphics, all by jabbing fingers at that tiny screen. It was for me an important test of what was possible and confirmed to me what I’d been guessing — that the iPhone is the first real device for the new mobile platform. Not a great device, but as Adam Osborne used to preach, it is an adequate device, and in the early days adequate is quite enough.

This seminal role for the iPhone is mainly by chance, I think. Its success is deserved no more than it is undeserved. The role could have fallen to Android or WebOS if they had been earlier or even to Windows Mobile if it had been a bit better. Steve Jobs proved his luck again by dragging his feet just long enough to fall into the sweet spot for a whole new industry. That’s not to say he can’t still blow it, but he has the advantage for now.

It’s important to understand just how quickly things are changing. Part of this comes down to the hardware replacement cycle for these devices. A PC generation is traditionally 18 months long and most of us are unwilling to be more than two generations behind, so we get a new desktop or notebook every 36 months. Mobile devices don’t last that long, nor are they expected to. The replacement cycle is 18 months, reinforced by customer contract terms that give us a new device every couple of years in return for staying a loyal customer. Mobile hardware generations last nine months, and 18 tends to be the maximum time any of us use a single device.

Think about it. This means that mobile devices are evolving twice as fast as desktops ever did. This just about equals the rate at which wireless network bandwidth is declining in price and matches, too, the faster-than-Moore’s Law growth of back-end services. Think about those first iPhones compared to the ones shipping today. In less than two years the network has increased in speed by an easy 2X and the iPhone processor speed has doubled, leading to a device that is at least four times more powerful than it was originally. It’s a much more capable device than it was, yet the price has only gone down and down.
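For what it’s worth, the compounding here is easy to sketch. Here’s a rough back-of-the-envelope model in Python — treating “power” as the product of network and processor speed is my illustrative assumption, not any established benchmark:

```python
# Back-of-the-envelope sketch of the compounding argument.
# Assumptions (from the column, not measurements): desktop hardware
# generations run 18 months, mobile generations run 9 months, and
# capability doubles once per generation.

def doublings(months: float, generation_months: float) -> float:
    """Number of hardware generations (doublings) in a span of months."""
    return months / generation_months

def growth_factor(months: float, generation_months: float) -> float:
    """Capability multiplier if capability doubles once per generation."""
    return 2 ** doublings(months, generation_months)

desktop_3yr = growth_factor(36, 18)  # 4x over a typical desktop cycle
mobile_3yr = growth_factor(36, 9)    # 16x over the same 36 months

# The column's iPhone example: network 2x faster AND processor 2x
# faster combine multiplicatively under this "more powerful" metric.
iphone_factor = 2 * 2                # 4x

print(desktop_3yr, mobile_3yr, iphone_factor)
```

Mobile racks up four doublings in the 36 months a desktop takes to manage two — that’s the sense in which it “evolves twice as fast.”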

This is not a celebration of the iPhone: the same performance effects apply equally to all mobile platforms.

Now just imagine what it says for the smart phones to come. In another two years they’ll be eight times as powerful as they are today, making them the functional equivalents of today’s desktops and notebooks. If only we could do something about those tiny screens and keyboards.

The keyboard is a tough one. In one sense it isn’t hard to imagine it being handled through voice input. That’s how they did it on Star Trek, right? But there was a problem with Star Trek computing: the interface is what I think of as interrogational. Kirk or Scotty asked the ship’s computer (a mainframe, obviously) a question that always had an answer that could be relayed in a handful of words. The answer was “yes,” “no,” “Romulan Bird of Prey,” or “kiss your ass goodbye, Sulu.” There’s never any nuance with an interrogational interface and not much of a range of outputs. It’s okay for running a starship or a nuclear power plant, but an interface that can only speak is limited to what words alone can do.

I attribute this, by the way, to Gene Roddenberry’s work as a writer. I doubt that he saw word output as a limitation, since his product was, after all, words. TV is radio with pictures, and the words really count a lot. But try to use them to simulate a nuclear meltdown with any degree of precision or prediction and they’ll fail you.

Our future mobile devices will use words for input, sure, but words alone won’t be enough. Still, between voice recognition, virtual keyboards, and cutting and pasting on those little screens, there’s a lot that can be done. It’s the output that worries me more.

I first wrote about this a decade ago when I heard about how Sony was supporting research at the University of Washington on retinal scan displays — work that eventually resolved into products from a Washington State company called Microvision. They’ll shine a laser into your eye today, painting a fabulous scene on the back of your eyeball in what appears to be perfect safety, but I have a hard time imagining the broad acceptance of such displays by billions (yes, BILLIONS) of users any more than I expect that Bluetooth earphones will survive a decade from now. Too clunky.

I think we’re headed in another direction and that direction is — as always — an outgrowth of Moore’s Law. Processors get smaller every year and as they get smaller they need less energy to run. Modern processors are also adopting more asynchronous logic — another topic I started writing about 10 years ago that offers dramatic energy savings.

We’re at the point right now where primitive single-pixel displays can be built into contact lenses. They act as user interfaces for experimental devices like automatic insulin pumps. This already exists. A patch of carbon nanotubes on your arm continuously monitors blood glucose levels, driving a pump that keeps your insulin supply right where it should be. Any problem with the pump or the levels is shown by a red dot that appears in your field of view courtesy of that contact lens. The data connection between pump and eyeball is wireless. The power to run that display is wireless too, since the contact lens display scavenges RF energy out of the air to run, courtesy of that mobile phone on your belt and that WiFi access point on the ceiling.

As long as we’re personally connected to the network we’ll have enough power to run such displays. No more airplane mode.

And while that display is a single pixel today, we can pretty easily predict at what point it could be the equivalent of HDTV. Except I don’t expect we’ll ever get there. That’s because, thanks to Ray Kurzweil’s singularity — that point at which everyday machines have more computing cycles than I do — we’ll soon have so much excess processing power that mere physical interfaces will be boring and not necessary.
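The extrapolation itself is easy enough to run. A sketch, assuming pixel count doubles once per nine-month mobile generation — an illustrative assumption on my part, not a measured trend:

```python
import math

# How many nine-month doublings to go from a 1-pixel display to
# 1080p HDTV (1920 x 1080 = 2,073,600 pixels)? The nine-month
# doubling period is the column's mobile-generation figure; applying
# it to pixel count is purely illustrative.

hdtv_pixels = 1920 * 1080
doublings_needed = math.log2(hdtv_pixels)  # roughly 21 doublings
years_needed = doublings_needed * 9 / 12   # roughly 16 years

print(round(doublings_needed, 1), round(years_needed, 1))
```

Call it somewhere in the 2020s under those assumptions — which is exactly the window in which I’m arguing the physical display stops mattering.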

Here’s my problem with the singularity: I don’t want to work for my computer, much less for my microwave oven, both of which are supposed to be way smarter than me by 2029, according to Ray. My way around this problem, in the Capt. Kirk tradition, is to find difficult jobs for all that computing power to keep it from interfering with my lifestyle.

So there’s a platform transition happening. We’re in the middle of it. The new platform is a mobile interface to a cloud network. And the way we’ll shortly communicate with our devices, I predict, will be through our thoughts. By 2029 (and probably a lot sooner) we’ll think our input and see pictures in our heads.

Think it can’t happen? Twenty years ago was Windows 3.0 and Mac OS 6. Twenty years from now computing won’t even be a device, just a service.

129 Comments

I think there’s a difference between intelligence and von Neumann boolean-logic machines, meaning I think the singularity believers are devil worshippers.

Take a laptop back 40-plus years to the 1960s and at first scientists would marvel over it, but ultimately they would be disappointed. Actually, Andy Hertzfeld made a very similar comment to Bob in a NerdTV interview, where he said he was dazzled by the improvements in hardware technology, but ultimately disappointed by where computers had evolved to.

We’re limited by the paradigm built into this whole endeavor. We may have HDTV contact lenses and super-computer calculators in our belt buckles, but ultimately we’ll be disappointed by what they do — meaning they won’t be doing much more than they are today.

David Janke
November 18, 2009 at 8:57 am

@John Roberts – agreed. We could probably reach the “singularity” tomorrow if we could only conceive of how to do it. The underlying limitation to breaking away from logic machines is not a lack of hardware; it has more to do with the limitations of our thoughts/understanding. Like you said, future computers will do pretty much the same stuff they do today… only faster (and taking up less space).

As for the article jumping the shark…
Don’t get too hung up on the conclusions. The real information is always in the middle part of the article. Then, “Cringely” just extrapolates out for some crazy solution/result.
View the beginning/middle as info and the last part as food-for-thought 😉

@Brian McTavish – I really like your “call your mother because she’s lonely” example. I think the next mind-blowing development has more to do with connecting (mining?) all of the information that is provided by ubiquitous computers.

Francis (Ottawa)
November 18, 2009 at 12:43 pm

Well if we reached the singularity, and were disappointed by it, we could always hit the reset button . . . hey maybe that’s what we did.

Francis (Ottawa)
November 18, 2009 at 12:45 pm

Worried about words only output? Bob, didn’t you try to scare everyone a few columns back with IBM’s virtual reality gaming / offshoring / training system?

Actually, the iPhone pricing has gone up a bit, even though the up-front price is down. It’s $200 up front and a $30/mo data plan for 2 years vs. $400 up front and $20/mo for 2 years. Still, it didn’t take off until it hit that $200 mark. Go figuh.

I/O is always the problem, isn’t it. I mean, with things like the new iPod shuffle holding 4 GB in what’s basically a thick wire, soon servers and storage will be built in to the infrastructure of everything. The question is just how to interact with it all. Turns out people aren’t really built well for this sorta thing. Will be interesting to see what comes up.

Jerry
November 17, 2009 at 10:23 pm

“Twenty years ago was Windows 3.0 and Mac OS 6.”

Yeah, we haven’t really come that far. Computers are still annoying as ever.

[…] In Technology on November 18, 2009 at 2:07 pm Cringely has hit the nail on the head regarding the future of handheld devices. These things are not phones anymore. They are all handheld devices which happen to have phone […]

Brendan
November 18, 2009 at 2:07 am

Whenever a great user interface is discovered, it has a habit of surviving for a very long time, and I’m not just talking about computers. Think about the two-wheel-pedal-handlebar interface of the bicycle or the steering-wheel-accelerator-brake of the car or the bell-mouthpiece-receiver of the telephone. All are basically unchanged after more than a century even though the gadgets that they control have changed dramatically.

I believe that the keyboard-mouse-monitor interface of the desktop is another such killer-interface and will be around for a very long time. It won’t work in situations where you’re on the move and need portability, but a lot of computer use will still continue to occur in the comfort of a home or an office.

The problem that I would have with thought-controlled devices is that they are not intuitive because humans are just not designed to control things directly with their thoughts. We evolved over millions of years to use our hands to control tools to carry out tasks. All of the user interfaces that I mentioned above are ingenious because they are perfect tools in that we instinctively feel them as extensions of our bodies.

What a laugh, the site certificates and cookies will need to be very robust because I will not want someone else’s input and output to get mixed up with mine. Can you imagine what the gestalt experience would be like?
You are borg …… hehehe

Dranorter
November 18, 2009 at 3:45 pm

That is the very reason I want a brain interface! It would be really useful to be able to get inside peoples’ heads and, you know, move their mouse to the right spot when they can’t see the link that’s right there in front of them.

Lots of other obvious applications of that too, like telepathy.

As far as the article goes: Cringely, if the network speed doubles and the processor speed doubles, the overall speed does not quadruple. It doubles. Doubling is not impressive anymore.

Furthermore, nobody likes contact lenses- why would we use them as a screen?

Luis Alejandro Masanti
November 18, 2009 at 4:56 am

quote:
“Steve Jobs proved his luck again by dragging his feet just long enough to fall into the sweet spot for a whole new industry.”

I think it is rather more than “proved his luck again.”
As many big names noted in the article about Jobs becoming the CEO of the decade, to hit once is all they ever did.
Apple II, Macintosh, NeXT (where the WWW was born!), Pixar, iMac, iPod, iTunes, iPhone… I think he must be the luckiest man ever in the Universe!

OK, I forgot that he survived pancreatic cancer and a liver transplant… here I call him lucky!

OTOH, was Cringely lucky when he spoke “ten years ago” of what is happening today? I think he is smart, clever… also lucky.

gargravarr
November 21, 2009 at 11:46 am

The WWW was created by Tim Berners-Lee at CERN. Perhaps on a NeXT machine. But NeXT had as much to do with it as Al Gore.

I’m confused. If we’re all neurologically patched Borg-like into a vast AI network that’s smarter than us and runs everything, how exactly will the computer apocalypse play out? Matrix or I, Robot?

Joe Shelby
November 18, 2009 at 5:15 am

Well, there still remains that “well, what do I actually want to do” factor. Facebook and Twitter are probably the best evidence that, aside from reading, there’s little else people really WANT to do on a computer. Ok, so we’re not too far away from virtual reality porn a-la the kinds of simulations that make for lousy late-night movies on Showtime, and more seriously, first person shooters will get a bit too personal, but beyond that?

The OS hasn’t evolved since Windows 3 because *business* users have no other needs than what they do now, and in fact are increasingly resistant to change because change is expensive. Vista upgrades didn’t happen at home because they didn’t happen in the office. Office 2007 upgrades didn’t happen at home because they didn’t happen at the office. Right now I’m only just starting the process of upgrading our applications to Java 6 (7 is a mere month or two away), Vista (which is already out of date), and Office 2007 (the most painful because the xml format of Excel ’07 usually requires totally different libraries to read).

This is a huge expense on our part, just to keep our code basically acting exactly like it did before. Companies don’t like spending money like that. Companies also don’t like the amount of time lost when application interfaces change so dramatically (Office ’03 -> Office ’07, a hit on both user interface and file compatibility) that productivity drops, particularly in a recession.

So what will the embedded screen space give us? Built-in straight-to-the-brain movies (really putting the studios at risk because the experience will finally be *better* than the big-screen theater), built-in straight-to-the-brain powerpoint (I envision a lot of “glazed eyes” on trains as execs review them on their way to the office or a meeting), and for social applications, built-in straight-to-the-brain advertising. None of those sound very attractive to me.

Joe Shelby
November 18, 2009 at 5:18 am

oh yeah, and the obligatory “straight to the brain stock-ticker” demo that everybody provided, just like they did for IM clients, ajax demos, and cell phones.

Daniel
November 18, 2009 at 5:14 pm

Talk about hitting the nail on the head, I was laughing so hard my wife came in the room and gave me the ‘God, you’re a geek’ look.

Tim K
November 19, 2009 at 5:11 am

Think of the mental effects of a mental IO device.

I often suffer from internet amnesia.

When I’m away from my machine I think of all these things to look up and to do when I next get in front of it. When I get in front of it, I forget what all those things were, and just surf the same old sites.

The other thing that happens to me is I often fall into a wiki-hole.

I go to wikipedia to see how General George Patton died. Then I click to read about the battle of the bulge, and the next thing you know it’s four hours later and I’ve read the complete history of World War II. Only it’s the European theater, so I click on Japan just to see if it’s still there, and the next thing you know I’m chasing down every carrier plane that’s ever been built.

Now put those two things together with a thought-based IO device: Internet Amnesia + wiki-holes and a full-time IO device. Will I forget to breathe or be unable to think? Will I fall into a wiki-hole and never come out?

Just wondering.

furicle
November 18, 2009 at 6:07 am

So using an iPhone as a computer is a sign of things to come because it’s ‘adequate’ but Bluetooth headsets won’t survive because they’re ‘too clunky’?

Me thinks you made up your mind, then tried to justify it….

Brock
November 18, 2009 at 6:17 am

I’d rather have a laser painting pictures on my retina than a computer that can read my mind. I’m already a bit concerned about what Google knows about me based on the search queries I voluntarily type; I’d be really freaked if it knew what I was thinking.

robin
November 18, 2009 at 9:00 am

Not to mention personal ads: “Happy 85th birthday from the makers of Cialis, Rob. We noticed you haven’t had an erection in the last week and…”

I just listened to this then switched back to my RSS reader to find a Cnet story about IBM predicting a computer similar to a human brain by 2019. Coincidence? I think not. In fact, is Cringely even real? Has anyone seen him lately? Good stuff as always Mr. C…

Brendan
November 18, 2009 at 6:49 am

Bob, I couldn’t help noticing a slight contradiction between the article and the picture above it which shows you lovingly holding a chunky keyboard, unlike your wife(?) who can’t keep her hands off what looks like an i-pod 🙂

Ronc
November 18, 2009 at 2:10 pm

Not his wife. I believe she is Anina http://www.anina.net/index.html . She was interviewed in an episode of Nerd TV as a model who also programs cell phones. She is the reason we old timers keep bugging Bob about Nerd TV.

rrwood
November 18, 2009 at 6:50 am

Does Kurzweil get credit for the idea of The Singularity, or is it Vernor Vinge?

Kurzweil gets (takes) credit for everything. I listened to him speak once and, basically, anything worthwhile in human experience that has arisen in recent decades, he thought of first.

Blad_Rnr
November 18, 2009 at 7:42 am

“This seminal role for the iPhone is mainly by chance, I think. Its success is deserved no more than it is undeserved. The role could have fallen to Android or WebOS if they had been earlier or even to Windows Mobile if it had been a bit better.”

I love how you blindly give passing credence to what Jobs and Apple did with the iPhone. You act like, “oh well, technology is technology.” How wrong could you be? Jobs took a HUGE gamble with the iPhone. It’s got over 200 patents. It’s a controlled environment where developers continue to pump out apps by the day to take advantage of the platform and seek financial gain. It is a well-executed platform that will probably produce the iTablet and allow Apple huge growth for years to come. And you act like anyone could have done it. Heh.

There is nothing even close to it unless you are a geek who wants to play (Android). For the masses the iPhone rules and Apple is reaping the financial rewards. You totally forget there have to be PROFITS in order for technology to move forward. Show me where that has not been the case. Risk does not happen unless there is a chance for a reward. Android or WinMo could never have done what Apple did because their formula relies on multiple iterations of hardware, and even now Android developers are starting to see that it really throws a wrench into the mix when developing for multiple phones.

You missed the point entirely. Yes, technology is moving forward but it’s the risk takers who are searching for profits who are driving it, not some altruistic phenomenon that the evolution of technology is the end-all, be-all. Your column is a slap in the face to Jobs, acting like anyone could have made an iPhone. Well, they didn’t, and they had years to do it. And they still can’t do it.

(I miss the old Cringely. Is it possible he was somehow kidnapped and we got his evil twin?)

And I think you’re jumping to a HUGE presumption that “PROFITS uber alles” is the only motivation people have for creativity, innovation, invention, etc. Bob (old, new, whichever he was) did a great job in documenting one significant example of this with the story of VisiCalc in his original Nerds series. I’m not saying money doesn’t drive a lot of people, but it doesn’t drive everybody. Therefore we shouldn’t swallow the club-for-growth’s Kool-aid that the ONLY motivations that drive people are financial. Don’t underestimate ego, fame, power, legacy, etc. Does anybody really believe that greed and financial success is what’s on Steve Jobs’ mind every morning he gets up? I think not, at least from my seat in the gallery.

Has anyone who makes specific future predictions ever been correct?? Flying cars, anyone??
Kurzweil and his type have a constantly rolling 20-year window for the emergence of artificial intelligence. We are as far away as we have ever been; Kurzweil’s predictions rely more on a series of unlikely, fantastic breakthroughs than on the steady development of science and technology.
And if you look at his very specific predictions, he is clearly wrong more often than right.

Here’s what Jeff Hawkins (author of On Intelligence, co-founder of Palm Computing, Redwood Neuroscience Institute, and Numenta) said about Kurzweil and singularity during his J. Robert Oppenheimer Memorial Lecture in Los Alamos, NM (July 2009):

“There is a group of people who worry about what they call the singularity, as described in a book that came out recently called The Singularity is Near, by Ray Kurzweil. Ray Kurzweil is a very smart guy but I think he’s wrong about this one. The idea here is essentially that if we can create machines that are intelligent, and more intelligent than humans, then those machines can create machines that are more intelligent than them, and then those machines can create even more intelligent machines, and then you have this sort of runaway chain reaction. Instead of protons hitting each other, we have computers designing new brains and things like that.

“This is not going to happen. I’m just telling you, it’s not going to happen.

“It’s not as though, what if I had a computer that could design even faster computers? Would we all of a sudden have infinitely powerful computers? No! There are limits to this stuff. There are limits to actually building intelligent machines, and then training them. There are certain capacity limits and it takes a long time to train them, as well as a whole series of other issues here.

“I do believe we can build brains—computer brains, artificial brains, or intelligent machines—that are faster and higher capacity than humans. But they are nothing at all like a human. These are just boring boxes like my laptop here. These are not emotional machines that experience lust and sex and want to control the world and reproduce and so on.”

Ron
November 18, 2009 at 8:22 pm

The wife and I are off to create a brain tonight. 😉

WLH
November 22, 2009 at 8:00 pm

If Jeff Hawkins founded Palm, then his credibility as a futurist is questionable.

Brian McTavish
November 18, 2009 at 8:29 am

Well yeah, sorta. But electronic ‘mind reading’ is merely a special (and if you’re thinking of, for example, dictating speech, currently un-doably difficult) case of the tech gradually molding itself more closely to the way we are. Between here and there (and necessary in order to get there) are many steps, including ubiquitous touch interfaces, semantically improved voice recognition (understands not just words, but what you actually mean), mid-air gesture interfaces, facial expression recognition, emotion recognition, body language recognition, various forms of physical and physiological monitoring (GPS statistical movement patterning, skin heat mapping, speed-of-movement mapping against daily norms – e.g. ‘You seem to be tired this morning’, relationship mapping and the appropriate exchange of these kinds of information – e.g. ‘You need to call your mother because she’s feeling lonely’), etc, etc. Each of these will, in exactly the same way you describe previous interfaces/ form factors not being completely displaced by the next but leaving traces and optimal surviving uses, nudge the way we interact with ‘cloud intelligence’ closer to the ideal of naturalism: the interface that is so good we have to be reminded it exists.

Andrej
November 18, 2009 at 9:05 am

Speaking of Vinge… this column sounds like Bob just read Rainbows End and then did a brain dump of anything and everything that occurred to him afterward without much thought or filtering.

Ryan Dancey
November 18, 2009 at 9:11 am

We used to know how to use an external resource to improve multitasking. It was called a secretary. The interface was all voice and it worked really well. Put enough AI in my iDevice to act as a competent secretary – screen my calls, order my lunch, answer my questions about when the client is coming in for the meeting, etc. and I’ll be much more productive.

Look at Don Draper’s desk in Mad Men. He has a phone, and a place to write. And that’s it. Draper enhanced the leverage he could apply to his insights and management ability by People Power. Mine should be enhanced by Apple Power.

If I could have the power of my desktop in a form factor equivalent to my iPhone, I’d dump the desktop – I don’t need a big black box cluttering up my desk. Why can’t my iDevice wirelessly connect to my monitor, speakers/headset, keyboard & mouse? Get rid of that silly beige CPU box and its clutter of cables and confusion. Get rid of the software that endlessly wants my attention for unimportant trivia. I’ve got real work to do. If Windows was my secretary he’d already be fired.

It already connects to my WLAN, my cell service and by extension all the cloud apps I use – phone, address book, calendar, file storage. By centralizing my communications it has made me more productive.

Untether me so that any workstation I sit down at; in my office, at my home, in the airplane, etc. becomes “my computer” and I’ll be more productive.

Make my iDevice social – automatically harvesting contact information from people I shake hands with, or companies I do financial transactions with, and I’ll be more productive. How hard would it be for my iDevice to hear me set up a meeting, schedule that in my calendar, confirm it with the iDevices of the other parties, book a free meeting space and connect the video conference when I’m in the room and ready to start?

Get rid of my computer. Bring back a competent staff.

RyanD

Ronc
November 18, 2009 at 2:23 pm

You can easily do that (“Get rid of my computer. Bring back a competent staff.”). You just have to be willing to pay for it.

John Dingler
November 18, 2009 at 9:40 am

Hi Bob,
You open up a discussion on the glorious time the lower voltage body will experience when it’s always connected to higher voltage devices.

However, imagine the unintended negative medical consequences occurring to the body’s physiology after it swims in that high voltage electronic field made up of all kinds of radio waves, this, from cell phones, earphones, hip-worn cash registers, cell towers, microwave ovens, contact lenses, and RF implants. It will be great! The human body will just tingle from the sub rosa tickle it will experience. Oh my. The stimulation will be as tantalizing to the senses of the connected man as lapping-up tasty antifreeze is to happy cats. I am looking forward to the future where a daily burst of sustained exposure will energize me.

Best of all, the plethora of exquisite novel agonies will likely open up promising opportunities for big Corpo medicine to devise expensive medical procedures to cure us.

Therefore, invest in big pharma that specializes in R&D in RF-induced, extremely interesting, novel diseases now before Warren Buffet buys up all the stock.

Hate to break it to you, but you got your upgrade cycle wrong. While I realize most people upgrade their phones (and computers) more often than I do – I’m still using the same model phone that I got when I originally signed my cell contract in Oct. 2004 – it would be the same phone, except the original one got replaced 2 years out just before warranty expired. We’re thinking about new phones – an OpenMoko for me, and possibly an iPhone for my wife; but we won’t sign any data contracts – I’m not adding another $30/mo to the phone bill; I currently have all data services disabled at the provider (AT&T), and blocks put in place on the phones to keep from accidentally triggering them (e.g. web, texting). (Yes, that would keep me from buying an iPhone.)

We usually keep computers around for about 8 years before upgrades.

I know others similar. Needless to say – ultimately, yes, we are in a mobile transition right now. But I don’t think it’s happening any faster than previous transitions, just differently.

AND, btw, the only thing that will really solve the I/O problem for displays is either holographic displays (think HelioDisplays but more portable) or contact lenses with integrated circuits for personal projection displays.

I have a shorter vision horizon than you. When I got my iPhone, by coincidence we had a “computing fair” at my company showing all sorts of technology. My iPhone is so good at many things that I can believe my laptop/desktop will (within 5 years?) be replaced by my cell/pda/camera/everything in terms of computing power and memory, etc.

What is missing is the I/O, and our computing fair showed that also. You can now buy ($300?) a small cigarette-pack-sized box that projects a very nice screen display of 30″ diagonal or greater. You can now buy a separate roll-up keyboard, or another small box that projects a red image of keys onto your workspace; you can type on them.

The point is that at your office, you place the iPhone into a docking station and have a very nice real screen and keyboard. On the road, you have a more-than-adequate projection of the screen and a fairly good and compact way to key in your material.

It is so close that I can see it and almost taste it. I don’t know if they will be separate boxes or built in to the iPhone (or whatever), but I will bet money that it will be there. And I’ll love it.

The electronic contact lenses can wait (as well as the cochlear ear implant).

DumaFan
November 20, 2009 at 6:57 am

What you want may be coming sooner than later. Google Pranav Mistry for his TED Talk in India (the one posted in November) and his SixthSense device. Equipment costs around $300 (projector, cellphone) and he’s currently planning on open-sourcing the software. If he does, it’ll be interesting to see how the Linux community, specifically the Android/Google community, adapts his truly great creation.

Steveorevo
November 18, 2009 at 11:44 am

Worried about the output being just words? Augmented reality devices, real-time rendering.

Imagine a real life clippy (god forbid) or a Roger Rabbit ready to work for you.

dfclark
November 18, 2009 at 1:13 pm

Speaking as both a neurologist and a computer science graduate, I doubt we’ll have thought interpretation or machine-to-CNS input devices in the time frame you’re entertaining. Modern EEG and MRI are painfully limited for solving much cruder problems. Look for neurologic prostheses to lead the way to such computer-human interfaces, but the theory is not even there yet.

Creating a totally immersive virtual world like that portrayed in “The Matrix” is far more complicated than sticking six large electrodes into your spinal cord. In fact, I would guess that 85% of the bandwidth between your brain and the outside world does not even involve the spinal cord. Sight alone has extremely high bandwidth, as do audition and acceleration sense. Any computer-nervous system interface will require devices specialized for each human sensory modality, and there are more than just the five you learned in kindergarten.

More on-topic, I don’t see any successful device coming out that lacks a silent interface. People will always want to compose and send textual messages to each other or browse content on the internet while sitting in those boring conferences or in the bathroom stall. A voice-exclusive input won’t fly.

Patrick C
November 18, 2009 at 1:24 pm

This is the type of speculative futurism that almost always disappoints.

Mobile Platform Performance: we need to understand why mobile growth is so fast to understand if it will continue. It seems very unlikely that mobile computing power will ever exceed that of less constrained platforms. So it will probably slow.

Eyeball raster: Neal Stephenson predicted that in Snow Crash in 1992, and it’s still not here. Maybe?

Singularity: It assumes Moore’s-Law-style growth is exponential. In reality these things grow as a sigmoid; we just happen to be living in the steep region of the curve. You don’t get a singularity from a sigmoid.
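The sigmoid point is easy to check numerically. A minimal sketch (the parameter values are arbitrary, chosen only for illustration): in the steep region, a logistic curve multiplies by roughly e^k per step, just like a true exponential would, and then its growth ratio collapses toward 1 as it saturates.

```python
import math

def logistic(t, L=1.0, k=1.0, t0=0.0):
    # Sigmoid growth: capacity L, steepness k, inflection point at t0.
    return L / (1.0 + math.exp(-k * (t - t0)))

# Early on (well before the inflection), each unit step multiplies the
# value by roughly e^k, indistinguishable from exponential growth.
early_ratio = logistic(-3) / logistic(-4)

# Past the inflection, per-step growth stalls as the curve nears L.
late_ratio = logistic(4) / logistic(3)

print(round(early_ratio, 2))  # close to e, about 2.64 here
print(round(late_ratio, 2))   # close to 1
```

An observer sampling only the steep region sees what looks like exponential takeoff, which is exactly the commenter's point about why extrapolating it forever is a mistake.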

Brain beaming: New Scientist pablum. There are fundamental barriers to beaming a signal into the brain that have little to do with computational power and unlike computing our ability to do this transmission grows slowly and linearly. How do I transmit a complex signal to the brain without damaging it? How do I know where it needs to go?

We won’t have the resolution, and we can’t account for person to person neuroanatomical variance. Of course, right now we don’t even have a theory that explains what form that signal would take. So the technology is beside the point unless we have a breakthrough in theoretical neuroscience that is bigger than any breakthrough in any field of science. Ever.

Lighthouse
November 18, 2009 at 1:35 pm

We will have a mental interface when we have jet packs. You are making the same mistake earlier technologists made. My grandma in her lifetime went from horse and buggy to moon shots and international jet travel. The futurists in her day expected that trend to continue and predicted all kinds of crazy transportation devices. Of course it did not happen, and neither will the computer singularity.

One basic reason: hardware may be advancing every year, but software is stuck. MS, the richest, most powerful software company on the planet, just spent several years developing Vista and we got, what, 3D icons? That is what they did with all that power? See-through windows? Really? How the hell are we going to have a singularity if the software is so lame?

And what about the users? We have built the Internet, this enormous, complicated machine, the largest technological device ever built, our generation’s moon shot, our civilization’s pyramids. Just look at the software stack it is built on, all those man-years in switching, server OS, client OS, browsers. All that time and money spent wiring it up. And the users? What do they do with it? Download porn, play first-person shooters, steal music and post party pics on Facebook.

God help us, we are too stupid a race to create something like the singularity. The future will look much like today, only mobile.

Patrick C
November 18, 2009 at 1:59 pm

A little more background on brain-beaming.

We have a current technology that can non-invasively stimulate the brain called TMS (transcranial magnetic stimulation). This is the only promising technology to perform what Cringely is proposing, but it has serious limitations that will prevent it from scaling up (technically it is scaling down…).

It uses electromagnetic induction to generate currents in pyramidal cortical neurons oriented perpendicular to the surface of the skull. Right now you can use it to make a person’s finger twitch (or more accurately induce a desire to twitch their finger as you target the volitional area of the cortex). The resolution is ~1 cm^3.

The problem is that there isn’t much potential to do any better. Because of the properties of magnetic fields you can only stimulate a subset of neurons with a particular geometry. To “beam a thought” you would need to target many different neural populations. Plus you would need much higher resolution. Unfortunately, a magnetic field that can penetrate deep enough to stimulate your cortex will inevitably stimulate a bunch of cortex. Thought beaming would require stimulating many different small groups of neurons simultaneously, thus placing a constraint that the field be stronger, to reach more neurons, but also weaker, to target neurons precisely.

Another unpleasant side effect is the tendency of the device to affect peripheral neurons. It will make all your facial muscles contract quite uncomfortably.

This technique is about the equivalent of taking an AC power line and touching it to your motherboard directly. It will do something, but not something particular.

Any effective future interface will be invasive, requiring cybernetics and serious neurosurgery; hardly an interface option we can expect will catch on.

Oooo…Computers in our heads. I guess that may be coming. “Output” could be OK but “Input”? Do you want that new car, or is mental advertising telling you that you want that new car?

Andrew
November 18, 2009 at 4:19 pm

The singularity will always be 20 years off.
Kurzweil forgets the practical experimentation phase in the cycle of improvement. As you see in the race toward fusion power, you may be able to theorise/work out/plan the next improvement very quickly, but it is still expensive in money and time to implement and test.
There is a big difference between seeing the electrics in a brain and knowing you are thinking about coffee. I am not sure we are even up to being able to tell if someone is lying yet.

Gee, we need something better than a tiny keyboard and screen. I know, let’s go watch the Jetsons and see what they would do. And let’s just randomly predict it will happen in 20 years without any thought whatsoever.

Question, what does Moore’s law have to do with understanding brain waves? You are like a bad movie for the scientifically illiterate, where computers are magic and can do anything.

18,000 basketball-sized pieces of space junk doesn’t seem like that many. If there were 18,000 basketballs on Earth, you would have to drive a long time before you found one. Plus, the increased area of the orbit as compared to the surface of the Earth means that the basketballs are even more spread out. I am not worried.

Alain
November 19, 2009 at 9:34 pm

Unfortunately, those basketballs are not uniformly distributed over the area of a sphere, but are concentrated in the few useful orbits. So it’s worse than what your calculation would come out to by many, many orders of magnitude.

OTOH, everything in a given orbit moves basically at the same speed (by definition), so you don’t tend to get any high-velocity collisions. It’s just that satellites are very delicate pieces of equipment…
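A rough back-of-envelope sketch of both sides of this exchange (the 800 km altitude and 7.5 km/s speed are assumed round numbers for illustration, not figures from the thread): the naive area density really is sparse, but each object sweeps an enormous path every day, and objects in crossing orbital planes meet at roughly orbital speeds.

```python
import math

EARTH_RADIUS_KM = 6371.0
LEO_ALTITUDE_KM = 800.0   # assumed typical debris altitude
N_OBJECTS = 18_000

# Naive view: spread the objects uniformly over a spherical shell at LEO.
shell_radius = EARTH_RADIUS_KM + LEO_ALTITUDE_KM
shell_area = 4.0 * math.pi * shell_radius ** 2
km2_per_object = shell_area / N_OBJECTS

# Roughly 36,000 km^2 per object, a square ~190 km on a side,
# which is why "18,000 basketballs" sounds harmless at first.
print(round(km2_per_object))

# But each object moves at orbital speed, sweeping a huge path length
# per day, and debris in crossing planes can meet at comparable
# relative speeds, so encounter rates are far higher than a static
# density suggests.
orbital_speed_km_s = 7.5
swept_km_per_day = orbital_speed_km_s * 86_400
print(round(swept_km_per_day))
```

And as the reply notes, real debris is concentrated in a few popular orbits, so the effective density in those bands is far higher still than this uniform-shell estimate.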

Computers represent the epitome of INTELLIGENCE – speedy processing and almost limitless data storage. They are all Ken Jennings (of Jeopardy fame) – able to quickly answer any “trivia” type question. So if they are so “smart”, how come Artificial Intelligence never gets much better? What are the limitations of computers?

Let me take this opportunity to give my definition of WISDOM. Wisdom is the CORRECT application of Intelligence. The “great guru” on the mountain (parable) represents wisdom. Why do people trek great distances and endure much discomfort to ask him a question? Because he gives the CORRECT answer. If you wanted B.S. answers, you could ask the guy next to you.

This is the issue: computers have NO wisdom! The act of programming is to instill wisdom in the computer, so it can give correct answers. Thus Artificial Intelligence SHOULD be called Artificial Wisdom, which then reflects how difficult the topic is. Common sense ain’t so common . . .

So as fast and small as you can make computers, they will ALWAYS lack the wisdom factor. Anything they can do reflects on the wisdom and skill of their programmers – thus HUMANS.

Do you have stats for the stated average cell phone replacement interval? My impression is that most people replace their phones whenever their carrier subsidy allows them to, which would be every 24 months or so, not every 18 months maximum, as you said.

[…] I, Cringely likes to provoke. His latest doesn’t disappoint. Cringely offers some provocative thoughts on how the revolution in mobile computing may overcome its logistical problems with tiny user interfaces. Via contact lenses? No way. How about this: So there’s a platform transition happening. We’re in the middle of it. The new platform is a mobile interface to a cloud network. And the way we’ll shortly communicate with our devices, I predict, will be through our thoughts. By 2029 (and probably a lot sooner) we’ll think our input and see pictures in our heads. […]

James D
November 19, 2009 at 1:01 am

Recently there was a demonstration of how EEG could be used to think out letters and enter them into a computer. The data rate was not very high but at least it’s proof that controlling computers with our thoughts is already within the realm of current technology. What I know about the brain, however, leads me to believe this is not going to be practical; it’s “doable” but it won’t be convenient.

I think what is more likely as far as input are new variations on the current point-and-click mouse and keyboard interface. Instead of a mouse it will key off of eye movements and blinks. For a keyboard you will wear a ring which interprets your hand movements and translates them into characters. In a strange way the Wii game system may be leading the way as far as futuristic input devices. Instead of typing you just spell the letters out in the air. It’s perfectly natural – perhaps even more natural than a keyboard.

Mark
November 19, 2009 at 7:15 am

I am dreading the day that our teenagers can post status updates on facebook simply by THINKING them!

Eugene King
November 19, 2009 at 12:11 pm

I didn’t read all the comments so please forgive me if this was mentioned earlier..
Since Bob mentioned Star Trek, do you folks remember the Next Generation episode where aliens were using a gaming device to take over the Enterprise?
The device shined beams into the users’ eyes.

Thanks. Cringely is very observant and insightful about the future of personal computing and “assistant” devices.

Gives me an opportunity to think, over lunch/coffee…

a few of my thoughts

Mobile right now seems more about instant and easy remote communications, more access to “cloud” info, and time shifting (text, or whenever-we-miss-each-other messages). It is apparent that computing is more and more about storage of, or access to, information, which ultimately will all be in the “cloud”. The “singularity event” has probably happened for most people with PCs (i.e., more processing and info capacity than their brain). Quote: “thanks to Ray Kurzweil’s singularity — that point at which everyday machines have more computing cycles than I do”.

Ok, I admit my device is “smarter” than me (in some ways:)

(Expanding on this discussion) it’s great to have all this processing and easy mobile info access, which is definitely more “efficient” (which businesses like :). However, from my viewpoint, these devices are not doing much to make us wiser, give us higher values or healthier choices, shape more peaceful relationships, or strengthen families. While it is not the device or computer companies’ responsibility, we also badly need better “life guidance programming” on how to live with these devices, for “improved people” and a better world as a result.

So technology, while having many benefits and even connecting our minds, is not shaping our hearts. We need enhancements to both for transforming ourselves and thus the world.

In sum, I like my cell phone and PC, and will sign up for combining them with more power… but will not trade in my “old Bible” (or online versions), or a kiss when I get home at night, or even good old-fashioned advice from a friend.

my soapbox philosophy

JohnP

Alain
November 19, 2009 at 4:21 pm

Hey Cringely, you’ve got the right producer but the wrong show. Roddenberry was behind both the original Star Trek and the subsequent iteration, Star Trek: The Next Generation. In the original Star Trek, Captain Kirk was too busy kissing the girls and getting into fights with the bad guys to have much time to talk to computers. When the captain wanted something done he’d tell a subordinate to do it, not a computer. Sometimes he or a subordinate would dramatically push a button or flip a switch. But most of the time they would talk in snappy dialogue about what they could wring out of the machine — “Scotty, you’ve got to give me two more minutes at warp X” — “I’ll do my best, captain, but I canna let those engines go on for much longer.”

It’s in the next iteration, Star Trek: The Next Generation, that you had Captain Picard always saying “computer, do this” or “computer, do that”, or sometimes other members of the ensemble cast would do it, instead of going for the dramatic interplay with subordinates who did the button pushing or engine room sweating.

The reason why you never had visual interfaces had much to do with the lack of drama in them and also the cost of the special effects needed. Just doing the fairly “dumb” cel animation or the dumb colored-filter work needed for interfaces in 2001: A Space Odyssey was a very time-consuming affair, mostly because of setup work by visual artists. Eventually getting a lot of fancy digital effects computers simplified things, but it never eliminated the long setup time by visual artists.

Dave Alford
November 19, 2009 at 9:34 pm

Yes, I’ve ruminated for years on the “PC in the watch” scenario, but at the end of the day, it’s always the damn interface that’s the problem.

A few years ago I got to try out a ‘wearable computer.’ It had a postage-stamp-sized eyepiece for a display. The CPU was about the size of two decks of cards. The interface was a small handheld trackball. The trackball was the most interesting and useful part of the whole thing. The problem was simple — it is very hard to focus on an XGA screen that is 2 inches from your eye. You can see the icons and menu options. The innovative trackball made it easy to select things. You can open a word processor and open a file. Actually “reading” the document proved to be very hard. The display, while a very nice piece of engineering, was not really practical.

One of my favorite movies from 1967, The President’s Analyst, had an interesting concept — the cerebrum communicator. It is on YouTube.

Roger
November 20, 2009 at 7:23 am

Has anyone told this to Willie Nelson? I bet not, – AND – I bet he wouldn’t believe it just like I don’t.

Fantasies of futures dominated by artificial intelligence are irresistible, and John is especially vulnerable to them. One thing that one really needs to bear in mind is the economic impetus for such developments. Is infinite computing power economically viable or even feasible? Desirable?

Microvision makes fantastic products, but how many have crossed from the airline-catalog-oddity category into the mainstream? At one point, the Palm PDAs were thought to be the epitome of design, but they gave way to other (inferior?) designs.

In as much, the future of the mobile computing platform will ultimately be determined by what services and apps people are willing to pay for. We sell iPhone apps, and I have a damn good idea that nobody is making any money off the apps yet. And, all of us are wondering what that killer app will be.

For the PC, the killer app was VisiCalc. On just about any platform, the killer apps remain horrendously mundane: browser, word processor, photo organizer, media player, email (maybe same as a browser) and maybe a spreadsheet program.

This explains Kurzweil’s and Hertzfeld’s lament mentioned above: both are very impressed with what computers can do, but very disappointed with what they have become. Well, if you look at the list of killer apps above, you see the answer: computers have become what we want them to be.

So, the fundamental question is, does the market want a machine that thinks for the user? Though we all wish that they would, even Sarah Palin and George W Bush won’t farm out the task of thinking to their computers.

In summary, let’s not forget that capitalism can be a powerful enforcer of mediocrity. You can make the most powerful and the most intelligent computer in the universe. If you can’t sell it, then you won’t be able even to power it.

Neil in Chicago
November 22, 2009 at 10:35 am

The “smart phone” is the true Second Generation Personal Computer. Clearly, it satisfies the information needs and desires of a lot of people. That is, it does everything they want when they’re not at work, where they use desktop systems. In the office, you use Office(tm).
So instead of its cash cows dominating the universe, suddenly Microsoft’s cash cows dominate only one galaxy in a suddenly expanded universe . . .
There you go, a hook for a whole ‘nother column.

Steve
November 22, 2009 at 9:35 pm

“Twenty years ago was Windows 3.0 and Mac OS 6.”

Doesn’t really help your point. The interface and OS ain’t that much better after 20 years. E.g., it took Windows 7 twenty years to provide an easy command for putting two windows side by side (and then they have the nerve to brag about it in a TV commercial), even though 3.0 was called WindowS with an s.

Eric
November 22, 2009 at 10:51 pm

My problem with the Singularity: though it makes a great platform for current hard science fiction (as opposed to pop sci-fi) and a foil for post-Singularity fiction, it’s a canard. I’m not putting a copy of me in a computer, because in the end the mind living in the computer is a copy of you, not you. You’re going to die. A copy of the Mona Lisa is worth way less than the original for good reason.

Kurzweil is at best a pop-science tale-spinner. The basis of his claims is no more valid than the techno-babble in Star Trek.

ed mcguirk
November 23, 2009 at 11:42 am

Your mind on the computer does not get there by copying. It starts with an electronic prosthesis to your mind. It might only be a calculator or clock or phone, but eventually you get more and more electronic functions added to your biological mind.

Then one day you notice that more of your mind is electronic than biological.

Then maybe a long time after that, you decide that the biological portion of your mind is an inconvenient limitation and you drop it.

ed mcguirk
November 23, 2009 at 11:46 am

Or maybe your electronic portion of your mind gets so spread out and diversified that you just forget where you left your tiny little biological bit of brain.

Excellent observations and thoughts. I can easily imagine contact lenses as displays. But I still do not see what can replace the keyboard. A keyboard is nothing but a symbolic logic interface. I cannot speculate how it can be replaced, yet.

Nick
November 23, 2009 at 2:15 pm

A thought-based UI is orders of magnitude more of an advancement than Mac OS 6 to 10.6. Bob, you’ve said before that we tend to overestimate technology’s influence in the short term and underestimate it in the long term (or something like that). I think you’re overestimating change here. For example, apart from mobility, in what way is SMS more advanced than the voice-based communication of the 1950s phone system? I bet in 20 years mobile technology will do much more by expanding what we use it for, but if we need to write a college thesis there will still be a QWERTY keyboard and some kind of word processor involved.

You are looking too hard at the problem. Most hotels have flat-screen TVs. A VGA/DVI output would make for an excellent large display while traveling. A folding or small keyboard with a mouse would round out what you need in the hotel room. On the go the small screen is sufficient.

[…] Shipping More Units It’s important to understand just how quickly things are changing. Part of this comes down to the hardware replacement cycle for these devices. A PC generation is traditionally 18 months long and most of us are unwilling to be more than two generations behind, so we get a new desktop or notebook every 36 months. Mobile devices don’t last that long, nor are they expected to. The replacement cycle is 18 months, reinforced by customer contract terms that give us a new device every couple of years in return for staying a loyal customer. Mobile hardware generations last nine months, and 18 tends to be the maximum time any of us use a single device. […]

Michael
November 25, 2009 at 2:12 pm

Regarding the problem of input, those tiny keyboards have caused the adaptation of language to overcome their limitations: witness a teen’s texting. It is an evolving, bona fide dialect with a grammar and a social glue to its users. We have always adapted instinctively to our unyielding devices. Those screens, however…

Doug
November 29, 2009 at 8:27 am

The thought interface is intriguing for me, to say the least. Only price has prevented me from owning a BrainFingers(tm) headband and doing home research into mind-over-machine control. Only the bulkiness of heads-up displays has prevented me from home research into mind-over-machine feedback. Both of these are the same old price-point and march-of-progress design barriers to entry that Bob has so masterfully communicated time and again. For me the Singularity represents the true human 2.0, where our physical 1.0 five-senses, mind-in-body control begins to be supplanted by the sixth sense of mind-out-of-body control, something I would suggest we’ve all been dreaming of (and using to get from A to B all these eons), and what seems to me to be the principal driver of all of this innovation in the first place. For me the hand (keyboard), eye (monitor) and ear (auditory) interfaces of human 1.0 have always been where my 2.0 brain has been trapped; I’ve been handicapped all my life by bodily limitation.

[…] and Apple for the next generation of computing (a mobile generation, as Robert X. Cringely reminds us), Google seems to be willing to compromise on means as long as the ends are the same. For them web […]

[…] an informed spectator of the digital world. Robert X. Cringely suggests that the solution is Pictures in Our Heads. He too sees a huge growth for mobile devices since the purchasing cycle is rapid and new […]

Paul
December 5, 2009 at 9:53 am

Random thoughts…

While we’ve been making computers “smarter,” people have been getting dumbed down by advertising, by bad education, by political propaganda, etc. As Dave Frishberg sang, we’re “marooned in a blizzard of lies.”

Even honest research in honest journals is so overwhelming in quantity that it’s hard to sift through enough of it to find the gems. And worthwhile stuff can easily end up ignored. Who knows if a great general solution to some serious current problem was published in 1952 and forgotten? It doesn’t help that, fueled partly by the factors mentioned in the previous paragraph, we have such an attitude of “tomorrowism” that an idea from last night gets dismissed as “Oh, that’s so 12 hours ago!”

Maybe it will become trendy to respect the best of the past again, and “tomorrowism” will eventually be dismissed as “so 2009.”

Computers may become more clever, but can they become wise?

I suppose that the Singularity means that computers will be so powerful that we’ll be able to test the “million monkeys typing out Shakespeare” and other Darwinian ideas, that random variation can eventually lead to higher and higher intelligence. But what happens if it doesn’t work? Then we’re left with a big question.

[…] bypass it and go straight to thought-controlled computers. On our own planet, labs are pursuing thought-controlled UIs, and it’s easy to believe that this will be the ultimate UI: no words, no actions at all, […]

This sounds like science fiction, but I guess it could become reality. It would certainly be a strange concept to see it inside our minds, and I can’t really imagine it yet, but then I guess all great things are hard to imagine.
