I know it's rare that we have a technical discussion here at UR. But every once in a while the urge prevails. If nothing else, it attracts the right people to the rest of the cult.

(For readers of that Wolfram Alpha post, it seems almost superfluous to remotely diagnose today's tech-media darling, Siri, as yet another tragic case of the hubristic user interface (HUI). Then again, if anyone can pull off hubris and exceed the gods themselves... but in my much-refurbished crystal ball, here is what I see: Siri works beautifully 98% of the time. The other 2%, it screws up. Half of these screwups are hilarious. 1% of the hilarious screwups are blogged about. And that's enough bad PR, fairly or not, to restrict usage to a fringe. As with all previous attempts at voice-controlled computing. Open the pod bay doors, Hal.

No tool can achieve the natural "bicycle for the mind" status of a mere mental peripheral, unless the mind has an internal model of it and knows when it will work and when it won't. This cannot be achieved unless either the intelligence inside the tool is genuinely human and thus understood by empathy, or the actual algorithm inside the tool is so simple that the user can understand it as a predictable machine. Between these maxima lies the uncanny valley - in which multitudes perish.

The only exemption from this iron law of expensive failure, a voracious money-shark that has devoured billions of venture dollars in the last decade, is a set of devices best described, albeit pejoratively, as "toys" - applications such as search, whose output is inherently unpredictable. Ie, inherent in the concept of search is that your search results are generated by an insane robot. This is not inherent in the concept of a personal assistant, however. Also, while search results are inherently heuristic - search queries are inherently rigorous.)

In any case - computers. When I went to grad school to lurn computers, it was way back in 1992. I was pretty sure that, in the future, we would have cool shit. Instead, twenty years in the future, I find myself typing on... the same old shit. Yo! Yo nigga, this bullshit, yo.

It would be one thing if all the bullshit actually worked. Or at least if it didn't suck. At the very least, it would be better if our entire system software architecture - from the '70s database at the ass end of the server room, to the '90s Flash player playing ugly-ass Flash in your face - though it sucked giant tubes of ass like Hoover Dam in reverse, at least sucked them in secret. At least no one would know it was ass.

It's the 21st century. We should be soaring like eagles above the 20th-century legacy bullshit, expressing only the purest of functions in the pure language of mathematics. But somehow it hasn't happened. The technology just isn't there, or at least it isn't deployed. All we have is the same old assware, and no alternative but to live in its crack. Brendan Eich took what, two weeks, to build Javascript? And it has no long integers - just floating point. Millions of brown hours, deep in Brendan Eich's valley. To be fair, the fellow appears to be sorry. Not that this helps.
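The damage is easy to demonstrate. JavaScript's only numeric type is the IEEE-754 double, and since Python's float is the same 64-bit format, a few lines of Python reproduce the failure exactly:

```python
# JavaScript numbers are IEEE-754 doubles; Python's float is the same
# 64-bit format, so it exhibits the identical integer breakdown.
big = 2.0 ** 53  # 9007199254740992.0, the last exactly-safe integer

# Adding 1 is silently lost: the result rounds back to 'big'.
print(big + 1 == big)  # True

# Above 2**53, only every second integer is representable.
print(int(big + 2) - int(big))  # 2
```

Any long integer above 2**53 - a database row ID, say - is silently corrupted the moment it passes through such a number type.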

So - we're just going to assume that God won't tolerate this shit. Not that he spares the rod. But there's always a limit. So we're just going to pick an arbitrary year, 2020, by which the 20th-century assware will all be gone, all of it. And software will instead be great. From top to bottom, server to client, cloud to mobile, end to end and ass to elbow. (Note that 2020 is two years before the famous HTML 5 deadline.)

The question then becomes: with this great new software infrastructure, scheduled for 2020, what the heck will we be doing? How will we be living our wondrous 2020 digital lives?

I actually have an answer to the question. The answer is: personal cloud computing. I mean, duh. Yes, I know it sounds like yet another Palo Alto buzzword. Blander, in fact, than most. Google even finds it, a little bit, in reference to various BS.

Actually, I think the transition from 2011 computing to 2020 computing - if 2020 computing is personal cloud computing, as per my buzzword, which I claim in the name of Spain - should be at least as radical a disruptive break as any previously experienced in the seismically unstable Valley of Heart's Delight.

Consider a classic San Andreas tech-quake: the transition from minis to PCs. Cloud computing in 2011 is a lot like all computing in 1971. Why? Because industrial and consumer computing products are entirely disjoint. In 1971, you can buy a PDP-11 or you can buy a pocket calculator. The PDP-11 is industrial equipment - a major capital expenditure. The pocket calculator is a toy - an information appliance. The PC? The concept is barely imaginable.

In 1971, you already exist as a database row in various billing and banking systems. (I lived in Palo Alto in 1976 when I was 3. My parents apparently had Kaiser. When an employer in the late '90s put me on Kaiser, I was amazed to be asked if I still lived on Alma Street.)

What's an information appliance? Any fixed-function device, physical or virtual, whose computing semantics the user does not control. An IA is anything that processes data, but is not a general-purpose computer. (A non-jailbroken smartphone is about half an IA and half a computer, because the user controls the apps but not the OS, and the interface is app-centric rather than document or task-centric - the OS as a whole is little more than an app virtualizer, ie, a browser.)

To generalize slightly, we can say that in 1971, there was a market for industrial computing, and there was a market for information appliances. Not only was the connection between these two product lines roughly nil, it was more than a decade before the PC emerged to replace "smart typewriters," and the early 2000s before Linux effectively merged the PC and workstation markets.

Today, in the cloud, you can go anywhere and rent a virtual Linux box. That's industrial computing. You also have to cower in your closet, like me, to avoid having a Facebook profile. That's a virtual information appliance. So is just about any other consumer cloud service. Therefore, we have industrial cloud computing that isn't personal, and we have personal cloud computing that isn't computing.

In fact, if you use the cloud at all seriously, you probably have 10 or 20 virtual information appliances - each one different and special. If you are especially well-organized, you may have only two or three identities for this whole motley flock, along with seven or eight passwords - at most four of which are secure. Welcome to the wonderful new world of Web 2.0. Would you like some angel funding? Ha, ha.

In the future, in 2020, you don't have all these custom information appliances, because you have something much better: an actual computer in the sky. Instead of using Web services that run on someone else's computer, you use your own apps running on your own (virtual) computer.

I realize - it's completely wild, unprecedented and groundbreaking. But let's look at an example.

Let's imagine 2011 software is 2020 software, so we can see how this works. In 2020, of course, you use Facebook just like you do now. Facebook still rules the world. Its product is a completely different one, however - personal cloud computing.

This started in 2012, when Facebook introduced a new widget - Facebook Terminal, your personal Ubuntu install in the cloud. Everyone's Facebook profile now includes a virtual Linux image - a perfect simulation of an imaginary 80486. Users administer these VMs themselves, of course. In the beginning was the command line - in the end, also, is the command line. Moreover, just because it's run from the command line on a remote server - doesn't mean it can't open a window in your face. If you're still reading this, you've probably heard of "xterm."

Terminal will simply bemuse Joe Sixpack at first - Facebook's user base having come a long way from Harvard CS. But Joe learned DOS in the '80s, so he'll just have to get used to bash. At least it's not sh. It has history and completion and stuff.

Furthermore, Joe has a remarkable pleasure awaiting - he can host his own apps. All the cloud apps Joe uses, he hosts himself on his own virtual computer. Yes, I know - utterly crazed.

Today, for instance, Joe might use a Web 2.0 service like mint.com. Beautifully crafted personal finance software on teh Internets. Delivered in the comfort and safety of your very own browser, which has a 1.5GB resident set and contains lines of code first checked in in 1992. Your moneys is most certainly safe with our advanced generational garbage collector. Admire its pretty twirling pinwheel as merrily your coffee steeps. Mozilla: making Emacs look tight and snappy, since the early Clinton administration.

But I digress. Where is Joe's financial data in mint.com? In, well, mint.com. Suppose Joe wants to move his financial data to taxbrain.com? Suppose Joe decides he doesn't like taxbrain.com, and wants to go back to mint.com? With all his data perfectly intact?

Well, in 2011, Joe could always do some yoga. He's got an ass right there to suck. It's just a matter of how far he can bend.

Imagine the restfulness of 2020 Joe when he finds that he can have just one computer in the sky, and he is the one who controls all its data and all of its code. Joe remembers when King Zuckerberg used to switch the UI on him, making his whole morning weird, automatically sharing his candid underwear shots with Madeleine Albright.

Now, with Facebook Terminal, Joe himself is King Customer. His Facebook UI is just a shell - starting with a login screen. Joe can put anything in his .profile or even fire it off directly from /etc/rc. He changes this shell when he damn well pleases. And where is his personal data? It's all in his home directory. Jesus Mary Mother of God! It can't possibly be this easy. But it is. So if he wants to switch from one personal finance app to another - same data, standard data, different app. He's a free man.
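Here is a sketch of what "same data, standard data, different app" means in practice. The ledger format and both "apps" are hypothetical; the point is that they share nothing except a file sitting in Joe's home directory:

```python
# Sketch: Joe's financial data as one standard CSV file in his home
# directory. The format and both "apps" below are invented for
# illustration -- what matters is that they share only the data.
import csv, io

LEDGER = "date,payee,cents\n2020-01-03,grocer,-4250\n2020-01-05,salary,250000\n"

def mint_style_balance(ledger_text):
    # One hypothetical finance app: computes Joe's running balance.
    rows = csv.DictReader(io.StringIO(ledger_text))
    return sum(int(r["cents"]) for r in rows)

def taxbrain_style_income(ledger_text):
    # A competing hypothetical app: reads the very same ledger.
    rows = csv.DictReader(io.StringIO(ledger_text))
    return sum(int(r["cents"]) for r in rows if int(r["cents"]) > 0)

# Same data, different app -- no export wizard, no lock-in.
print(mint_style_balance(LEDGER))     # 245750
print(taxbrain_style_income(LEDGER))  # 250000
```

Switching apps means pointing a different program at the same file, not begging one vendor to export and another to import.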

Suppose Joe wants to go shopping on teh Internets? He doesn't fire up his browser and go to amazon.com. He stays right there on Facebook Terminal and runs his own shopping application on his own virtual Linux box. Heck, he probably built it from source and tweaked the termcap handling and/or optimization flags. (It's a general principle that anything written for termcap won't work on terminfo, even if it says it will.) Through an ASCII curses telnet in his Facebook Terminal - or, better yet, a Javascript X server in a Mozilla tab - he executes his shopping application (in C++ with OSF/Motif - that ultra-modern 3D look).

How does Joe's shopping application, which he hosts himself on Facebook Terminal, communicate with Amazon and other providers? Of course, book distributors in 2020 no longer write their own UIs. They just offer REST APIs - to price a book, to search for books, to buy a book. All of online shopping works this way. The UI is separate from the service. The entire concept of a "web store" is so 2011. Because Joe controls his own server, he can use classic '90s B2B protocols when he wants to replenish his inventory. I wouldn't at all rule out the use of SOAP or at least XML-RPC.
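A minimal sketch of the shape of such a service, using only the Python standard library. The /price endpoint, the ISBN, and the one-book catalog are all invented for illustration:

```python
# Sketch: a book-pricing REST service and a UI-independent client,
# stdlib only. The /price endpoint, ISBN, and catalog are invented.
import json, threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

CATALOG = {"0131103628": {"title": "The C Programming Language",
                          "price_cents": 4999}}

class BookAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /price/<isbn> -> JSON record, or 404 if unknown.
        isbn = self.path.rsplit("/", 1)[-1]
        book = CATALOG.get(isbn)
        body = json.dumps(book or {"error": "not found"}).encode()
        self.send_response(200 if book else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), BookAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Joe's self-hosted "shopping app" is just a client of the API.
with urlopen(f"http://127.0.0.1:{port}/price/0131103628") as resp:
    record = json.loads(resp.read())
server.shutdown()
print(record["price_cents"])  # 4999
```

The client at the bottom could just as well be a curses app or a Motif one; nothing in the service dictates a UI.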

So. We have a problem here, of course, because Facebook Terminal is a joke. If Facebook users were a group of 750 million "original neckbeards," the system above would be the perfect product. Also the world would be a very different place in quite a number of ways. But let's continue the thought-experiment and stick with this spherical cow.

Consider the difference between the imaginary Facebook Terminal and the real Facebook Connect. The former is a platform - the latter is a "platform." There is a sort of logical pretence, at the user-interface layer, that a third-party site which uses Facebook authentication to commit, of course with your full cryptographic approval, unnatural acts upon your private data, is "your" app in just the same sense that an app on your iPhone is "your" app.

But you control one of these things, and not the other. When you host an app, you own the app. When you give your keys to a remote app, the app owns you. Or at least a chunk of you.

It's almost impossible for a Web user of 2011 to imagine an environment in which he actually controls his own computing. An illustrative problem is that chestnut of OS designers, cross-application communication. Look at fancy latest-generation aggregators like Greplin or ifttt. These apps work their tails off to get their hooks into all your data, which is spread around the cloud like the Columbia over Texas, and reconstruct it in one place as if it was actually a single data structure. Which of course, if you had a personal cloud computer - it actually would be. And "if" and "grep" would not seem like gigantic achievements requiring multiple rounds of angel funding, now, would they?
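To see how small the "aggregation" problem becomes on a real computer, suppose Joe's mail and finance data really were files under one home directory (a temporary directory stands in for it here):

```python
# Sketch: once all your cloud state lives as files in one home
# directory, cross-application search is a recursive grep.
# A temp directory stands in for Joe's hypothetical $HOME.
import pathlib, re, tempfile

home = pathlib.Path(tempfile.mkdtemp())
(home / "mail").mkdir()
(home / "finance").mkdir()
(home / "mail" / "msg1.txt").write_text("lunch on Tuesday?")
(home / "finance" / "jan.txt").write_text("invoice #442 paid")

# The entire "aggregator": every file, one pattern, one namespace.
hits = sorted(p.name for p in home.rglob("*.txt")
              if re.search(r"invoice", p.read_text()))
print(hits)  # ['jan.txt']
```

That is the whole product: a recursive grep over a unified namespace, no hooks, no crawlers, no angels required.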

The Facebook of 2011 - and more broadly, the Web application ecosystem of 2011 - is not a personal cloud computer, because it's not a computer. Generalizing across your state in Facebook itself, plus all the external apps that use your Facebook identity, we see a collection of virtual information appliances, mutually unaware and utterly incompatible.

Even if Facebook becomes the universal authentication standard of the Web, a feat it would surely like to achieve, and surely a great advance at least in usability over the status quo, its users' lives in the cloud would not be anything but a disconnected salad of cloud information appliances. They would not have a personal cloud computer, or anything like one. Moreover, if one of these information appliances somehow evolved into a general-purpose computer, its users would realize that they no longer needed all the other information appliances.

Comparing the consumer cloud computing of 2011 to the personal cloud computing of 2020 is like comparing the online-services world of 1991 to the Web world of 2000. It's easy to forget that in 1991, Prodigy was still a big player. Prodigy: the Facebook of 1991. In 1991, you could use your 2400-baud modem to call any of a number of fine online services and other BBSes. By 2000, your 56K modem called only one thing: the Internet. The Internet, seen from the perspective of the Bell System, was the killer online service that killed all the other services.

Another difference between 2011 and 2020 is the business model. The Web 2.0 business model is first and foremost an advertising model. Or at least, that is the foundation on which this present boom has been built. Yo, bitches, I've seen a few of these booms.

Advertising is a payment model for information appliances. Your TV is an appliance. You see ads on your TV. Your PC is not an appliance. You'd find it shocking, disgraceful and pathetic if the new version of Windows Vista tried to make money by showing you ads. In fact, there have been attempts at ads on the PC - in every case, heinous, tacky and unsuccessful.

Advertising ceases to exist where an efficient payment channel arises. Why does TV show ads? Because the technical medium does not facilitate direct payment for content. It would be much more efficient for the producers of a new show to charge you fifty cents an hour, and most people would easily pay fifty cents per hour to never have to even skip past ads. Or to put it differently, fairly few people would choose to watch ads for fifty cents per hour.

Thus, if payment is straightforward, the whole inefficient pseudo-channel of advertising evaporates and the digital Mad Men are out on their asses. Taste the pain, algo-bitches! (There's only one thing I hate more than algorithms: the pencil-necked geeks who are good at algorithms.)

In 2020, how does Joe pay for computing? He pays for three things: content, code (ie, content), and computing resources. Probably his ISP is his host, so that's a very straightforward billing channel for resources, easily extended to code/content. Joe would never even dream of installing an app which showed him ads. So there's no use in figuring out what his buying patterns are, is there? Sorry, Mad Men. Go back to the math department.

Consider search in 2020. In search, too, PCC (not to be confused with proof-carrying code) separates the UI and the service. Joe uses one search app, which can be connected to any number of remote back-ends. If he doesn't like Google's results, he can Bing and decide, without changing his user experience at all. Result: brutal commoditization pressure in the search market, which has to bill at micropennies per query and has no channel for displaying ads - except in the results, which sucks and won't happen. Consider Mexican bikers, cooking meth in a burned-out Googleplex.
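The separation is trivial to express once it exists. Here is a toy version with two stub back-ends; the provider functions are fakes, not real APIs:

```python
# Sketch: a fixed search UI with pluggable remote back-ends.
# Both "providers" here are stubs invented for illustration.
def google_backend(query):
    return [f"google result for {query!r}"]

def bing_backend(query):
    return [f"bing result for {query!r}"]

def search(query, backend):
    # One user experience, any provider: render whatever comes back.
    return [f"* {hit}" for hit in backend(query)]

# Swapping providers changes nothing about the UI.
print(search("overlay networks", google_backend)[0])
print(search("overlay networks", bing_backend)[0])
```

When the front end is Joe's and the back-end is a commodity behind an API, a provider has nowhere to hang an ad.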

Alas! All that is great passes rapidly away. In this imaginary 2020, we see nothing left of Silicon Valley's existing corporate giants, except possibly a Facebook on steroids, whose information-appliance profiles have morphed into virtual Linux instances. Death by commoditization. Hey, it wouldn't be the first time.

But wait! Can this actually happen? Is it really possible to turn everyone's Facebook profile into a general-purpose computer? Frankly, I doubt it. If I worked at Facebook, which of course I don't, I would be extremely skeptical of Facebook Terminal, for reasons I think are quite obvious.

In real life, this apocalypse just isn't going to happen. In real life, 2020 will be pretty much just like 2011. And why? Because we just don't have the software technology to build 2020. And we're probably not about to get it, either.

Let's look at this issue in a little more detail. But the point is obvious. Hosting mint.com is pretty much a full-time job for the guys at mint.com. Expecting Joe Sixpack to download their code, for free or for pay, and set up his own server, is just absurd.

Of course, Joe is unlikely to have a serious load issue on his private server - because he's the only user. But still, Joe is not an Ubuntu administrator, he doesn't want to be an Ubuntu administrator, and frankly he probably doesn't have the raw neurological capacity to be an Ubuntu administrator. Scratching his balls, booting MS-DOS and typing "copy a:*.txt b:" is about the limit of Joe's computational ambitions and abilities. You could put a visual interface on his console, but frankly, this would probably only confuse him more. I want to serve Joe's needs, but I won't let myself overestimate his qualities.

We're starting to answer the essential question here: why hasn't personal cloud computing already happened? Why doesn't it work this way already? Because frankly, the idea is obvious. It's just the actual code that isn't there. (Here is the closest thing I've seen. Let's hope Joe Sixpack is a good node.js sysadmin.)

Let's go back to 1971. The idea of a personal computer was also obvious to people in 1971. Moore's Law was reasonably well understood in 1971. So it was clear that, if in 1971 you could build a PDP-11 the size of a refrigerator and sell it for $20,000, in 1981 it would be possible to build a PDP-11 that fit under a desk and cost $2000.

But this irresistible logic ran into an immovable object. Who wants a PDP-11 on their desk? The PDP-11 evolved into the mighty VAX. Who wants a VAX on their desk? Even if you can build a VAX that fits on a desk and costs $2000, in what way is this a viable consumer product? It's not, of course. Similarly, turning 700 million Facebook profiles into virtual Ubuntu images is not, in any way, a viable product strategy - or even a sane one.

The "Facebook Terminal" example is ridiculous not because the idea of personal cloud computing is ridiculous, but because "Facebook Terminal" is a ridiculous product. Specifically, the idea that, to build a virtual computer in 2011, we should design a virtual emulation of a physical computer first produced in 1981, running an OS that dates to 1971, cannot fail to excite the mirth of the 2020 epoch. (And I say this as one who still owns a copy of the seminal BSD filesystem paper, autographed by Keith Bostic.)

Again: who wants a PDP-11 on their desk? Here we encounter Gall's law:

A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.

If you want an Apple II, you don't start by shrinking a PDP-11. You have to build an Apple II. If you want not an Apple II but rather an electronic typewriter, there's a market for that. I recall that market. In the long run I'm afraid it didn't compete too well with the PC.

But why was the Apple II simple? Because its inventors understood Gall's law, or at least its Zen? Well... possibly. But also, simply due to the limitations of the hardware, it had to be. Early microcomputers simply did not have the computational power to run a PDP-11 OS. Thus, there was no choice but to build a new software infrastructure from very simple roots.

This is of course a notable contrast from our present age, in which your Ubuntu image, carried on the back of a sturdy Xeon, smiles cheerfully from under its seven gigabytes of crap. The Xeon can run seven gigabytes of crap - but Joe Sixpack cannot manage seven gigabytes of crap. Amazing things, of course, are done with this assware. Amazing things were also done with VMS. Amazing things were done with Windows. Don't take it personally.

So: we've identified one existential obstacle to personal cloud computing. We don't have a cloud operating system, or anything like it, which could be remotely described as simple enough to be "personal" - assuming said person is Joe Sixpack and not Dennis Ritchie. No OS, no computer, no product, no business. The thing simply cannot be done. And Gall's law says we can't get there from here.

But actually it's not the only such obstacle. If we somehow surmounted this obstacle, we would face another insurmountable obstacle. It's not just that we need a new OS to replace Unix - we also need a new network to replace the Internet.

Managing a personal server in the sky is much harder than managing a phone in your pocket. Both run apps - but the personal cloud computer is a server, and the phone is a client. The Internet is already a bit of a warzone for clients, but it's digital Armageddon for servers. You might as well send Joe Sixpack, armed with a spoon, into the Battle of Kursk.

An Internet server is, above all, a massive fortified castle in alien zombie territory. The men who man these castles are men indeed, quick in emacs and hairy of neck. The zombies are strong, but the admins are stronger. They are well paid because they need to be, and their phones ring often in the night. Joe is a real-estate agent. No one calls him at 3 in the morning because Pakistani hackers have gotten into the main chemical supply database.

So long as this is true, it really doesn't matter what software you're running. If network administration alone is a job for professionals - and on a real computer, user-installed apps talk to foreign servers directly, and vice versa - then no cloud computer on this network can conceivably be personal. It is an industrial cloud computer, not a personal one.

So: serious problem here. By 2020 - two years before the apotheosis of HTML 5 - we're going to need (a) a completely new OS infrastructure, and (b) a completely new network. Or we can also, of course, remain in our present state of lame.

Can it be done? Why, sure it can be done. If anything, we have too much time. The simple fact is that our present global software infrastructure, ass though it be, is almost perfectly constructed for the job of hosting and developing the upgrade that replaces it. All we have to do is make sure there is an entirely impermeable membrane between assware and the future. Otherwise, the new infrastructure becomes fatally entangled with the old. The result: more ass.

Assware has one great virtue: ass is easy to glue. All useful software today is at least 37% pure glue. You can just count the spaces between the letters. For instance, when we see a LAMP stack, we see four letters and three gallons of glue.

It is perfectly possible to create and even deploy an entirely new system software stack, so long as it entirely eschews the charms of Unix. If your new thingy calls Unix, it is doomed. Unix is like heroin. Call Unix once - even a library, even your own library - and you will never be portable again. But a Unix program can call a pure function, and indeed loves nothing better. You can't use ass, but ass can use you.

When people created the first simple operating systems from scratch, they had only simple computers to build them on. This enforced an essential engineering discipline and made the personal computer possible. No force enforces this discipline now, so there is no economic motivation for creating simple system software stacks from scratch.

As for new networks - phooey. Layering a new peer-to-peer packet network over the Internet is simply what the Internet is designed for. UDP is broken in a few ways, but in no way that can't be fixed. It's simply a matter of time before a new virtual packet layer is deployed - probably one in which authentication and encryption are inherent.

Putting our virtual computer on a virtual overlay network changes the game of the network administrator, because it splits his job into two convenient halves. One, the node must protect itself against attacks on the underlying network by attackers without legitimate credentials for the overlay network. Two, the node must protect itself from attacks by legitimate but abusive overlay users.

Job one is a generic task - DOS defense of the most crude, packety sort - and can be handled by Joe's ISP or host, not Joe himself. Attacking an overlay node at the Internet level is a lot like trying to hack an '80s BBS by calling up the modem and whistling at it. Job two is a matter for the network administrators, not Joe himself. All of the difficulty in securing the Internet against its own users is a consequence of its original design as a globally self-trusting system. So again, we solve the problem by washing our hands completely of any and all legacy assware.
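A toy of what "authentication inherent in the packet layer" might look like - the pre-shared key and the packet framing here are purely illustrative:

```python
# Sketch: an overlay packet layer where every packet carries an HMAC
# and uncredentialed traffic is dropped before application code runs.
# The pre-shared key and framing are toys, not a real protocol.
import hmac, hashlib

KEY = b"overlay-session-key"  # hypothetical pre-shared credential

def seal(payload: bytes) -> bytes:
    # Prefix the payload with a 32-byte SHA-256 HMAC tag.
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return tag + payload

def accept(packet: bytes):
    # Verify the tag in constant time; reject forgeries outright.
    tag, payload = packet[:32], packet[32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None

print(accept(seal(b"hello")))            # b'hello'
print(accept(b"\x00" * 32 + b"spoof"))   # None: dropped at the layer
```

Traffic without a valid credential never reaches application code at all, which is precisely the half of the job Joe's host can automate away from him.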

Let's review the basic requirements for a personal cloud OS - in case you care to build one. I see only three:

First, that motherfucker needs to be simple. If there are more than 10,000 lines of code anywhere in your solution, or the compressed source distribution exceeds 50K, Gall's law says you lose. Various kinds of virtual Lisp machines, for instance, can easily hit this objective. But, if it's Lisp, it had better be a simple Lisp.
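For scale, here is the skeleton of such a machine - a toy Lisp-flavored evaluator in a couple dozen lines of Python, semantically isolated in the sense that it can compute but has no way to call the host OS:

```python
# Sketch: the skeleton of a tiny Lisp-flavored virtual machine.
# It can compute, but it cannot call Unix -- no os, no sys, no I/O.
def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Recursive-descent reader: '(' opens a list, atoms are ints/symbols.
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # consume ")"
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok  # bare symbol

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(expr):
    if isinstance(expr, int):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7
```

A real system needs environments, closures, and storage, but nothing about the approach forces it past Gall's ceiling.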

What is a simple cloud computer, when introduced, version 1.0? Can it be a personal cloud computer? It cannot. The Apple II cannot exist without the Altair. With 10,000 lines of code or less, you cannot compete with Ruby on Rails for hosting the newest, greatest Twitter ripoff, just as the Altair cannot compete with the VAX - at the job of being a VAX. But the VAX also makes a pretty crappy Altair.

If history repeats itself, the 2012 ancestor of the 2020 personal cloud computer is neither the 2012 cloud information appliance, nor the 2012 industrial cloud computer. If it exists at all, it can only exist as a hobby computer - like the Altair.

A hobby computer doesn't try to serve the needs of any existing user base. It is its own need. It exists to be played with. As it is played with, it will naturally mature and learn to serve needs. But at first, it is much more important to remain simple, than to solve any specific problem.

Second, your virtual OS needs to be semantically isolated from the host OS. Anything that can call Unix, is Unix. That's why the Javascript/browser ecology, for all its stank, succeeds: it can't call Unix. It could invent its own compatibility disasters, but at least it didn't import Posix's. If Netscape had cut a hole into Unix, it would have died without a trace - as perhaps it deserved.

The natural consequence of this restriction is that Joe's virtual computer is, or at least should be, portable across hosts. This is a delightful service which can of course be implemented by assware with yet another layer of complexity, but should emerge naturally from any really simple system.

Third, your virtual computer needs to be a computer, ie, capable of arbitrary general-purpose Turingy goodness. It can compute, it can store data, it can communicate with other computers - it can even talk to the old legacy Internet, albeit via a gateway. Think of any Web app you use. If Joe's computer can't implement this app, at least logically, it is in some sense not a computer. For example, can it virtualize itself? If not...

So my view is: not only is personal cloud computing solvable, but it's simple by definition. So it can't even be hard. Some nigga should just do it. He's got eight years.

61 Comments:

The main barrier to this is deciding what an operating system should do, and how it's different from the tools of Web 2.0: mail, document tools, photo hosting, etc. What's the value of moving photos from Facebook where your friends will get updated on them automatically to some other cloud server, for the benefit of it being "your" server?

There are web desktops that pretend to be an OS; these don't live up to your standards, but even these are useless to Joe because they aren't Facebook. There is the Diaspora project, which would have allowed people running a 3.0 "cloud OS" to be linked into 2.0 lines of communication; it's also dead, because it's not Facebook.

Not a geek, so let me say that I hope you stick to writing and refrain from taking up the challenge yourself. Struggling through this, I begin to realize I am dealing with vast unknown unknowns, but I still have a real hard time understanding how this development will change the world for non-neckbeards, as opposed to providing them with better functioning toys. The relationship presented between the powers that be in technology, the masses, and the dissidents is a pretty nearly perfect analog for the current political situation though, even down to the proposed solution. You don't like Facebook? You must be an enemy of the people.

This post reminded me of this (in my opinion, important) article by Peter Norvig, especially since he's taken a moment to slag on Siri and Wolfram Alpha. Sure, it's possible that Siri will occasionally misunderstand her interlocutor, but Joe misunderstands things from time to time too. The people who create these things have created new database and computation infrastructure to support these online learning engines, and if our favorite search engine is insane, it has become so only by minutely observing all of us as we work.

Is it ironic that our host is staking out a slightly Chomskyite position on this topic?

I feel about tech posts how the proles feel about poetry, as if they are open threads. But they do create an urge to learn some things that will be useful if UR is wrong, and we manage to limp on for a few more generations.

I don't know much about computers and tech so this is just a layman's perspective.

I don't think we'll see this personal computer in the cloud thing until or unless we see general personal computing invade all kinds of ordinary every day physical objects.

Right now average people tend to associate "their computer" with the actual physical laptop they use most of the time. They associate it with a single piece of hardware. For them to make the mental leap it has to invade other physical objects they interface with regularly. Smartphones are a start since as you say they're half information appliance and half personal computing devices.

The question is whether things will go the Apple route, that is having proprietary information devices or will they go the mencius route with open devices. The cell phone situation would seem to tell us that proprietary is the way things are going.

I wonder if mm's current coding work is in this vein. One would think so.

Ok, I read all that, and I still have no idea what distinguishes a cloud computer from a computer, or (more importantly) why that distinction is something I should care about.

The innovation is moving applications and data out of the cloud, the data centre, the 1970s machine room... and onto powerful, cheap and portable computers. The reverse is not an innovation, it is a regression, a benefit for service providers rather than users. What does "the cloud" offer, that we cannot do at lower cost and greater convenience on our own machines?

I say this here because of all the places where such things might be discussed, this is perhaps the one place where a bullshit-free reply might be forthcoming. The Slashtards are too deep in the specifics of the technology to give a straight answer, while the tech journalist crowd know nothing other than buzzwords. So what? And why.

Furthermore I'm deeply suspicious of anyone suggesting major replacements of legacy technology. Too much wishful thinking there. It's a common refrain amongst CS grad students who always seem to imagine that they can do things better... and invariably have dreams far in excess of their capabilities.

A cloud computer is basically a server. There's also the promise of backups on other servers. It's not great for data security unless there's some clever kind of encryption involved.

By the way, I mentioned Diaspora before as dead, but they just sent me an e-mail begging for money. The world already gave them $200,000. How the fuck did they blow through that? It's a lot of fucking pizza. I'm glad I bowed out of that loser when I did.

Mencius is advocating a software "Year Zero," when the Brendan Eichs of the world (and I number myself among them) are f**ed, broken, and driven across the land (as HST once put it).

I happen to like the world's software stack as it stands. It's like a growing crystal; the deeper layers become indistinct and difficult to see beneath the activity on the surface.

I think it's late in the day to still be bitching about the x86 instruction set. The folks at Intel have basically routed around it in the hardware. I'm old enough to have wished that the 68000 had won, since it was marginally less braindamaged, but so what? That was a long time ago.

Get rid of Unix?! Debian has 300 million lines of code. You're talking about $2 billion development cost.

And I don't care how pure and functional your Lisp OS is. Most of that code is low-level systems code that has to be written in raw C.

Has anyone anywhere ever developed a good general-purpose non-Unix OS? Let's see: we have Windows, OS/2, Amiga, OS9. All shit, everywhere.

Reproducing pthread alone on another platform and trying to get it anywhere near the performance of Linux kernel 2.6 would take years. And trust me, you're going to want a damn good pthread library with the thousand core processors dominating in the 2020s.

As for the network stack: the idea is good, but why not call a spade a spade? You're essentially just talking about IPSec/VPN.

I think your diagnosis and analysis is pretty damn good and insightful. But you're like a cardiologist who just expertly diagnosed me as needing a quadruple bypass... then making the suggestion that we just cut out the heart entirely and replace it with a carburetor that you picked up at the junkyard.

> then making the suggestion that we just cut out the heart entirely and replace it with a carburetor that you picked up at the junkyard.

If you'd seen and understood the approach he's taking, I don't think you'd call it that. It's more like replacing your heart with an Iron Man power supply. Maybe Iron Man power supplies are too hard for one man to build, and maybe he'll fail, but he's not exactly building a carburetor in there.

Heh. I founded the unix-haters mailing list way back in prehistoric times, when I was forced onto unix after previously using Lisp Machines (an OS written in an actual high-level language).

But I've since mostly given up my desire for beautiful, elegant computational infrastructure, because it doesn't win, as Richard Gabriel articulated in an essay I hope everyone has read.

So I (a nominal left-winger) feel like I'm actually conservative in that I have a respect for the actually existing natural order of things, while our host, ostensibly on the right, actually seems to be a radical in the mold of Le Corbusier, wanting to tear down the world and rebuild it according to first principles (this applies both to computing and politics).

> Heh. I founded the unix-haters mailing list way back in prehistoric times, when I was forced onto unix after previously using Lisp Machines (an OS written in an actual high-level language).

Out of curiosity, were you also among those who wrote the Handbook?

You are the fifth or sixth person whom I have heard confess to having once used Lisp Machines, then pine for the old days and what was lost, and finally make peace with Unix/Wintel decay.

The alternative to thinking that the decay served some higher good is, of course, to regard nearly the whole of your profession as evil charlatans and deluded Stockholm Syndrome sufferers. That way lies madness. But madness has its up-sides!

> ...I've since mostly given up my desire for beautiful, elegant computational infrastructure, because it doesn't win, as Richard Gabriel articulated in an essay I hope everyone has read.

"It doesn't win." If the proverbial cockroaches left standing after World War Three could talk, that is what they would say about mankind and all of its works. If you value man over the cockroach, or for that matter the roach over the bacterium, a Bach symphony over a dinosaur fart, you will be on the side of the beautiful and elegant always and no matter what. Whether or not it tends to "win" in some imaginary "natural" order of things. The beautiful and elegant can beat that which is ugly and good-specifically-at-exterminating-opposition, but only if we help it.

> our host, ostensibly on the right, actually seems to be a radical in the mold of Le Corbusier, wanting to tear down the world and rebuild it according to first principles

Le Corbusier was evil because he wanted to re-build the world from scratch while (militantly!) having no taste, rather than because he wanted to re-build the world from scratch. Understanding this fact is almost the definition of having taste.

DR: Reproducing pthread alone on another platform and trying to get it anywhere near the performance of Linux kernel 2.6 would take years. And trust me, you're going to want a damn good pthread library with the thousand core processors dominating in the 2020s.

Will everyone with a virtual private cloud computer really be allocated thousands of cores? Will they even need or want them? Or will the processing hosts run a different OS, optimised for their own task, and have thousands of cores available to them - but make available to each of the myriad virtual private cloud computers they are hosting perhaps only one, or a small number like two or four maybe, virtual processors? Since the device will mostly be coordinating resources at the user's choice of service providers and client terminal devices, will they need as much processing power even as is used up today?

I don't think MM is saying the whole world will change every computer to this same OS, just the commoditized personal cloud computers. They don't need to be "general purpose" in the way you meant, because their required functions are limited to those defined by the requirements of the small footprint private cloud OS. That OS, it seems to me, only needs to fulfil a limited set of functionality.

Since the user doesn't directly interact with the "hardware" of this virtual private cloud computer, there can be one reference hardware platform specification, and it will be adhered to for all such machines. The OS code will not have to deal with potential wild divergences in its environment - it will only ever run on a very simple and standardised set of "hardware", most likely virtualised.

There will be, for example, no massive variety of video or audio hardware to cope with gracefully - because the machine will have no video or audio hardware in it if it has no monitor or speakers connected directly to it. The video and audio will be presented using standard protocols over the Internets to the client terminal device. The client terminal device will render it in whatever way it sees best, which will meet the expectations of the end user (or the end user will not accept it and will choose instead another client terminal device which does what they expect).

Think webserver, HTML, and a browser. Apache isn't big, but that's why it's so clever.
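The webserver/HTML/browser analogy can be made concrete with a toy sketch (everything here is a hypothetical illustration of the idea, not anyone's actual protocol): the cloud side emits an abstract, renderer-agnostic description of the screen, and each terminal device renders that description however suits its form factor.

```python
# Illustrative sketch only: the cloud computer describes the UI abstractly;
# the terminal device decides how to render it. All names are hypothetical.

def describe_screen(user_name, unread):
    """The cloud side: produce a renderer-agnostic description, not pixels."""
    return {
        "title": "Inbox",
        "widgets": [
            {"kind": "text", "value": f"Hello, {user_name}"},
            {"kind": "badge", "value": f"{unread} unread"},
        ],
    }

def render_as_text(screen):
    """One possible terminal: a plain-text renderer. A phone or a browser
    could render the very same description in a completely different way."""
    lines = [f"== {screen['title']} =="]
    for w in screen["widgets"]:
        lines.append(f"[{w['kind']}] {w['value']}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(render_as_text(describe_screen("alice", 3)))
```

The point of the separation is the same as with HTML: the server never needs to know what video hardware, if any, sits on the other end.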

I think MM is getting at not a recentralisation of all computer processing, back to the days of the mainframe and dumb terminals, but instead a massive decentralisation of processing but with a centralised simple personal orchestrating system (your personal cloud computer). So your data will be processed in your chosen apps, which service providers will offer and perform the processing on your supplied data for you and hand back the results for you to store (all via the orchestration at your personal cloud computer). Your choice of data stores will similarly be provided by other service providers: so your data will be outside of the personal cloud computer (perhaps spread around a number of providers, perhaps using some as backup for others, perhaps the same data stored redundantly across more than one provider - I don't know where this could go, I'm just spitballing), but all controlled by you through it.

The private cloud computer OS itself will be extremely simple, providing just enough functionality to do basic things like:

interface with other Internet systems using standardised protocols

communicate with the standardised browser terminal so that it can render all necessary forms of user interaction via standard protocols

communicate with the user's choice of apps, supplying and receiving back copies of their personal data and dealing with updating storage

probably the app management layer will be able to deal with dependencies, so that apps can build on the nested functionality of other apps, ensuring that all required apps are automatically loaded, without user intervention

orchestrate the interaction of all the above facilities across the Internet on behalf of the user, to join up all their chosen resources from their connecting terminal of choice, their choice of data storage facilities, their choice of plug-in apps that they like to use, their choice of plug-in peripherals that they like to use, etc
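The dependency handling suggested above is, at bottom, a topological sort. Here's a minimal sketch, assuming apps declare their requirements by name (the names and the `deps` mapping are hypothetical illustration, not any real app manager):

```python
# Hypothetical sketch of the app-management layer's dependency handling:
# given declared dependencies, load every required app exactly once, in an
# order where dependencies come before dependents (a topological sort).

def load_order(app, deps):
    """Return a load order for `app` and everything it depends on.
    `deps` maps app name -> list of app names it requires."""
    order, seen, in_progress = [], set(), set()

    def visit(name):
        if name in seen:
            return
        if name in in_progress:
            raise ValueError(f"circular dependency at {name!r}")
        in_progress.add(name)
        for d in deps.get(name, []):
            visit(d)
        in_progress.discard(name)
        seen.add(name)
        order.append(name)

    visit(app)
    return order

# A mail app that builds on contacts and storage apps:
deps = {"mail": ["contacts", "storage"], "contacts": ["storage"]}
print(load_order("mail", deps))  # → ['storage', 'contacts', 'mail']
```

Keeping this logic in the app layer, rather than the core OS, is exactly the "strip everything out" principle: the kernel never needs to know what a dependency is.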

Simple is best. Strip everything out that isn't just the raw core processing and communications interfaces, and that is all the private cloud computer OS will need to do. The simple hardware reference platform to run it on will then be easily defined and there will be no reason to deviate from it.

If you keep as much as possible out of the core OS and put it into plug-in apps, you can rip out and replace almost all functionality in the future without having to tear the whole thing up and start over. Again.

But anyway, what about that simple physical private gold cloud computer?

"All current empire-building entities will have to deal with their failure, using their existing economic structures, to comprehend the importance of COMPUTING RESOURCES with respect to Economics. This failure will lead to significant miscalculation by all potential players, and also provides an opportunity for new players who understand the game to either dominate existing players, or to enter the game."

mtraven, it shouldn't be that surprising. Mencius takes after Rothbard, acolyte of Mises, and neither was really a conservative in any way. They were rationalists (in contrast to Hayek) who simply wound up camped on the right. His denunciation of banking practices that go back centuries (the only example given by Rothbardians of an historically sound bank was a failed public one) is another example of rationalist bent.

Seasteading can appeal to both rationalists and pluralists/empiricists. If rationalist schemes are really all they're cracked up to be, their seasteads will prosper and be imitated. If not, their failure will have limited effects and the rest can continue with their trial and error.

Since this thread has slowed down, I'll continue posting about subjects in the previous two comment sections. Anyone who wants to continue discussing computers is free to do so, unless our Forum Duce rules otherwise.

That accomplished, more on how Metternich got Napoleon to agree to hand the future of French Sovereign shares to the House of Habsburg via Napoleon's half-Habsburg son, Napoleon II:

http://en.wikipedia.org/wiki/Klemens_von_Metternich

One of Metternich's first tasks was to push for the marriage of Napoleon to Archduchess Marie Louise at a time when Napoleon was also asking after the Tsar's youngest sister Anna Pavlovna. Metternich would later seek to distance himself from the marriage by claiming it was Napoleon's own idea, but this is improbable: by the date that Metternich claimed Napoleon made his feelings known (21 January 1810) the wedding project had already been discussed widely within the French court. In any case Metternich was happy to claim responsibility for the marriage at the time.[24] By 7 February Napoleon had agreed and the pair, still estranged, were married by proxy on 11 March. Marie Louise left for France soon after and Metternich followed, albeit by a deliberately different route. He agreed to French demands that his visit would be an unofficial one, allowing Metternich to transport his family home and to report back to the Austrian Emperor how Marie Louise carried herself upon her arrival in France.[24]

Instead, Metternich stayed six months, entrusting his office in Vienna to his own father. He set about using the marriage, combined with flattery, to renegotiate the terms set out at Schönbrunn. The concessions he won were ultimately trivial, however: a few trading rights, a delay in paying the war indemnity, restitution of some estates belonging to Germans in the Austrian service including the Metternich family's, and the lifting of a 150,000 man limit imposed by the treaty on the Austrian army.[nb 5] Vienna rejected an additional commerce agreement as too pro-French and the French rejected his attempts to mediate for them, despite what seemed an increasing friendship between Metternich and Napoleon.

He was also keen to take advantage of any opportunities to regain Austria's influence in Europe, proposing general peace talks headed by Austria whom he saw as uniquely placed, given her continuing strong ties with all sides in the war. Indeed, throughout 1813 the Austrian foreign minister attempted desperately to get the French Emperor to agree to a peace that would secure a place in Europe for the combined Bonaparte-Habsburg dynasty.[27] This grew out of a deep concern that, if Napoleon were conclusively defeated in battle, Russia and Prussia stood to gain too much.[28] Napoleon rejected all his proposals, however, and the fighting (now officially the War of the Sixth Coalition) continued.

The French, who are gifted with much imagination, think they can understand the Revolution because they have endured it. This is just as if a woman who has had several children should say she perfectly understands confinements. Both forget that there are two entirely different things - the fact of enduring and the art of assisting. There was but one single man in France who understood how to master the Revolution, and that man was Bonaparte. The King's Government inherited from him, not the Revolution, but the counter-Revolution, and they have not known how to make use of this inheritance. I judge of the Revolution more truly than most men who have been in the midst of it. It is with me as with those who watch a battle from very high ground. It is only from thence that everything is seen: in the midst of the fray the eye cannot reach beyond a given circle, and that circle is always small. From the mistakes which the French Government have already made in Spain, no one can say what the end will be: if it turns out well (which is possible), then it will be the good bursting forth and triumphing of itself over everything in spite of both friends and foes. This is my view, and experience will confirm it. France is to-day like a vessel on a stormy sea guided by inexperienced pilots.

I expect to leave Vienna on September 16, stay four or five days at my house in the country, go to Czernowitz on October 3, and return to Vienna about October 25 or 26.


647. May 22.-Spanish affairs go on as they must go now that they have been taken in hand. What a miserable Power is that which is founded on error, is only supported by lies, and has no strength but the weakness of its opponents. This is a portrait of Liberalism. No sooner are its pretensions examined than they are seen to be without foundation; and when its resources are investigated nothing is forthcoming. And yet there are people who claim to be intelligent who hold by Liberal theories and glory in their results.

That which hinders so many persons from obeying truth, from giving themselves up to it entirely, is the utter want of all tinsel peculiar to it. It is the destiny of truth to be developed with ever-increasing power; we grasp it in its early immaturity, and when the day comes that it shines forth in all its innate splendour it makes its way without our help, and all merit seems to belong to it alone. Those who have nourished it in its early beginning, and have watched over its progress to perfection, are quickly wiped out of the memory of men. This is not a result flattering to vanity, and they are few who devote themselves to that which confers so little on their love of self. This is my confession of faith and my judgement on myself.

641. March 5.-I am busy about a very anxious work. Paris now presents a most peculiar spectacle. I know the ground in Paris very well, and my knowledge of the city in the time of strength enables me to judge of its position in its present time of weakness. In this country everything is unexpected; even what seems reasonable is only so outwardly, not really: commotion is here the consequence of excited passions, and of all these not one springs from true feeling. Never since there was such a thing as business in the world was an affair handled as it is at this moment in France. It really looks as if people in this country were trying to refine upon suicide. They drive forward, but at the same time bring the car so close to the precipice that it must inevitably turn over.

> Le Corbusier was evil because he wanted to re-build the world from scratch while (militantly!) having no taste, rather than because he wanted to re-build the world from scratch. Understanding this fact is almost the definition of having taste.

rofl, because nerds like Moldbug apparently have taste.

Last time I looked on /. or hackernews the nerds over there believed the height of culture is my little pony, star trek, and wearing running shoes with jeans.


What a schmuck. Especially since he drags Sailer, a genuinely good human being, into his belief in genetic determinism, despite Sailer's own thoughts on the topic.

Also, love the weaselly "an argument can be made" formulation, and the concern trolling for these individuals' "genetic interests".

All these traumas that the "offspring" go through just proves the shit was a bad idea in the first place, and they should probably kill themselves. Couldn't just be psychological due to the treatment of race in the modern world. What are you, a flat Earther? Genetics explain EVERYTHING, man. And the pendulum swings...

It seems to me that these people are making the same mistake that race-is-a-construct people make: insisting that evolution just up and stopped at some point (they pick a different point, though). I can imagine a few dystopian futures where our descendants would be very happy to have some African genes.

Also, never to go unconsidered: what is the "real meaning" of this? Probably a pathetic attempt to increase the relative sexual market value of some beta. Did I phrase that right?

A new problem for post-democracy has occurred to me: Who will the new political class be?

If you ask a democratic politician why they have the right to rule, they'll say it's because they were elected by the people. If you ask a hereditary monarch why they have the right to rule, you'll get divine right or some other such answer. If you ask a revolutionary committee why they have the right to rule, the answer will again be that they represent the people.

There's a great variety of political views apparent among the commentariat here, but I think the dominant one is right-libertarian. Democracy is detestable because it leads to communism and the confiscation of the individual's wealth by the collective.

If that's what you believe, then you want a society where the 1% get to keep their money. OK. But that still doesn't answer the question, who gets to rule, and why? If you're a democratic libertarian, then what you want is a minimal state, and you want to be "ruled" (albeit in a hands-off fashion) by libertarian politicians.

But if you're rejecting democracy, that's not how things will work. There will be a new, non-democratic ruling class. It may be hands-off or it may be tyrannical, but either way, its legitimacy will not come from having been voted in - that defines the thought-experiment we're conducting here. So who are the new ruling class to be, and on what grounds will they legitimate their rule?

You may want the 1% to be free to keep their money (unless they got it from the banker-socialism of late-Keynesian political economy) - but do you want to be ruled by them? Do you want them to be, not just the propertied class, but the ruling class? This is a very basic issue. And whether you think that some CEOs are fit to be statesmen too, or that ultimate sovereign power should reside in individuals drawn from some other class, the further question will remain: how will they justify their rule, to themselves and to others, in the absence of the democratic criterion of legitimacy? What will be the intellectual rationale of the new, post-democratic state of affairs?

I'm interested in what other people have to say on this, because I probably represent the minority view around here. But here's what I mean.

I take it as a given that this guy is godless. So all his ideas of what is right and proper come down to (personal) aesthetics. He has no standing to tell any liberated females they shouldn't behave as they choose, whether that means sleeping around or choosing to settle down with someone of another race. Yet he feels entitled to criticize because people of his race that have children with someone of another race are diminishing the presence of his genes in the population (as compared to choosing a mate of the same race). And this makes him sad.

As a godless man of science, he really needs to grow up and take the long view. If it's maladaptive, so be it. Darwin is his god, and he rules with an iron fist. If the miscegenators really are doing something maladaptive, it will be sorted out. Whether it is maladaptive is far from proven. Statistics on psychological disorders at the population level? Give me a break.

This genetic interest talk really amuses me. Raising these concerns to the level of moral imperative is perfectly dehumanizing. "...multiracial families often do not possess the harmony, cooperation and purposefulness of same-race families, because mixed-race families lack the focus of genetic investment and returns that same-race families possess." Awesome. As I said, schmuck.

"I don't think they'd deny that evolution is occurring as more polygynous groups invade less polygynous groups across tribal, ethnic, national, racial, etc. lines."Is Darwin making a mistake? Are those polygynous groups devils? Come out of the closet, you crypto-Puritan. If your line goes extinct, it deserved to.

While there are certainly problems with an absence of accounting for race (i.e. IQ studies, athletic ability, etc.) "miscegenation," to use an old and often ugly term, is a net good for humanity as "hybrid vigor" is a well-known genetic fact.

In light of the tragic termination of former Libyan CEO Gadhaffi by Anglo-Protestant forces, I pray Moldbug is preparing an appropriate piece of historical revisionism and crackpot political theorizing thinly disguised as a "eulogy" for the great man.

Indeed, Moldbug should dedicate more blog space to the non-Hitler/non-Stalin/non-FDR dictators of the 20th century.

There's no reason why Hitler - as memorable and explosive as his well videotaped rule of Germany was - should get all the attention from 20th century revisionist historians.

Let's give some love out to those "caretaker-Hitlers" of the 20th century; let's praise the names of the Kim Jong Ils, the Pinochets, the Ataturks, the Mubaraks, the Saddams (whose reign I must say is looking better and better with each passing roadside IED), the Castros, and, yes, the Gadhaffis who did so much to withstand the tide of democratic rebellion and bring order to the galaxy.

> He has no standing to tell any liberated females they shouldn't behave as they choose, whether that means sleeping around or choosing to settle down with someone of another race.

No one wants to even consider what the counterpart to female liberation might be. But consider: A female's godhood is exercised when she chooses which genes will pass through her to the next generation. A male's godhood is exercised when he chooses which other male he will meet in a natural duel to prevent his genes from passing into the next generation -- or die trying.

Under natural law the ultimate power -- the power that shapes the future -- of female individual sovereignty is the choice of which genes make it into the next generation and that power is exercised through birth.

Under natural law the ultimate power of male individual sovereignty is the choice of that which is to be killed in single combat.

In nature, males and females have two respective powers: To destroy and to preserve. People think that civilization is founded on control of destruction and seem to forget that civilization also depends on controlling female power to preserve. With the return to females of choice, hence their power, something equivalent must be done for males, such as enforcing natural duels to the death (natural meaning just putting the two disputants out in the wilderness with one to return).

> Raising these concerns to the level of moral imperative is perfectly dehumanizing.

What's really dehumanizing is preventing population groups - tribes, ethnies, nations, races, etc. - from practicing free association and excluding others from their territories. Uniform human ecologies reduce human genetic variation. They literally dehumanize by reducing the various representations and manifestations of humanity. They reduce the amount of human information in the world.

> Is Darwin making a mistake? Are those polygynous groups devils? Come out of the closet, you crypto-Puritan. If your line goes extinct, it deserved to.

I'm not sure what your point is here.

Also there are many ways evolution can happen. So an invading more polygynous group might get wiped out in a war or through bioweapons or something.

> While there are certainly problems with an absence of accounting for race (i.e. IQ studies, athletic ability, etc.) "miscegenation," to use an old and often ugly term, is a net good for humanity as "hybrid vigor" is a well-known genetic fact.

I'm not sure what exactly a "net good for humanity" is, but there can also be outbreeding depression:

Sorry to interrupt the sociobiological seminar, but I'd like to repeat my question from a few days ago: What will be the intellectual rationale of the new, post-democratic state of affairs? In particular, what will be the definition of legitimate rule?

I wonder if there are any actual opponents of democracy here. I am not an anti-democrat myself; the function of this blog was simply to open my mind to the logic of nondemocratic political cultures. But this blog does exist to promote, in the long run, the overthrow of democracy, first in the minds of a few people who can make a difference, and then in the real world. Are there any genuine anti-democrats out there? If so, I'd love to hear how you think political legitimacy could, would, or should work in the post-democratic world.

My guess: this is the latest incarnation of "Rienzi", lover of all things Salterian.

>What will be the intellectual rationale of the new, post-democratic state of affairs? In particular, what will be the definition of legitimate rule?

I find Moldbug persuasive on democracy. On the other hand I'm not sold on neocameralism, because I'm not convinced that a king who optimises economic profit would turn out to be as benevolent as Moldbug implies.

I think that this Moldbug post goes some way to answering your question (NB: it presumes that the post-democratic government will be specifically neocameralist).

A prerequisite is that a large number of people are turned into reactionaries - the Antiversity is supposed to achieve that. Given that a mass of support for a reactionary transition from "Plaingov" to "Plaincorp" exists, a "Trust" should be brought into existence to manage the transition responsibly.

Since we are supposing that the public in general is accepting of the reactionary way of doing things at this future point in time, Moldbug suggests that (for example) professional pilots could be selected as a suitably trustworthy and competent (if arbitrary) group of people to constitute the Trust.

The Trust is then responsible for electing a Board, which selects a Receiver (i.e. a king).

So legitimate rule is rule of this Receiver, who was chosen by a Board, which was elected by a Trust, which was handed sovereignty via the consent of a majority or large element of the populace which had been converted by the Antiversity.

Once neocameralism has been established, the Receiver has no need to care about the opinions of the populace because he has a cryptographic command chain at his disposal - therefore under neocameralism, "legitimacy" becomes largely a moot point.
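The "cryptographic command chain" idea is that weapons and locks obey only authenticated orders, so the sovereign's control does not depend on the loyalty or opinion of intermediaries. A minimal sketch of the verification step, with symmetric HMAC standing in for the public-key signatures (e.g. Ed25519) a real design would use so the example needs only the standard library; all names and commands here are illustrative, not from Moldbug:

```python
import hmac, hashlib

ROOT_KEY = b"sovereign-root-key"  # held by the Receiver alone (illustrative)

def sign_command(key: bytes, command: bytes) -> bytes:
    """Authenticate a command with the holder's key."""
    return hmac.new(key, command, hashlib.sha256).digest()

def device_accepts(key: bytes, command: bytes, tag: bytes) -> bool:
    """A weapon or lock executes the command only if the tag verifies."""
    return hmac.compare_digest(sign_command(key, command), tag)

order = b"unlock armory door 7"
tag = sign_command(ROOT_KEY, order)

print(device_accepts(ROOT_KEY, order, tag))                    # True: valid order
print(device_accepts(ROOT_KEY, b"unlock armory door 8", tag))  # False: forged order
```

A rebellious officer who does not hold the key can neither issue nor alter orders, which is the sense in which popular "legitimacy" becomes moot.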

As for the legitimacy of other possible post-democratic governments, I suppose that differs depending on the design of that government.

>A prerequisite is that a large number of people are turned into reactionaries - the Antiversity is supposed to achieve that. Given that a mass of support for a reactionary transition from "Plaingov" to "Plaincorp" exists, a "Trust" should be brought into existence to manage the transition responsibly.

If that's what you believe, then you want a society where the 1% get to keep their money. OK. But that still doesn't answer the question.

It does answer the question. You want a society that prevents masculine defense of natural property rights against confiscation by mercantile forces under artificial property rights. So if a landlord from the 1% evicts you from your homestead, which would effectively kill your children, you have no recourse but to go and let your children die, since any "legal" remedy would be under the authority of the 1%. Legitimacy is determined by whoever or whatever upholds artificial property rights.

Vladimir - the point of "cloud computing" is to have all your data available to you wherever you are - at your home desktop, at work, in a hotel room on a different continent, using your phone on a dark desert highway, etc.

I suspect that unless bandwidth costs drop much faster than memory, it might end up being more convenient to just put everything on a flash drive, and plug it into whatever computer you're using at the moment.
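A back-of-envelope comparison makes the point. All figures below are illustrative assumptions, not from the source: a 256 GB drive, a 20 Mbit/s home uplink, and a half hour of "travel time" for a drive that rides in your pocket anyway:

```python
# Sneakernet vs. network: time to move DATA_GB gigabytes.
DATA_GB = 256        # assumed flash drive capacity
UPLINK_MBPS = 20     # assumed upload speed, megabits per second
CARRY_HOURS = 0.5    # assumed: the drive travels with you regardless

# GB -> megabits (x8 bits, x1000 MB), then divide by link speed and 3600 s/h
transfer_hours = (DATA_GB * 8 * 1000) / UPLINK_MBPS / 3600
print(f"network: {transfer_hours:.1f} h, pocket: {CARRY_HOURS} h")
# -> network: 28.4 h, pocket: 0.5 h
```

Under these assumptions the pocket wins by almost two orders of magnitude, which is why the flash-drive scenario stays plausible unless bandwidth grows much faster than storage.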

Well, Mitchell, assuming you ever get back to see who has taken you up on your question, I am going to address it here.

I am in the extremely uncomfortable position of having lived the past thirty years in a country (France) which pays obsequious lip service to the republic, while maintaining the basic structures and ATTITUDES of the previous monarchy. Along Tocqueville's dividing lines, I grew up in a country (the U.S.) where the assumption was that we all had equal rights, and the monarchy did NOT color underlying attitudes.

I have no answer to your question about founding legitimacy elsewhere, in our time. Shakespeare examined your question in the Tetralogy, culminating in Henry V. Legitimacy is acquired over time, in the best of cases. It is a matter of winning hearts, minds, and FAITH. All matters that are extremely problematic at this time of incredible cynicism, if not outright despair.

I CAN say that I believe that democracy wreaks havoc on the planet. That it goes hand in hand with the consumer society: MORE stuff, for MORE people, all the time, stretching out into infinity. With the best of intentions, of course. Here in the West, we are great crusaders. We go from one crusade to the next.

As an ongoing ideology, egalitarian democracy CONSTANTLY attacks privilege in any and all forms, whether that be... the privilege of holding land, the privilege of having an education. As soon as a value is perceived as "privilege", it is attacked and undermined. To be equal... we must all be... THE SAME. These days, that is.

In all fairness, it is the way of the world for ANY ideal to become corrupted. Perhaps from the outset? Perhaps only over time? The early Christian church was an amazingly democratic experience... COMMUNIST, even, that's how democratic it was... (nobody was coerced into sharing property, if I understand correctly, they voluntarily submitted to it) The Church fathers themselves recognized that "the corruption of the best engenders the worst".

Before there can be ANY form of legitimacy, Mitchell, first there must be... FAITH. Faith founds legitimacy. Along with trust.

Just out of curiosity... how many of y'all have READ Darwin's "The Origin of Species"? I haven't. But I know, thanks to Jacques Barzun, that Darwin's theory did NOT originate with him, but with others. I think that we assume we know too much about what Darwin said, and we also assume that what Darwin said was the GOSPEL TRUTH. Many of us, that is. (I am very circumspect about letting the word "truth" pass my lips...) A clear case of... BLIND FAITH...

I sometimes think that the monarchy is not a bad choice of government. That the "reasons" why the monarchy failed in France had little to do with the nature of monarchy itself, and everything to do with THE PEOPLE's perception that they were being treated with a lack of respect, talked down to, etc. etc. A bad harvest, people starving, bread too expensive, general impatience, tempers running high, and PFFT. Revolution. (No need to talk about wolves here, even if we are all animals, we are NOT wolves...)

The consummate irony would be if our SO CALLED democratic governments bit the dust FOR EXACTLY THE SAME REASONS?