What Could Have Been?

[This is a blog post about MyPrivatePla.net written by the original owner of this domain, Shannon Larratt, taken from his blog ZENTASTIC.]

A few years ago a friend and I started work on a new community site,
something that could grow an IAM-type site to much larger, Facebook-scale
levels (generating and running not just the social network but also
many media projects from within its communities), while addressing many
of the shortcomings that we saw in community software. It was what we’d
hoped to host on myprivatepla.net (dead site; don’t bother going there) —
although of course with Jason’s passing this project was shelved.

The two major shortcomings that we saw in existing engines, coming
from our experiences running community sites (me with BME’s IAM, and
Jason with his own IAM2 software) were scalability and trust. For the
latter, we asked ourselves how a userbase could trust a website if they
didn’t trust the people running it — for example, the cases where
Facebook admins have poked around inside the private messages of
celebrities or otherwise abused the privacy of the site’s members, to
say nothing of government warrants violating privacy. Jason and I solved
this by encrypting absolutely everything possible — but doing it
client-side. That is, all of the encryption takes place in the browser,
so by the time it makes it to the server it’s completely encrypted. If
you’re sending a message from one person to another, only the people
involved are able to see it in its true form.
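
To make that concrete, here's a minimal sketch of the approach in today's terms, using the browser's Web Crypto API (which postdates our project, so consider this a stand-in rather than what we actually wrote). Each user holds an ECDH key pair whose private half never leaves the browser:

```ts
// Minimal sketch of browser-side end-to-end encryption using the Web
// Crypto API. The server only ever sees ciphertext.

async function generateKeyPair(): Promise<CryptoKeyPair> {
  return crypto.subtle.generateKey(
    { name: "ECDH", namedCurve: "P-256" },
    false, // private key is non-extractable: it never leaves the browser
    ["deriveKey"]
  );
}

// Derive a shared AES-GCM key from my private key and the recipient's
// public key; both sides derive the same key independently.
async function sharedKey(
  myPrivate: CryptoKey,
  theirPublic: CryptoKey
): Promise<CryptoKey> {
  return crypto.subtle.deriveKey(
    { name: "ECDH", public: theirPublic },
    myPrivate,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"]
  );
}

// Encrypt a message client-side; the result is all the server stores.
async function encryptMessage(
  key: CryptoKey,
  plaintext: string
): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  return { iv, ciphertext };
}

async function decryptMessage(
  key: CryptoKey,
  iv: Uint8Array,
  ciphertext: ArrayBuffer
): Promise<string> {
  const plain = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  return new TextDecoder().decode(plain);
}
```

A production system would need authentication, key distribution, and forward secrecy on top of this (raw ECDH alone doesn't prove who you're talking to), but the core point stands: by the time anything reaches the server, it's ciphertext.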

Given how easy it was to achieve the encryption aspects, I’m frankly a
little surprised that nobody has implemented something like this — even
if no mainstream company wanted to support it, I don’t think it would
be difficult to write a browser plugin that added this functionality to
Facebook or any other social sites.

Scalability was the other concern. The original IAM software, using
ten-year-old hardware, was seriously optimized and could handle about
20,000 users on a single server (plus a fileserver) — and I’d wager with
today’s technology could easily handle well over 100,000 users per
server due to its optimization bias. However, it didn’t scale cleanly
past that, although there were drawing-board multi-server
implementations capable of handling far more users… Nonetheless, it was
far from cheap to host, and would have only become more expensive.
Facebook is said to be approaching a quarter million servers (or four
or five thousand users per server, not surprising given the inefficiency
of the code necessitated by their AJAX-heavy philosophy and extreme
"live" design), and judging by the desperate attempts they've made
recently to monetize, it's clearly a challenge to pay for it all. Even
if you do figure out how to monetize a site that large, it's often
difficult to stay above water for the first few years before you hit
critical mass. Since we were eyeballing a mainstream site, we
brainstormed solutions to avoid all of this, perhaps to bypass most
hosting costs altogether by changing the rules.

Since we’d built a design philosophy in which the servers didn’t have
to be trusted — public key cryptography both kept content private and
perhaps more importantly protected it from tampering, the idea evolved
to completely get rid of all or the majority of the servers by
offloading the servers to the client side as well… Since these days
people’s home computers are often always online, they would act as the
hosts, both hosting for the data (public and private), the bandwidth
(again, both between users and to the public), and “brains” of the site.
I’m sure there would be a way to achieve this via a massive browser
plugin, but the thought was to do it via a standalone app that people
could run.
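
Here's a toy sketch of what one of those home nodes might look like. Everything in it is hypothetical: the port, the plain-HTTP transport standing in for a real peer-to-peer protocol, and the assumption that blobs arrive already encrypted client-side, so the node itself needs no trust:

```ts
// Toy sketch of a "home server" node: a content-addressed blob store
// served over HTTP. A stand-in for the standalone app; blobs are assumed
// to be encrypted before they ever reach this machine.
import { createServer } from "node:http";
import { createHash } from "node:crypto";

const store = new Map<string, Buffer>(); // blob hash -> encrypted content

// Content-addressed storage: the blob's SHA-256 hash is its identifier,
// which also makes any tampering self-evident to readers.
function put(content: Buffer): string {
  const id = createHash("sha256").update(content).digest("hex");
  store.set(id, content);
  return id;
}

createServer((req, res) => {
  const id = (req.url ?? "/").slice(1); // e.g. GET /<hash>
  const blob = store.get(id);
  if (blob) {
    res.writeHead(200, { "content-type": "application/octet-stream" });
    res.end(blob);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

const id = put(Buffer.from("already-encrypted blob goes here"));
console.log(`serving http://localhost:8080/${id}`);
```

Hashes as identifiers also give you replication almost for free: any node that has fetched a blob can serve the identical bytes, and readers can verify them no matter whose machine answered.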

The home server end of things drew from a lot of the technologies
developed for the darknet, projects like Freenet and Tor. For example,
where relevant, onion routing with encrypted data was used so that when
data needed to be moved around, it wasn't clear who was moving it, to
whom it was going, or what it even was. To deal with the reality that
home computers, even if theoretically always on, rarely are, the home
servers acted in clusters: private content was echoed across the
computers of your friends, whose machines also acted as the servers for
your mobile devices. Public content replicated as it was viewed, both so
that the more people wanted to see something, the faster it would get
(like a torrent), and so that content was extremely secure against
censorship.
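
The onion part is easy to sketch: the sender encrypts the payload once per relay on the route, innermost layer first, so each relay can peel exactly one layer and learns only the next hop. A minimal version, reusing the AES-GCM approach from the messaging sketch above and assuming the sender already shares a key with each relay (the fixed 16-byte hop label is a made-up framing convention for this example):

```ts
// Onion-layered wrapping in the spirit of Tor: each relay sees only
// ciphertext going in and a smaller ciphertext coming out.

interface Relay {
  id: string;     // hypothetical relay identifier, up to 16 ASCII chars
  key: CryptoKey; // AES-GCM key shared between sender and this relay
}

async function seal(key: CryptoKey, data: Uint8Array): Promise<Uint8Array> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per layer
  const ct = new Uint8Array(
    await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, data)
  );
  const out = new Uint8Array(iv.length + ct.length);
  out.set(iv);
  out.set(ct, iv.length); // ship the nonce alongside the ciphertext
  return out;
}

// Build the onion from the exit relay inward; each layer names the next hop.
async function buildOnion(route: Relay[], payload: Uint8Array): Promise<Uint8Array> {
  let onion = payload;
  for (let i = route.length - 1; i >= 0; i--) {
    const nextHop = i + 1 < route.length ? route[i + 1].id : "DELIVER";
    const label = new TextEncoder().encode(nextHop.padEnd(16));
    const body = new Uint8Array(label.length + onion.length);
    body.set(label);
    body.set(onion, label.length);
    onion = await seal(route[i].key, body);
  }
  return onion;
}

// What each relay does on receipt: peel its one layer, read the hop
// label, and forward the rest, learning nothing about the inner layers.
async function peel(
  key: CryptoKey,
  onion: Uint8Array
): Promise<{ nextHop: string; rest: Uint8Array }> {
  const iv = onion.slice(0, 12);
  const plain = new Uint8Array(
    await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, onion.slice(12))
  );
  return {
    nextHop: new TextDecoder().decode(plain.slice(0, 16)).trim(),
    rest: plain.slice(16),
  };
}
```

No single machine in the chain can tell who is talking to whom about what, which is exactly the property we wanted from the routing layer.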

There was also a mountain of work done to define how the system
worked from an interface point of view, how personal sites and business
sites could be managed, how people could group together to publish
content like magazines and blogs, and so on, and some of that’s quite
clever (all the things that I wanted the next IAM to become; looking
back now, IAM, founded well over a decade ago, did lots of things that
were many years ahead of their time), but I think the encryption and
distributed nature are more important, and they're what I wanted to
mention here before I forget.

The thought behind all of this was to create a trusted social web
where every user that’s added, adds power to the system (rather than
draining it, as you’d get in a central-server architecture). None of the
ideas or technologies are fundamentally new, but I don’t think I’ve
ever seen them synthesized quite in this way. My own project is
completely dead in the water for a wide range of reasons, but I sure
would love to see someone implement it… Whoever does may well find themselves
being a big part of creating a better world.