"I started Facebook, I run it, and I'm responsible for what happens there," he told a Senate committee on Tuesday. Zuckerberg's tone was contrite, of course. He had traveled to Washington to explain the cascading data privacy scandals that have descended upon his company. But taking responsibility inherently involves detailing what you have power over. It's a sweeping and jarringly intimate list in this case.

"It's not enough to just connect people, we've got to make sure those connections are positive. It's not enough to just give people a voice, we've got to make sure people aren't using it to harm others or to spread misinformation," Zuckerberg continued. "Across the board, we have a responsibility to not just build tools, but to make sure they're used for good."

If that sounds less like a business and more like a ruler, well, you're not the only one who's noticed. "In a lot of ways Facebook is more like a government than a traditional company," Zuckerberg once said.

At this point, it's probably best to think of Facebook as not just any government, but a regime desperately clinging to power.

First off, if Facebook is a government, it's specifically a monarchical one. Zuckerberg is its CEO, he chairs its corporate board, and he's the controlling shareholder on that board. Facebook's user base is larger than the population of most countries. Its internal algorithms have the same society-shaping force as laws: They decide how people connect, what information they see, and whether other businesses live or die.

Over at Vox, a trio of political scientists recently took this analogy and ran with it. They noted that monarchical governments tend to be unstable. Kings have a hard time winning the broad trust of society, precisely because they're kings; humans simply tend to be less trusting of people with absolute and unchecked power. So the problem isn't an individual ruler; it's the system of rule. Hence the rise of democracy.

This is an existential problem for Facebook. Its structure resembles that of a monarchy because it's a for-profit company still run by its founder. Zuckerberg owns almost 60 percent of Facebook's voting shares all by himself, so he calls the shots. And even if ownership of voting stock were more dispersed, you'd still have an oligarchy. The populace over which Facebook rules is its user base, not the shareholders.

To make the platform truly democratic would require turning Facebook into a kind of user-owned-and-operated cooperative. Or it would have to be nationalized and run through the democratic governmental structure we already have. Or it would have to be broken up into multiple smaller companies, forcing them to compete with each other, making them more accountable to regular users. (As much as we talk about antitrust law as a way to promote competition, its traditional purpose was to bring the substance of democracy to the economic market.)

All those options would scuttle the vast wealth that Facebook's business model has conferred on Zuckerberg and his fellow shareholders. Needless to say, none of those options are being discussed. Even when asked about resignation — a move that would simply replace him with another corporate overlord — Zuckerberg demurs.

Yet he also desperately wants the people's trust. On Tuesday, Zuckerberg repeated the now-tired line that Facebook's sin was being too idealistic. But to its credit, Facebook has also voluntarily decided to implement new disclosure rules for political ads on the platform, institute tougher European-Union-style data privacy rules, and hire as many as 20,000 new people to deal with data privacy and security concerns.

Most dramatically, a committee of academics will be given unprecedented access to Facebook, to study the platform's effects on society and democracy. It will be financed by a cross-ideological spectrum of groups, including the arch-conservative Koch Foundation. To Facebook's credit, the company says it will not be able to review the committee's work before it's published.

As the Vox piece notes, this is a genuine concession on Facebook's part. At the same time, all of this is ultimately voluntary. Facebook could rescind these changes as soon as they become inconvenient. Even the committee cannot force change. The most it could do is embarrass Facebook with its findings.

That's the trouble with reforming monarchies: By definition, all the reforms happen on the monarch's terms. They only go as far as the monarch is comfortable with.

How far is Zuckerberg willing to go? I don't doubt he means well. But he also comes off as spectacularly naive. He repeatedly refers to Facebook as a "community," even though, as Zeynep Tufekci noted, "a community is a set of people with reciprocal rights, powers, and responsibilities." Zuckerberg also insists Facebook isn't about the bottom line. "My top priority has always been our social mission of connecting people, building community, and bringing people closer together," he reiterated on Tuesday. "Advertisers and developers will never take priority over that as long as I'm running Facebook." Coming from a for-profit corporation, this sounds absurd. The only reason it's even halfway plausible is that such a huge portion of the profits go to Zuckerberg himself. And maybe he'll have an attack of conscience.

On the other hand, Facebook has been around since 2004, and Zuckerberg has been on a nearly continuous apology tour ever since. Over and over, his company has fumbled privacy concerns and mishandled user data. We do not have to posit that Zuckerberg is evil to know he'll misuse his extraordinary power. We simply have to posit that he is human.

"We don’t have many good models showing how corporate accountability and democratic accountability can be combined," the Vox piece concluded. That's probably because the two things just don't go together. Corporate rulers, like the kings of old, don't voluntarily step down. They're either conquered or overthrown.