This week, Mark Zuckerberg is due to appear before the US Congress. There’s no shortage of people offering hard questions which should be fired in his direction. Here’s what I think he ought to say before the interrogation begins:

“I know that members of the committee will want to know about attempts to interfere in American elections, whether social media ruins childhood or whether democracy is damaged by digital communication. We’ll get to those issues, but in opening I’d like to draw the committee’s attention to some of the bigger questions below the surface of the recent controversies.

Apologies first. We at Facebook have made many avoidable mistakes. We were blind to the scale of civic responsibilities inherent in what we do. We were warned and we didn’t want to hear. When I said the other day that in retrospect ‘we clearly should have been doing more all along’, I was drawing attention to our repeated failures to be imaginative in seeing our responsibilities to many different societies, and not just to this one.

I’m going to stop senior people at Facebook saying that they were surprised when our network turned out not to be the uncomplicated force for good which we said it was. Quite a number of us have said that one failure or another was ‘unforeseen’ or that we were ‘caught out’. You can occasionally say ‘we didn’t see that coming’. But you can’t say it all the time. Truth is, we’re up to our necks in problems and we should have foreseen at least some of them.

Facebook’s two announcements about its ‘news feed’ – that it would make news a lower priority and let users determine quality rankings – triggered an extraordinary explosion of self-pity on the part of the news media. Given Facebook’s reach (2bn users) and the quantity of advertising it has removed from established media, that’s hardly surprising. But much of this indignation is short-sighted. Some advice to newsrooms and those who run them.

Don’t say you weren’t warned. Facebook never guaranteed – as far as I’m aware – any particular income stream to any publisher, nor that it would not switch its policies and algorithms. As ‘Instant Articles’ came on the scene, plenty of wise voices said to publishers: ‘By all means experiment with this, but don’t rely on it. Ever.’ Don’t pretend you didn’t hear this.

But unpredictability is now Facebook’s greatest threat to news media. Instead of trying to blackmail Facebook into returning money you think should be going to you, try something more likely to work. If Facebook wants to make nice to news (and at least part of the company seems to want to), get them to understand that early warning of decisions which will affect news publishers’ revenue would be wise.

Beware of striking private deals with platforms. Wider and deeper transparency requirements for platforms (almost certainly enforced by legislation) are the key to making sense of, and improving, the slew of issues around misinformation, election manipulation and other dark arts to which Facebook has lent itself, both wittingly and unwittingly. The relationship between a citizen and her/his devices will be central to twenty-first-century democracies. How we know what we know (and can trust) is an issue which goes wider than the business agonies of news media. Facebook is first and foremost a gigantic advertising machine, but it is also now a society machine. Or a politics machine if you prefer. Politics is about how power is allocated in a society. The devices we use to connect and to collect data play an ever-larger role in that distribution of power. And on the subject of power: try to help Mark Zuckerberg talk about Facebook’s power. The word never comes up in his bland ramblings about ‘community’ and ‘connection’.

Forget any idea that platforms can be forced to pay to prop up mainstream media. Rupert Murdoch took advantage this week of the Facebook fuss to suggest that online platforms should pay news publishers for their content in the same way that cable TV companies pay programme-makers for rights to air their work. Murdoch should send whoever drafted that statement to run something obscure in Tasmania: the parallel is nonsense. Cable companies have nothing to sell to consumers until they have content. Social networks sell advertising space by leveraging network effects between friends. They have no need of news. Given that news has never been more than a low single percentage of Facebook’s total activity and that it has set light to a firestorm of controversy, I can easily imagine its executives arguing that they should leave it alone for good.

But that’s not possible. Facebook can’t now avoid being entangled with news. So people in Menlo Park who dream of a news-free network will be disappointed. For now, the network is simply too big and too widely used to sidestep the dilemmas which come with news and all the strong tensions and emotions it provokes. Facebook is part of the infrastructure of free speech. Period.

Let’s bury the phrase ‘fake news’. As an example, British government spokesmen made two announcements this week of initiatives to ‘combat’ (as headline writers like to say) fake news. One change was actually new machinery for detecting misinformation spread by states, a quite different thing. Even making allowance for the fact that the current (Conservative) government is in slow-motion and terminal decline, it is clear that the announcement-makers have no idea what they’re talking about. Quite apart from the problems of defining the news the government doesn’t like because it is ‘fake’, this sort of knee-jerk reaction is liable to reverse long traditions of protecting free speech. A professor of artificial intelligence who goes by the splendid name of Yorick Wilks nailed it: “Someone in Whitehall has lost all sense of what a free society is about if they think government should interfere in determining what is true and false online.” One simple test for evaluating any policy about misinformation: is it an exact remedy for a specific harm?

Everyone – journalists and publishers included – has a duty to help Facebook through the philosophical, political and moral issues it has landed in. Facebook, Google and Twitter have begun (at differing speeds) to acknowledge that they are not merely technical means of transmission open to exploitation by bad actors; we should stop treating them as if they were. Beating up Facebook and gloating over its difficulties is not going to make it go away. These questions of colliding rights (e.g. free expression vs privacy vs right to know) have been giving editors and lawmakers migraines for centuries. Similar countries – across Europe for example – take radically different approaches based on history, culture and experience. Example: contrast the approach to privacy rights in Spain or France with that in Scandinavia.

Keep repeating: news isn’t nice. Facebook’s hard problem is the tension between its business model and the public interest value of news. The business model relies on interaction between users and time spent on the network (which Mark Zuckerberg now wants to be ‘time well spent’). That gives priority to emotion and ‘shareability’. Indignation and outrage go viral easily. News published and distributed in the public interest may appeal more to reason than to emotion. It may be unpleasant, complicated. Worse still (from Facebook’s point of view) news may be best conveyed by people who know more than other people, thus undermining any idea that everyone in a social network knows as much as everyone else. News may even insist that you learn what you may not want to know. None of this aligns easily with ‘community’ or ‘connection’, which are so central to Facebook. Tough problem, but not insoluble.

Two things you can repeat to Facebook as often as you like: be transparent and take advice. The network is doing all sorts of research (psychological effects of social media, manipulation risks etc); it shares frustratingly little detail. It astounds me that neither Facebook nor Twitter has ever set up advisory groups of independent experts to advise them on hard public interest problems. (Google has done so, to good effect.) Even better, try to persuade Facebook to combine the transparency and the advice. Allow experts and scholars outside the company to be involved in the research and allow them to talk and write about it. If these questions are really Facebook’s tests for the quality of news, it needs help to strengthen them. (Powerful academic version of this case from the Dutch scholar Natali Helberger here.)

During the final months of 2017 a lot of public and private attention was directed at opening up the secrets of the algorithms used by social networks and search engines, notably Facebook and Google. The companies have edged cautiously towards opening up, but too little and too late. The attacks on their carelessness have mounted as their profits have climbed.

The public pressure came from voices (including mine) arguing that inquiries into misinformation/disinformation in news were all very well but missed the main point. Attention is also being paid to this in private negotiations between the social networks and news publishers.

These discussions have included the suggestion that the networks might make much more detailed data on how they operate available to the publishers, but not to the public. This kind of private deal won’t work if it’s tried. The functioning of the networks is crucial to publishers, but it matters to a lot of other people as well.

Demands to regulate hi-tech companies like Google, Facebook and Apple are being heard at deafening pitch almost every day. This rush by the political herd on both sides of the Atlantic to make new laws (or to enforce the breakup of these corporations) is no better focussed or thought-out than the extraordinary degree of latitude which the same political classes were prepared to allow the same online platforms only a couple of years ago.

The cry for regulation and the laissez-faire inertia of the recent past have a common origin: ignorance. The cure for ignorance is knowledge. And knowledge of exactly what these companies do and don’t do must be the foundation of any further action to get them to shoulder their moral and civic responsibilities. If laws are needed to prevent harm, let them first compel transparency. Any politician pushing that line has my vote.

But suppose that Facebook is open to inspection by national agencies or commissions which supervise elections. That would not necessarily mean open to public inspection, but perhaps to bodies whose duty is to check electoral fairness and compliance with the law. Why would that be so hard?

Watching Facebook wrestle with ageless questions about privacy, free speech and fair play rules for democratic elections is a little bit like watching a group of students produce an occasional essay on political philosophy – without the benefit of any reading or teaching on the subject. The Facebook executives struggling with these questions want to start over without the clutter of received ideas.

Mark Zuckerberg’s latest post, on dealing with the problems of Facebook being used for electoral interference, gives us a lot of sensible changes which the network will make. It also hands down the great man’s definition of freedom. Given that Zuckerberg is a de facto electoral commission for many states on the planet, this is a statement of some importance for civil society.

The key sentences are:

Freedom means you don’t have to ask permission first, and that by default you can say what you want. If you break our community standards or the law, then you’re going to face consequences afterwards.

This is not a post about the causes of the American election surprise and its implications for journalism (there’s an informative survey of opinions here). This is another bulletin on the progress that Facebook is making in absorbing and acting on the fact that it has moral and democratic responsibilities which stem from its colossal informational power.

At the weekend, Facebook’s head honcho Mark Zuckerberg responded to charges that Facebook had influenced the election outcome, in particular by circulating fake news stories. No surprise that Zuckerberg guesses not. But he is guessing. And I’d guess that subsequent research may show influence. We’ll see.

Fake news is an issue, but it is not the heart of the question. The question which matters is how Facebook – the techies, the software and your community – decides what to show you. Anyone with a smartphone can now distribute information, true, false or debatable. The group of people who used to try to sift the truth of information likely to matter to society (aka journalists) no longer control the distribution of what they produce. Facebook is the first news distribution platform which operates at scale across the whole planet. Plainly that gives it power and influence; we just don’t yet know precisely how that works. Facebook’s responses to the dilemmas raised by this have been hesitant, crabwise half-admissions that it may have some ‘editorial’ responsibilities and is not just a big, neutral tech company.