As I’m sure you know, my Facebook feed has exploded with strong positive and negative opinions about a study of nearly 700,000 people that researchers at Facebook, working with scientists at Cornell and the University of California, San Francisco, conducted in 2012.

What your researchers did was to increase or decrease the odds that posts containing positive-sounding words would appear in users' feeds, compared to a randomly selected control group. They did the same with negative-sounding words. What they were able to conclude was that when people's feeds had more happy talk, they wrote more happy posts; when their feeds had more sad or angry talk, their own posts likewise turned sadder or angrier; and when the feed was neutral (that is to say boring), they posted less. The results were published in the Proceedings of the National Academy of Sciences.

I realize that the likely result of this outrage is that Facebook will continue to do experiments like this when they seem helpful to its business practices, but will no longer publish them in journals like Nature, Science, and PNAS. I want to implore you to do the opposite: please publish more.

To make it clear how serious I am, I hereby volunteer to be an experimental subject in any randomized study Facebook designs to test changes in its news feed for effects on user behavior. But only if you publish the results.

I want Facebook to live by the same rules I expect drug companies to live by: if you do research that gives us insight into the social or psychological impact of your website, which nearly everyone I know now uses, I want you to let everyone – everyone! – know. If drug companies can sign on to the AllTrials.net campaign to do this with their clinical trial data, you can let us know what you've found, whether or not you can find a scientific journal that wants to publish it.

By the way, though you obviously stepped in stank here, I don’t think you violated standards of informed consent. I tend to agree with Michelle N. Meyer, an ethicist and lawyer at the Union Graduate College–Icahn School of Medicine at Mount Sinai, not only that you, as a private company, were not subject to an academic institutional review board, but also that if you had been, the study would probably have passed muster. One key factor, Meyer argued to me, is that you couldn’t have done this research if you’d disclosed exactly what you were doing – it would have biased the results. (See Meyer's more detailed legal reasoning here.)

Besides – and Meyer and I agree on this too – it’s unlikely that anyone was harmed by this experiment. The risk was in line with the risk of using Facebook at all, which is to say, minimal. The effects were tiny, too: making positive posts more prominent added only a few more positive words per thousand to people’s own language; the same with negative posts and negative words. And that whole business about your manipulating their emotions? Your researchers brought that charge partly on themselves. Come on, please. It’s just as likely that users were simply matching the tone of their feeds, the way nobody cracks jokes at a funeral. If my feed is full of downers, I save the funny kitten video for the morning.

None of this lets you off the hook entirely, though. As a disclosure that all Facebook users could be part of a giant randomized study at any moment, your terms of use are abominably unclear. You need to re-consent everybody right away. Trust me, many of us will check the box if you clearly explain what you are doing and why. We know that we need more good data about ourselves and the world, not less. I think we need your data, and I'm willing to contribute if it does more than help you pelt me with ads.

You also need to assemble an independent group, something like the institutional review boards that universities use, to vet what you’re doing with an outside eye. And I want you to keep making the data public, as you did with this study, and with the one where you were able to influence whether people voted in Congressional elections or signed their organ donor cards. (You didn't get informed consent on those either. Eventually you will need it. You do need to think more deeply about this stuff.)

Please, find ways to put your research into the public domain. Because at some point, you may find something that’s too big for you as a corporation to handle ethically. You’ll learn that you can predict when people are at risk for suicide, or that you can make people more or less likely to marry. And then you’ll need the wisdom of crowds – of all the people who are angry now – to figure out the right thing to do.

Withdrawal from public debate about your data would be the most evil thing you could do. Now is the time to open up even more -- we're sharing, and so should you.