As reported by the Guardian on February 8, the English language Wikipedia has (mostly) banned the Daily Mail as an acceptable source for citation, after declaring it “unreliable”. The report touched a nerve; the Mail swiftly issued a shrill and rambling retort, and the story was swiftly picked up in more than a dozen other publications.

But this feud is more than a mere “pass the popcorn” moment. It’s also a learning opportunity, highlighting important dynamics in how media outlets function. The widespread interest in how Wikipedia evaluates its sources is welcome and overdue. In this post, I’ll consider:

What’s the context of Wikipedia’s decision? What exactly was the decision, how was it made, and how binding is it?

Was it the right decision?

What insights does the media response to Wikipedia’s decision offer?

1. What Wikipedia decided, and how

The decision about the Daily Mail may be the first such decision to be widely reported, but Wikipedia editors routinely make decisions about the suitability of sources. We have to! Hundreds of thousands of volunteer editors write and maintain Wikipedia. Throughout the site’s history, evaluating the relative merits of competing sources has been a central approach to resolving the inevitable disagreements that arise. In fact, Wikipedia has a highly active discussion board devoted to the topic. A 2008 discussion about Huffington Post (with cogent arguments for both inclusion and exclusion, though it did not result in a blanket decision one way or another) is just one of hundreds of discussions where Wikipedia editors weigh sources against the site’s criteria.

With topics like “fake news” and “alternative facts” dominating recent headlines, much has been said about Wikipedia’s diligence in evaluating sources. The central role of human evaluation sets Wikipedia apart from other top web sites like Facebook and Twitter; and the relative transparency of Wikipedia’s process sets it apart from other top publishers. These distinctions have been highlighted in many venues, by prominent Wikipedians including former Wikimedia Foundation (WMF) trustee James Heilman, WMF director Katherine Maher, and Wikipedia cofounder Jimmy Wales.

The discussion, and the formal finding at its conclusion, are available for public review; click above for the details.

In the context of Wikipedia’s usual process, the Daily Mail decision was pretty unremarkable. A few dozen Wikipedians deliberated, and then five site administrators made the decision based on their understanding of the discussion, and its ties to Wikipedia policy and precedent.

Consensus has determined that the Daily Mail (including its online version, dailymail.co.uk) is generally unreliable, and its use as a reference is to be generally prohibited, especially when other more reliable sources exist. …

One angle seems under-reported: the decision was not based on a mere “majority rule” vote of partisan Wikipedians; it was (as always) an effort to determine the best path forward in light of what is publicly known.

Numerous independent evaluations of the Daily Mail’s diligence were considered by Wikipedia editors. That’s at the heart of how we work, in this and similar cases; we consider what reputable publications have had to say.

2. Wikipedia’s decision

Wikipedia editors made their ruling. Public domain image from Wikimedia Commons

Criticism of the Wikipedia decision boils down to two things:

The legitimacy of the process: Were there enough Wikipedia editors involved in the decision? Was the discussion open for long enough before a decision was made?

The reasoning of the decision: Was the deliberation thorough and sound? Was it infected with partisan bias, or did it ignore important facts?

To properly address the process questions would require a detailed breakdown of how Wikipedia decisions are made. I’ve taken on such questions elsewhere. Without getting into the details, consider: would you want an analysis of a U.S. Senate decision from somebody who knows little of the Senate’s parliamentary rules, or of the U.S. court system from somebody who’s never read a law journal? Views on Wikipedia’s process from arbitrary media commentators should be viewed with a skeptical eye.

As a longtime Wikipedia administrator, I’ll say this: The number of people involved and the length of time were entirely sufficient to satisfy Wikipedia’s requirements. Like nearly every Wikipedia decision, this one can be overturned in the future, if there’s reason to do so. The questions about Wikipedia’s process are without merit.

So, how diligent were those making the decision? Well, I’m not necessarily better qualified to answer that than any other Wikipedian, so I’m not here to give an overall endorsement or rebuttal; instead, I’d mainly encourage you to read the discussion and decision, and decide for yourself. But, here are a few observations:

A 2014 USA Today article questioned not whether, but when the paper had lied — was it in the original story, or in the correction?

The New Statesman reported in 2014 on “687 complaints against the Mail which led either to a [Press Complaints Commission] adjudication or to a resolution negotiated, at least partially, after the PCC’s intervention,” noting that the Mail had by far the most such complaints of any publication.

It seems the Mail had been widely criticized long before Wikipedia took up the question.

I expect you’ll agree that those in the discussion considered a wide variety of evidence — much of it from independent media commentators. Perhaps, if you’re a careful media observer, you know of something they missed. Wikipedians tend to be receptive to new information; so the best thing to do, if you do have further information, is to present it for consideration. You could, for instance, start a new discussion. But before you do so, consider carefully: is your evidence truly likely to sway the decision? You’ll be asking a lot of volunteers for their attention; please exercise that option with appropriate caution.

You might also ask yourself whether there is evidence of partisan bias in what you read. I didn’t see any, but perhaps you disagree. Again, if it’s there, it’s worth pointing out. As a rule, Wikipedians don’t like the idea that politics might influence the site’s content any better than you do. If such a bias can be demonstrated (which is a lot more than a mere accusation), perhaps something can be done about it.

One point, raised in several venues since the decision, does stand out. To quote the Guardian’s Charles Arthur (archive link):

There’s … a distinction to be made between the [Mail’s] website, which churns stories at a colossal rate and doesn’t always check facts first, and the newspaper, which (in my experience, going up against its writers) does. The problem is that you can’t see which derives from which when all you do is go for the online one.

Andrew Orlowski of the Register noted the brand confusion among the Mail’s various properties, as well. A 2012 New Yorker profile delves deeper, offering useful background on the various brands within the Daily Mail brand.

3. The Guardian’s solid analysis, rooted in non-sequitur

The Daily Mail didn’t like the statement. Public domain image from Wikimedia Commons

The Daily Mail issued a rather amusing statement on the matter; there’s no substance to it, but curious readers may enjoy my line-by-line rebuttal.

But some of the coverage the episode sparked contained genuine insights.

The Guardian offered a solid followup piece, delving into many of the issues involved. But while the reporting was accurate and helpful, the Guardian inadvertently further illustrated the great gulf between traditional media and Wikipedia. Without explanation, the reporter declined to build the story around an interview with one of the decision-makers involved in the case.

Five people ultimately made Wikipedia’s decision about the Daily Mail; they would have made worthy sources for the Guardian story. If they weren’t available, there are more than 1,200 of us with the authority to make such a decision, who can therefore provide expert commentary on the matter. But instead, the Guardian centered its story on Wikimedia executive director Katherine Maher. Maher is of course aware of the various issues, and represented Wikipedia admirably. The choice to feature her in an interview, though, was roughly equivalent to seeking out the U.N. secretary general for comment on a domestic decision of the U.S. Supreme Court.

The Guardian story acknowledged the point in its second paragraph, but inexplicably chose to focus on Maher’s views anyway. What’s going on here?

To earn a comment from the top executive at a ~$100 million organization, it helps to have status, and it helps to have connections. The Guardian, of course, has both. It’s one of the world’s top news outlets, and Wikipedia co-founder Jimmy Wales (one of Maher’s bosses on the organization’s board of trustees) sits on the Guardian’s board.

To get a comment from one of 1,200+ volunteer administrators of the world’s most widely read source of original comment, though, takes good old-fashioned legwork. You send a message, you pick up the phone, and if you don’t get a decent response, you go on to the next person. It’s not sexy, and it’s not always fun, but if you stick to it, sooner or later you have a real basis for solid reporting.

Wikipedia, of course, is famous for its hordes of detail-oriented volunteers. If the media needed a clear demonstration that Wikipedia might just be better equipped for certain tasks than traditional publications like the Daily Mail or the Guardian, it need look no further.

The Hunting Ground, a 2015 documentary about sexual assault on college campuses, exposed conflicts of interest, malfeasance and cover-ups.

Drawing by Nicholas Boudreau, licensed CC BY 4.0.

To learn about a complex topic—especially if powerful institutions have a major stake in it—we rely on experts. People who devote substantial effort toward understanding all facets of a topic can offer the public a great deal of value. We routinely refer to their perspectives and analysis when forming opinions on important social and political issues.

But of course, our reliance on experts makes their interests and motivations highly significant. To what extent do an expert’s motivations inappropriately drive their opinions and judgments? Do those opinions and judgments color how they present the facts? As critical readers, we should always pay attention to conflicts of interest (COIs). And if they’re insufficiently disclosed, we’re at a significant disadvantage. If you learn that a product review you relied on was secretly written by the company that made it, you might feel some indignation—and rightly so.

Publishers that care about accurate information face the same issue, but have a greater degree of responsibility; and if a publisher inadvertently passes biased information on to its readers, its reputation may suffer. So publishers establish standards and processes to eliminate COIs, or—since it’s often impossible to gather information that is 100% free of COI—to manage them responsibly. Wikipedia is no exception.

But as a publication that invites participation from any anonymous person in the world, Wikipedia has unique challenges around COI. Despite Wikipedia’s efforts to require disclosure, COIs often go undeclared and unnoticed, which leaves everybody (understandably) a little skittish about the whole topic. Blogging pioneer Dave Winer’s words in 2005 illustrate this point: “Every fact in [Wikipedia] must be considered partisan, written by someone with a conflict of interest,” he said. But significantly, every change to the site is rigorously preserved and open to public review. Wikipedia editors routinely investigate and deliberate additions and edits to the site. Every change can be reverted. Every user can be chastised or blocked for bad behavior. The process can be messy, but since 2005, researchers have repeatedly found that Wikipedia’s process generates good content. A properly disclosed and diligently managed COI on Wikipedia is rarely a big deal; it’s part of what makes Wikipedia work. Disclosure is a key component that supports informed deliberation. Disclosing a COI doesn’t give a Wikipedia user carte blanche to do as they see fit; but it does express respect for Wikipedia’s values and for fellow editors, and it gives Wikipedia editors more information to use in resolving disagreements.

One of the principal methods Wikipedia employs to minimize the impact of COI is an insistence on high quality sourcing. But on occasion, Wikipedia editors are overly swayed by sources that match up poorly against the site’s standards.

A Wikipedia case study

The Hunting Ground (2015), a documentary film which investigated the issue of sexual assault on U.S. college campuses, received widespread acclaim, but it also ignited controversy. The production company, Chain Camera Pictures, retained Wiki Strategies beginning early that year to assist with developing and improving the Wikipedia articles related to the film’s focus, as well as the article about the film itself. (See “Disclaimer” below.)

Conflict of interest is a central focus of The Hunting Ground. Universities are required to investigate any report of a sexual assault involving their students; but they also have a strong financial and reputational interest in avoiding scandal. By vigorously investigating sexual assault cases, universities might associate their campuses with violent crime, which could impact recruitment and alumni donations.

In one of the incidents explored in The Hunting Ground, Florida State University (FSU) football star Jameis Winston was accused of rape. A state attorney, when announcing months later that he had insufficient evidence to prosecute, noted substantial problems in the initial rape investigation carried out by both FSU officials and Tallahassee police. Independent investigative pieces from Fox Sports and the New York Times both suggested that COI might have been a factor.

While the influence of a COI in any specific case is difficult to prove, it’s clear that the financial interests of entities like FSU―whose athletics programs bring in more than $100 million a year―sharply conflict with the interests of the women portrayed in The Hunting Ground. FSU is one of the many institutions that had reason to feel threatened by the film, alongside numerous universities, law enforcement agencies, and athletic programs.

The Hunting Ground earned substantial accolades and validation. It received two Emmy nominations, including Exceptional Merit in Documentary Filmmaking, and was one of 15 documentaries shortlisted for the Academy Award for Best Documentary Feature. CNN vetted and broadcast the film, along with a series of panel discussions, putting its own journalism reputation on the line. And student groups, faculty, and university administrators screened the film on hundreds of college campuses.

But unsurprisingly, given the threat it posed to powerful institutions, the film drew pushback as well as praise.

Among the film’s more persistent critics has been the Washington Examiner’s Ashe Schow, who has written about or mentioned it in more than 20 columns since March 2015. In November 2015, during the runup to the Academy Awards, Schow reported on Chain Camera’s Wikipedia efforts, under a headline proclaiming that they had been “caught” editing Wikipedia. But of course, you can’t be “caught” doing something you were open about from the start; and Chain Camera had been diligent about disclosure. Wikipedia editors working on the various articles had known of the efforts of Chain Camera’s employee Edward Alva for many months. As Chain Camera stated in their rebuttal, Schow’s charges were inaccurate and ill-informed.

The complaints about Alva’s editing, made initially by Schow and amplified by Wales, were considered in detail. The graphic highlights the formal decision by administrator Drmies. Click the image to see the full discussion.

Wikipedia co-founder Jimmy Wales took Schow’s words at face value, praising her piece for “embarrassing” Alva. As Wikipedia editors took up the issue in a public discussion, several included Wales’ statement as part of the evidence of Alva’s wrongdoing. Much of the early discussion was characterized by a lack of diligence in considering Alva’s efforts. One comment stands out: “I don’t have enough time to do a thorough investigation,” said a Wikipedia editor. “But as I now see it, this situation could be dealt with very quickly and justly with a permanent ban of [Alva].” A lack of thorough information was apparently not enough to stop this editor from recommending strong sanctions.

But as many Wikipedians recognize, diligent investigation is important. In the following week, several Wikipedians did indeed take a close look at the edit history. They ultimately rejected the accusations leveled by Schow and Wales. Drmies, the administrator who made the formal determination to close the discussion, stated that “the [Chain Camera] editor declared their COI early enough,” and that “the editor’s defenders present very strong evidence that [Wikipedia’s] system worked.”

Drmies’ closing statement carries weight in the Wikipedia world, and it is archived publicly. But from an outside perspective, it might as well be invisible. The media world is used to covering traditional decision-making processes, like court decisions and the acts of public officials, but it’s rare that a media outlet will understand Wikipedia well enough to track a contentious discussion effectively. Schow is ahead of the curve: she knows enough about Wikipedia to find some tantalizing tidbits, to generate copy, to generate clicks, and to influence those readers who lack deep familiarity with Wikipedia.

Specific problems with Schow’s account

But this story, despite its ramifications for an Academy Award shortlisted film and the National Football League’s #1 draft pick, was never picked up in any depth by a journalist who understands Wikipedia’s inner workings. Commentator Mary Wald did briefly note that Alva had observed Wikipedia standards, in a Huffington Post piece that highlighted the strength and stature of the interests taken on by the film. But this brief mention in a single story did not turn the tide. To this day, anyone who follows the media coverage would likely be left with the incorrect impression that Chain Camera had done something wrong.

If a journalist had covered the story in depth, paying close attention to Wikipedia’s policies, norms, and best practices, they would have noted several flaws in Schow’s analysis. For instance:

Schow began with a common—and erroneous—premise: she assumed Wikipedia’s COI guideline, which recommends against editing an article while in a COI, is a policy. Wikipedia makes a distinction between the two, and explicitly notes that guidelines “are best treated with common sense, and occasional exceptions may apply.” Guidelines are not to be treated as rigid requirements. The COI guideline, in particular, has been scrutinized and deliberated extensively over the last decade. Wikipedia’s need for experts and its philosophical commitment to open editing have both prevented it from ever adopting a formal policy prohibiting editing while under a COI. Wikipedia’s relevant policy does not prohibit someone like Alva from making edits, but it does require disclosure in one of three places. Alva made that disclosure from the start, and in fact exceeded the policy’s requirement by disclosing in multiple places.

In her second column on the topic, Schow attaches significance to Jimmy Wales’ important-sounding words about changing Wikipedia policy in light of Schow’s report. This, again, is an understandable mistake; with most organizations, it’s safe to assume that a founder and board member’s ambitions have a close connection with reality. But with this particular board member and this particular issue, that assumption couldn’t be much further from the truth. Jimmy Wales has a long history of strongly advocating the “bright line rule,” which—had Wales’ efforts to have it codified as policy not been rejected—would have forbidden certain COI edits. Wales has even unequivocally stated that it doesn’t matter if the public thinks it’s policy; in his view, such details are unimportant. To put it simply, Wales is an entirely unreliable source on the topic of conflict of interest on Wikipedia. And despite Wales’ “renewed interest,” as Schow called it, his commentary on the topic ended as soon as it became clear the facts did not support his initial reaction to Schow’s column.

Schow doubled down on some of her strongest words about Alva’s approach, in her third column (November 30): she claimed that Alva had failed to sufficiently disclose his editing of topics related to The Hunting Ground until September 2015. But he had in fact exceeded Wikipedia’s disclosure requirements, as mentioned above. As she did acknowledge, Alva disclosed his connection to The Hunting Ground as early as March 2015, prior to any edits to related Wikipedia articles. He made further, more specific disclosures on April 23, July 27, August 10, and again on August 10, all before the September edit noted by Schow. A columnist, of course, might not be expected to fully grasp the intricacies of Wikipedia editing; but to vet such strong opinions before doubling down, she might have interviewed an uninvolved Wikipedia editor or two.

Schow’s errors may well have resulted from a good faith effort; but that doesn’t make them any less important. Her influence on the public perception of the connection between Chain Camera and Wikipedia has been substantial (see coverage at the Independent Journal Review and the Hill). So it’s significant that she got major parts of the story wrong.

Let the Wikipedia process work – don’t try to shut it down

In covering any story that challenges powerful institutions, Wikipedia editors have to sort through strong messages from various parties. Ultimately, Wikipedia relies on the sources it cites as references. High-quality source materials, not the interests or organizational affiliations of Wikipedia editors, should be the main factor in crafting its content. Wikipedians should not ignore those affiliations, and should always be mindful of the COI of various parties―not only of the editors, but of the people and institutions who generate and influence the stories they cite.

Any COI can be either disclosed or obscured, and even a fully disclosed COI can be managed well or poorly. Of course, it’s impossible to know whether other, anonymous editors have undisclosed COIs; but it would be foolish to conclude with any certainty that those who disclose are the only Wikipedians with a COI, when more than a decade of experience tells us that secretive paid editing – despite being a policy violation – is commonplace. Wikipedians should applaud Alva and Chain Camera Pictures for disclosing from the start. Even if they disagree with his specific suggestions or edits, they result from a good faith effort to improve the encyclopedia. When Wikipedians disagree with a good faith editor, they should talk it through—not discuss whether to block them from editing.

Wikipedia needs more, not fewer, expert contributors

When experts engage openly with Wikipedia, seeking to improve the encyclopedia, we should celebrate and support that effort. Chain Camera Pictures brought something to the table that few Wiki Strategies clients do: they sought to improve Wikipedia’s coverage of a broad topic they knew well through their work. Does this mean that they alone should determine the content of relevant Wikipedia articles? Of course not—Wikipedia’s model demands that any Wikipedian present convincing arguments, with reference to independent reliable sources. That is exactly what Alva did.

The approach Alva took, overall, is the right one. It should be readily apparent to any Wikipedian who looks at the edit history that Alva’s overall intent was to be transparent about his affiliation. Alva made several disclosures, and engaged other Wikipedians in discussion on points of contention multiple times. He added independent, reliable sources; sorted out disambiguation pages; reverted vandalism; expanded content; and removed poorly sourced, inaccurate information.

For a topic as important as sexual assault allegations on university campuses, Wikipedia benefits when experts engage with its content. Every day, non-expert writers do their best to place snippets of information into a narrative, to build Wikipedia articles; but it often takes some expertise to evaluate and refine that narrative. Wikipedians recognize this need; there is even a banner placed on articles deemed to lack an expert’s perspective.

This banner is placed on a Wikipedia article when somebody thinks an expert opinion could help.

The creators of The Hunting Ground are not, of course, the only experts on this topic. Investigative reporters like Walt Bogdanich of the New York Times and Kevin Vaughan of Fox Sports reported extensively on the subject. They reviewed thousands of pages of documents and interviewed a wide range of parties. In so doing, they surely developed significant expertise. If Wikipedia seeks to excel at summarizing all human knowledge, it should engage people like investigative filmmakers and journalists, who often have the strongest understanding of a given topic. As readers and as Wikipedia editors, we rely on these people to report on difficult stories that institutions often try to keep secret. We should applaud and welcome experts of all stripes when they bring their skills and knowledge to Wikipedia, as long as they are upfront about relevant affiliations. If it keeps the focus on including experts and sorting through disagreements, Wikipedia will be a more robust and comprehensive platform. Its editors, and more importantly its readers, will benefit.

Disclaimer

Chain Camera Pictures is a Wiki Strategies client. Our statement of ethics addresses cases like this; specifically, see item #4 under “Broad commitments & principles,” and the second paragraph of “Article composition and publishing.” It is unusual for us to blog about a client’s project, as we do here; in this case, the client made the decision (in consultation with us) to disclose our work together. In this blog post, we focus on the process Chain Camera Pictures followed in editing Wikipedia; in light of the issues addressed in our statement of ethics, we do not comment on the specific content of the Wikipedia articles in question.

Congresswoman Sheila Jackson Lee of Houston is the latest prominent figure to confuse Wikipedia with WikiLeaks. This confusion goes back many years; it often flares up when WikiLeaks releases capture the public’s attention. In 2010, for instance, when WikiLeaks released a string of controversial documents and video, journalists including Charlie Rose turned to Wikipedia cofounder Jimmy Wales for commentary — only to learn, sometimes on live camera, that Wales and Wikimedia have nothing to do with WikiLeaks.

But the mistake is an important and troubling one, especially when made by a public official. Wikipedia and WikiLeaks do not merely lack institutional ties; they also reflect profoundly divergent philosophies about the public’s role in information stewardship.

Wikipedia invites everybody in the world to participate in nearly every decision.

WikiLeaks, while it might solicit key information from anybody who has it, is completely opaque and centrally driven in its decisions; founder Julian Assange may be the sole decision-maker (or perhaps there is a small inner circle he consults).

Any member of Congress should care about the public’s role in information management and dissemination. And anybody who cares about that topic should know, as a basic point of literacy in 2016, that Wikipedia and WikiLeaks are at opposite ends of the spectrum.

The news ain’t what it used to be… a recent story that ran in multiple Portland, Oregon news outlets took a single, anonymous Yelp comment as evidence of a “controversy.” Fortunately, I got to be at the bakery the story covered when a TV news team came in to… actually interview sources.

This story reminds me of how Wikipedia is often covered: two random people disagreeing over something in a Wikipedia article might be presented as an “edit war” or a “something-gate.” As the way we communicate evolves, it’s taking traditional media a while to catch up.

The Wikimedia Foundation (WMF) is inviting commentary on how to recognize and encourage informal leadership in the volunteer community. (The consultation runs from September 20 through October 16, 2016.) This is a welcome initiative; the Wikimedia movement has not done well, over the years, at capturing stories of volunteers who successfully focus attention on important areas, and who do good work building consensus and forging and executing plans.

There are exceptions. The announcement linked above hails achievements by Liam Wyatt and Vassia Atanassova; and the WMF has consistently highlighted successful volunteer-driven projects on its blog and elsewhere. Still, many who have taken on big challenges and risks to advance our shared values and vision go unheralded.

Below, I consider an important 2011 initiative and conflict, known informally as ACTRIAL (short for “Article Creation Trial”). I learned of it many months after it took place; I had to dig through jargon-filled discussions in multiple venues before I began to understand what had happened. It’s an important story, though. In recent years, WMF staff have often asserted that Wikipedia’s volunteers are “change averse,” and incapable of generating or agreeing on new ideas. But the characterization is neither fair nor accurate, and is typically asserted out of mere political convenience.

ACTRIAL, however, provides a clear and valuable counterexample, in which Wikipedians self-organized to advocate for change, and WMF staff blocked the effort. This post highlights an important piece of Wikimedia history, and offers a little recognition to unsung heroes.

The Blade of the Northern Lights, a Wikipedia volunteer, wanted to address a persistent problem that irks many regular Wikipedians: brand new Wikipedia volunteers who write articles that are far from meeting Wikipedia’s content standards.

The Blade proposed that the creation of new articles be restricted to users with a bit of experience (“autoconfirmed”: 4 days, 10 edits), and then guided more than 500 English Wikipedia volunteers in considering and ultimately approving the proposal.
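The “autoconfirmed” threshold the proposal relied on is simple enough to express as a check. Here’s a minimal sketch in Python; the function name and signature are mine for illustration (the real check lives inside MediaWiki’s permission system), but the thresholds are the ones cited above: an account at least 4 days old, with at least 10 edits.

```python
from datetime import datetime, timedelta

def is_autoconfirmed(registered_at: datetime, edit_count: int,
                     now: datetime) -> bool:
    """Sketch of English Wikipedia's 'autoconfirmed' test:
    the account must be at least 4 days old AND have at least 10 edits."""
    account_age = now - registered_at
    return account_age >= timedelta(days=4) and edit_count >= 10

# A brand-new account with many edits still fails the age requirement;
# an old account with few edits fails the edit-count requirement.
```

Under ACTRIAL, only users passing both conditions would have been able to create new articles during the six-month trial.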

ACTRIAL was designed as a six-month experiment rather than a definitive policy change. That point is an important one; the WMF often encourages volunteers and affiliate organizations to test hypotheses before making long-term commitments. Volunteer Rich Farmbrough, in spite of his skepticism about the proposed change, praised ACTRIAL’s design as an experiment in 2014, and explained the significance:

I am against preventing article creation by IPs let alone non-autoconfirmed users. But this trial might well have provided compelling evidence one way or the other.

Once the English Wikipedia community had agreed to move forward with ACTRIAL, Scottywong, another Wikipedia volunteer, formally requested a necessary technical change. In a haphazard discussion driven by WMF staffers, the request was denied. Apparently ignoring the extensive deliberation that had involved hundreds of volunteers, one WMF employee stated: “this entire idea doesn’t appear to have been thought through.” Several seemed to agree that the proposal was at odds with the strategic goal of improving editor retention, though no clear argument supporting that position was advanced.

To date, the WMF has not explained this extraordinary rejection of a good-faith, volunteer-driven initiative. The closest approximation to an explanation was a mailing list discussion in 2014. In that discussion, then-WMF staffer Philippe Beaudette asserted that the WMF had ultimately solved the underlying problem in another way:

What I remember was that a pretty good number (~500) of [English Wikipedia] community members came together and agreed on a problem, and one plan for how to fix it and asked the WMF to implement it. The WMF evaluated it, and saw a threat to a basic project value. WMF then asked “what’s the problem you’re actually trying to solve?”, and proposed and built a set of tools to directly address that problem without compromising the core value of openness. And it seems to have worked out pretty well because I haven’t heard a ton of complaints about that problem since.

However, Beaudette’s statement had several problems (edited: see note below):

If there was indeed an evaluation, it was never made public.

While several individuals argued that a “basic project value” was at risk, no decisive case was made, nor any formal conclusion presented. Others disagreed, and the matter was never resolved decisively.

If the WMF had asked “what’s the problem you’re actually trying to solve?”, the question was not (as far as I can tell) posed in a public venue.

It’s unclear what “set of tools” was developed; but whatever it referred to, any claim that it “worked out pretty well” should have been evaluated by a more robust process than listening for complaints. As volunteer Todd Allen said: “You haven’t heard more complaints, because the complaint was pointless the first time and took a massive effort to produce.”

When I requested clarification, Beaudette did not respond; but James Alexander, another WMF staff member, did step in. Alexander speculated that the software inspired by ACTRIAL was the Page Curation tool. He may have been correct, but no other staff member confirmed it in that email thread; and neither the page on Page Curation nor its parent page on the Article Creation Workflow makes any mention of the discussion among the 500+ volunteers that may or may not have inspired them.

The sequence of events around ACTRIAL has not been publicly documented by the Foundation – and accordingly, five years later, the leadership shown by the volunteers who guided the discussion remains unrecognized. The initial reactions of WMF staff, therefore, loom large in volunteers’ memory. As Rich Farmbrough opined: “The dismissal as a ‘we know better’ was a bad thing.”

Now that the Foundation seeks to explore stories of leadership in the Wikimedia movement, it would do well to look into the story of The Blade of the Northern Lights and Scottywong. These two played important roles in guiding the English Wikipedia community to define a problem, and to map out a viable (if unimplemented) way to explore a solution. Moreover, if any Wikimedia Foundation staff worked with the volunteer community to make something worthwhile out of those deliberations, their leadership merits recognition as well.

Note: This blog post does not cover the role of Erik Moeller, then the Deputy Director of the WMF. He made a couple of substantive comments in the discussion. Some of them speak to Beaudette’s points, and the extent to which the WMF engaged with the problem surfaced in the volunteer community. I will update this post or follow it up with more detail when I have time. -Pete

Script from the Koran adorns much of the Taj Mahal. Photo by Jean-Pierre Dalbéra, licensed CC BY 2.0

A slide flashed on the screen—the Taj Mahal. The audience was initially taken by its physical beauty. But upon closer inspection, we were told, one would find much of the text of the Koran chiseled into this architectural wonder.

What a brilliant way to preserve text against the ravages of time. Carve the most important words into the stone of a marvelous structure.

Text—its physical structure, its preservation, its manipulation, interpretation and cultural transformation—was the topic of the sixth Future of Text Symposium on the Google campus in late August.

For the second year, Wiki Strategies founder Pete Forsyth was among the speakers, who each had 10 minutes to describe a particular personal text passion, followed by five minutes for questions and discussion. Wiki Strategies co-sponsored the symposium.

The symposium exists because Frode Hegland cares deeply about text and had been searching for a vehicle to bring together kindred spirits to probe text’s evolution. As he has said in explaining the need for such a gathering, “The written word is a fundamental unit of knowledge and as such is of universal importance.”

Hegland, a teacher, lecturer, software developer and author, hosted the first Future of Text symposium in London six years ago. He intentionally keeps the crowd small, intimate and engaged; expanding to a three-day conference in a hotel ballroom is not his idea of thought leadership.

Eileen Clegg presenting.

His co-organizer is Houria Iderkou—without whom, he says, the symposium would not be possible. Houria is an e-commerce entrepreneur who flawlessly manages the many details of hosting a symposium.

The passions unleashed and the intellectual exchanges that occurred that day in Mountain View paid tribute to text’s contributions to human culture. For one day, the written word was celebrated as the miraculous gift to mankind that it truly is. Frode admitted to being a bit discouraged at day’s end–not by the discussions that took place, but by the weight of responsibility mankind has to hand text along from generation to generation, and to use it to its full potential to support the human endeavor.

Jim Strahorn, Pete Forsyth, and Bonnie DeVarco dig into the technicalities of text.

It is, after all, the accounting and preservation of the human experience, as well as an essential ingredient of that experience. When one considers what has been irrevocably lost of the experiences of humans who did not have a written language to pass their tales on to those who came after them, then perhaps Frode’s anxieties come sharply into focus. The responsibility is on us to preserve in text what we have learned, what we have seen and heard and touched and felt and smelled.

Howard Rheingold and Pascal Zachary at the Future of Text Symposium, 2015.

Futurists are fond of predicting that, one day, humans will communicate telepathically, thus drastically reducing the need for the written and spoken word. Until that day arrives, however, mere mortals must continue to communicate primarily with text, a vehicle fraught with pitfalls but loaded with potential.

It is the latter that will be the focus of a fast-paced, one-day symposium in the Bay Area on August 25. The Future of Text Symposium is fast becoming one of the most esoteric intellectual endeavors of our time.

And while it has been steadily evolving since it was first presented in 2011 at the British Library in London, the concept remains the same: brilliant minds gather to share their most cogent thoughts on where text has been, where it is, and where it’s going–and they do it within an interactive structure that keeps the idea stream rushing along in a torrent.

We must say with considerable pride that Wiki Strategies is both a sponsor of this year’s FOT, and, through founder Pete Forsyth, a participant. Pete, like the others on the panel, will have his 15 minutes of, if not fame, foment–10 minutes to present his most critical thoughts on text, and what lessons about its future may be drawn from 15 years of Wikipedia, followed by 5 minutes of discussion.

He’ll be sharing the stage with, among others, the following luminaries:

His co-host is Houria Iderkou, founder and owner of skin care company Néfertari. She and Frode have worked together on various projects for more than a decade, and have nurtured FOT along since its inception.

We’d love to have everyone come to Mountain View on the 25th to join in the repartee, but, unfortunately, the event is already at capacity. You see, the in crowd in the text world avidly awaits the announcement of upcoming symposia, and seats disappear quickly. But we intend to do a bit of filming while we’re there, and we’ll be writing about the event in this space. So stay tuned and we’ll bring you the highlights–in text and video, and, hopefully, so robustly communicated that you will feel as though you were there in person.

Some thoughts and links about Wikipedia, to support the Professional Development Workshop led by Joe Cox at the 2016 Academy of Management annual meeting.

What is Wikipedia?

Wikipedia is the largest, and most widely read, publication in history; but perhaps more significantly, it has been built by hundreds of thousands of disparate volunteers, making it arguably the most extensive and impactful collaborative project in history.

It’s based on wiki software, invented in 1995. Wikipedia was launched in 2001, initially as an experiment. The policy framework and the social norms are as vital to Wikipedia’s identity as the software; basic principles were articulated early on, and each language edition’s volunteer community writes its own more specific policies.

My personal experience: Being part of a learning and teaching community, meeting smart, passionate, and knowledgeable people. This perspective is somewhat “taboo”; strong sense that “we are not Facebook.”

A dojo is a great place to prepare to write a wiki page! Photo by Flickr user “superwebdeveloper”, licensed CC BY 2.0.

When we started the monthly Bay Area WikiSalon meetups a few months back, I was eager to try out a newly-invented way for a group to build a wiki page together. The purpose of a Wikidojo, as the event is called, is twofold. Of course, it’s a way to quickly build a page. But much more importantly to me, and to the purpose of the WikiSalon series, is that everybody involved learns something from seeing how other people approach the task. How does an experienced editor add a footnote? Where does a new editor stumble? How do you add a photo? What’s up with categories and talk pages, anyway? Watching somebody at work, and hearing them describe what they’re doing and why, can be tremendously informative and inspiring.

So at our June event, I proposed that we try out a Wikidojo, and write a Wikipedia article together. It was a delightful experience, though as you’ll see below, it didn’t go exactly as I had expected!

Eugene and Ben got the article started. If you look closely in the background, you can see the article growing…

…Peter and Andrea continued the fun…

…Wayne, an experienced Wikimedian, showed Mike how to add a photo to the article. Photos by Pax Ahimsa Gethen, licensed CC BY-SA.

I offered the group a few suggested topics, and was pleased that they preferred my first pick. We wrote about the Ghost Town Royals, a little league baseball team (and league) founded a decade ago in Oakland, with the aim of offering kids an alternative to the street life all too many get drawn into. It’s a topic close to my heart: the league is in my neighborhood, and presents opportunities to kids with big needs. The topic passes Wikipedia’s rigorous notability standard, in that it’s been covered in several news articles; but it’s something most people would have a hard time learning about even if they knew what they were looking for, because much of the news coverage doesn’t easily show up in a web search. (You can see the results of our work at the link above.)

The Wikidojo model was invented by Nikola Kalchev, inspired by Vassia Atanassova’s 2014 talk about educational approaches with wiki; and it has been conducted in many places around the world. I learned about it from my friend Asaf Bartov, and was immediately drawn to the seductively simple idea: everybody takes a seven minute turn at building a wiki page, as the “pilot.” Every pilot has a “copilot,” who engages them in discussion during their shift. And everybody else gets to listen and absorb what’s going on.
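The rotation itself is simple enough to sketch. A toy round-robin schedule in Python (the pairing rule here is my own illustration; Wikidojo doesn’t prescribe how pilots and copilots are matched, only that every pilot has one):

```python
def dojo_schedule(participants, turn_minutes=7):
    """Pair each pilot with the next participant as copilot, round-robin.

    Every participant gets one turn as pilot; the person who pilots
    next serves as their copilot, so everyone fills both roles once.
    """
    n = len(participants)
    schedule = []
    for i, pilot in enumerate(participants):
        copilot = participants[(i + 1) % n]  # wrap around at the end
        schedule.append((pilot, copilot, turn_minutes))
    return schedule
```

With eight participants, that’s eight seven-minute turns — just under an hour of hands-on editing, which matches the scale of a typical meetup session.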

Asaf (and various online writeups and videos) had warned me of the common pitfalls. Above all, I was supposed to be rigid on one point…ensure that the audience keeps quiet! That’s necessary, I was assured, to permit the “magic” to develop between the pilot and the copilot, and to permit each team to approach the task in their own way without too much interference or distraction.

Well, there’s only one thing to say on that point: I failed. After the very first round, I was seduced by an eager participant to mix it up. Our second copilot wanted to act as a facilitator, moving around the room with the microphone, to take suggestions and comments from the audience. To be honest, I didn’t want to do that — I knew this was just the thing Asaf had warned me about. But I’m a sucker for enthusiasm, and our copilot had it in spades. So we gave that a shot.

To my eye, it made it more difficult for the pilot, who had a cacophony of ideas coming from multiple sources, making it difficult to chart his own course — which is the main thing Wikidojo is designed to showcase. It was also difficult to get back to the original format, as the audience enthusiasm for voicing suggestions lasted into the later rounds. But, my preferences aside, a show of hands afterward (9 to 7 vote?) indicated it was actually the preferred approach. So, this might be a variant to be explored further.

As I see it, there’s so much experimentation built into the activity to begin with, that changing the format midstream is disorienting. I’d advise anyone trying a Wikidojo for the first time to be clear about the rules from the beginning (one way or another), and stick with them throughout the event.

This photo was already stored on Wikimedia Commons, making it accessible for inclusion in an article. Photo by Flickr user “wildernice,” licensed CC BY 2.0.

Another factor that made it tough to stick to the original format was…me. I wasn’t so great at following my own rules! Several times, I jumped in with suggestions for the pilot and copilot. I tried to restrain myself, but in some cases I think it was the right thing to do. For instance, the topic we had chosen has only sparse media coverage, and because I had researched it prior to the event, I knew some approaches to searching that would prove to be time-consuming dead ends. I didn’t think having a team take up most of their seven minutes in a fruitless search would be very satisfying, so I intervened. And at another point, a new wiki editor suggested adding an image to the article. He wasn’t worried about it being perfect for the article, as he was mainly interested in learning the process; the pair seemed poised to run a time-consuming search for an image which, at best, would have yielded a photo under copyright, and complications that couldn’t be resolved in the allotted time. I again jumped in, to suggest adding an unrelated photo of a child playing baseball. I think it was a good suggestion, as it permitted them to successfully post a picture in the time allotted.

Overall, I had a lot of fun, and am looking forward to trying it out again. Before encountering the Wikidojo model, I had often wanted to engage a larger group in improving an article in person, but I didn’t see any effective way to do it. Edit-a-thons tend to follow a less formal structure than Wikidojo; at an edit-a-thon, smaller groups typically form. That’s a great approach too, but it can limit cross-pollination of ideas. Wikidojo is much better suited to a shared experience and shared learning.

The informal feedback I heard was positive; participants enjoyed watching each other work, and a number of people learned new wiki skills. There were some good suggestions, as well; I have collected more detailed notes and feedback on this wiki page. The next time I do it, I think I’d choose a less esoteric topic, that’s easier to research, and that the audience already knows to some degree. But even with a few unexpected twists and turns, it was a lot of fun, and it stimulated some great discussion about how different people approach wiki.