For our course on the impacts of the internet, we developed an exercise to get our students thinking critically about the credibility of things they find on the web. As a number of colleagues have expressed an interest in it, I thought I would post it here. Feel free to use it and adapt it!

Near the beginning of the course, we set the students to read the chapter “Crap Detection 101: How to Find What You Need to Know, and How to Decide If It’s True” from Rheingold & Weeks’ book Net Smart. During the tutorial, we get them working in small groups and give them several carefully selected web pages to test their skills on. We pick web pages that we think are neither too easy nor too hard, and use a mix of credible and misleading ones. It’s a real eye-opener for our students.

To guide them in the activity, we give them the following list of tips (originally distilled from the book by our TA, Matt King, who wrote the first draft of the worksheet).

Tactics for Detecting Crap on the Internet

Here’s a checklist of tactics to help you judge the credibility of web pages. Different tactics will be useful for different web pages – use your judgment to decide which to try first. If you find some of them don’t apply, or don’t seem to give you useful information, think about why that is. Make notes on the credibility of each web page you explore, and on which tactics you used to determine its credibility.

@Victor: The checklist doesn’t say poor grammar correlates with low credibility. It merely prompts the students to think about this. The key idea is to get them to think more about the relationship. The amount of effort that’s been put into copy-editing is a useful indicator, at least of the seriousness of the website owners, even if that doesn’t always correlate with credibility. People who don’t care about truth and accuracy tend not to put that kind of effort into careful writing either, but that’s quite different from the occasional slip that non-native speakers make.

Nice, though I find the criteria under 4) Connectedness and 5) Design & Interactivity a little odd, as they seem to confuse popularity (4) and slickness (5) with quality. A pseudo-scientific blog such as WUWT would rank highly on the criteria under 4, for example.

In addition to the other criteria, which make a lot of sense, I think there could be more, especially on spotting logical fallacies. My attempt at a bullshit detector for scientific topics (with examples from climate science, though most of the heuristics are generally applicable, I think) is here:

As the Tofflers say: “Science is different from all the other truth-test criteria. It is the only one that itself depends on rigorous testing.” They go on to say: “In the time of Galileo . . . the most effective method of discovery was itself discovered.” [Namely Science.] The Tofflers also say that: “The invention of scientific method was the gift to humanity of a new truth filter or test, a powerful meta-tool for probing the unknown and—it turned out—for spurring technological change and economic progress.” All of the difference in the way we live now compared to the way people lived and died 500 years ago is due to Science.

One suggestion (you mention using Alexa, so perhaps this covers it): use the method Perlman suggested in _The Long Con_, and consider the advertising that appears on the site, asking what kind of people these advertisers are expecting to reach by advertising there.

The guidelines are pretty good. However, sometimes it helps to have an oversized, well-marinated brain. For example, in late 2002 and early 2003 almost all US media insisted that Iraq had, or was developing, weapons of mass destruction. I could write a book about how this lie was created, polished, disseminated and bandied around by all the cognoscenti, including my former hero, Colin Powell. And I could write about other examples, some of which I do write about, and some of which I can’t, because I’d be likely to turn up literally dead in a trash can. The worksheet is fine, but in the end one has to be educated, and learn to smell.

Fernando, so you agree, but then finish by suggesting we trust our gut (which #7 warns against) above all? Weird.

Your example is a particularly good one of where this list fails, because all the information, good or bad, was coming from a very short list of secretive intelligence agencies. The situation with climate change science is quite different, however.