An anonymous reader writes: Cory Doctorow tells us that in 2007, John Goerzen scraped every gopher site he could find (gopher was a menu-driven, text-only precursor to the Web; I got my first online gig programming gopher sites). He saved 780,000 documents, totalling 40GB. Today, most of this material is offline, so he's making the entire archive available as a .torrent file; the compressed data is only 15GB. Wanna host the entire history of a medium? Here's your chance!