The publishers Springer and IEEE announced they would remove the papers from their subscription services after French researcher Cyril Labbé demonstrated the works were not original scientific work at all, but rather "computer-generated nonsense."

Among the works was, for example, a paper published as a proceeding from the 2013 International Conference on Quality, Reliability, Risk, Maintenance, and Safety Engineering, held in Chengdu, China. (The conference website says that all manuscripts are "reviewed for merits and contents".) The authors of the paper, entitled 'TIC: a methodology for the construction of e-commerce', write in the abstract that they "concentrate our efforts on disproving that spreadsheets can be made knowledge-based, empathic, and compact".

It's still unclear who submitted the papers, or why. According to Labbé, though, they're easy to spot: he has built an applet that lets users upload a paper and determine whether it was generated using SCIgen. That makes it all the more unsettling that these papers found their way into such esteemed publications (and into their subscription offerings, no less).

Labbé developed a way to automatically detect manuscripts composed by a piece of software called SCIgen, which randomly combines strings of words to produce fake computer-science papers. SCIgen was invented in 2005 by researchers at the Massachusetts Institute of Technology (MIT) in Cambridge to prove that conferences would accept meaningless papers — and, as they put it, "to maximize amusement" (see 'Computer conference welcomes gobbledegook paper'). A related program generates random physics manuscript titles on the satirical website arXiv vs. snarXiv. SCIgen is free to download and use, and it is unclear how many people have done so, or for what purposes. SCIgen's output has occasionally popped up at conferences, when researchers have submitted nonsense papers and then revealed the trick.
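The "randomly combining strings of words" approach SCIgen uses is essentially random expansion of a context-free grammar. A minimal sketch of the idea is below; the grammar rules here are a made-up illustration (loosely echoing the abstract quoted above), not SCIgen's actual grammar.

```python
import random

# Hypothetical toy grammar illustrating the SCIgen-style technique:
# each nonterminal maps to a list of alternative expansions, and we
# pick one at random at every step. This is NOT SCIgen's real grammar.
GRAMMAR = {
    "SENTENCE": [
        ["We", "VERB_PHRASE", "that", "NOUN_PHRASE", "can be made", "ADJ_LIST", "."],
    ],
    "VERB_PHRASE": [
        ["concentrate our efforts on disproving"],
        ["argue"],
        ["demonstrate"],
    ],
    "NOUN_PHRASE": [["spreadsheets"], ["hash tables"], ["neural networks"]],
    "ADJ_LIST": [
        ["knowledge-based, empathic, and compact"],
        ["scalable, secure, and wearable"],
    ],
}

def expand(symbol):
    """Recursively expand a grammar symbol; terminals are returned as-is."""
    if symbol not in GRAMMAR:
        return symbol
    production = random.choice(GRAMMAR[symbol])
    return " ".join(expand(s) for s in production)

print(expand("SENTENCE"))
```

Because every expansion is grammatically well-formed but semantically random, the output reads as plausible-sounding gibberish, which is also what makes it statistically detectable by tools like Labbé's.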

Read more about Labbé's revelation at Nature News. Curious if you can distinguish between a spoof paper and the real deal? Have a go at arXiv vs. snarXiv (I managed to get 7 out of 10 on my last go, but I'll admit I've scored much lower in the past).