
The Number of Textual Variants: An Evangelical Miscalculation

In the Baker Encyclopedia of Christian Apologetics, by Norm Geisler (Grand Rapids: Baker, 1998; p. 532), there is a comment about the number of textual variants among New Testament manuscripts:

Some have estimated there are about 200,000 of them. First of all, these are not “errors” but variant readings, the vast majority of which are strictly grammatical. Second, these readings are spread throughout more than 5300 manuscripts, so that a variant spelling of one letter of one word in one verse in 2000 manuscripts is counted as 2000 “errors.”

Geisler here follows Neil R. Lightfoot's How We Got the Bible, which had put the point this way:

From one point of view it may be said that there are 200,000 scribal errors in the manuscripts, but it is wholly misleading and untrue to say that there are 200,000 errors in the text of the New Testament. This large number is gained by counting all the variations in all of the manuscripts (about 4,500). This means that if, for example, one word is misspelled in 4,000 different manuscripts, it amounts to 4,000 “errors.” Actually in a case of this kind only one slight error has been made and it has been copied 4,000 times. But this is the procedure which is followed in arriving at the large number of 200,000 “errors.”

In other words, Lightfoot was claiming that textual variants are counted by the number of manuscripts that support such variants, rather than by the wording of the variants. This book has been widely influential in evangelical circles. I believe over a million copies of it have been sold. And this particular definition of textual variants has found its way into countless apologetic works.

The problem is, the definition is wrong. Terribly wrong. A textual variant is simply any difference from a standard text (e.g., a printed text, a particular manuscript, etc.) that involves spelling, word order, omission, addition, substitution, or a total rewrite of the text. No textual critic defines a textual variant the way that Lightfoot and those who have followed him have done. Yet, the number of textual variants comes from textual critics. Shouldn’t they be the ones to define what this means since they’re the ones doing the counting?

Let me demonstrate how far off Lightfoot’s definition is. Among the Greek manuscripts of the New Testament, we know of about 3000 Gospels manuscripts, 800 Pauline manuscripts, 700 manuscripts of Acts and the general letters, and just over 300 manuscripts of Revelation. These numbers do not include the lectionaries, over 2000 of them, which are mostly of the Gospels. At the same time, not all the manuscripts are complete copies. The earlier manuscripts are fragmentary, sometimes covering only a few verses. The later manuscripts, however, generally include at least all four Gospels, or Acts and the general letters, or Paul’s letters, or Revelation. But an average estimate is that for any given textual problem (more witnesses in the Gospels, fewer elsewhere), there are a thousand Greek manuscripts (this assumes that fewer than 20% of all the Greek manuscripts “read” in any given passage; that’s probably a conservative estimate).

Now, textual variants are also counted among the non-Greek manuscripts—the Latin, Coptic, Syriac, Georgian, Armenian, Ethiopic, and other early translations. There are somewhere between 15,000 and 20,000 such versional manuscripts of the New Testament. We don’t really know the exact number because cataloging them has been a lower priority than cataloging the Greek manuscripts. But let’s assume that there are only 15,000 such manuscripts, and that for any given textual problem an average of 3000 of them read in the passage (again, assuming 20%). However, the vast majority of Greek textual problems can’t even be translated into other languages, so we need to restrict this even further. Only about 20% of the textual problems that we have in our manuscripts can be found in the versional witnesses. Thus, on average, there will be 600 versional witnesses for every textual problem: 20% of the time there will be an average of 3000, but 80% of the time there will be none, and 20% of 3000 is 600.

Putting all this together, we can assume an average of 1600 manuscripts being involved in any textual problem. We won’t even count the writings of the church fathers. They quote from the New Testament more than a million times, but the work in them has been painfully slow.
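The running estimates above are simple arithmetic; a short Python sketch (using only the article’s own round figures, which are estimates, not precise counts) makes the steps explicit:

```python
# Rough averages built from the article's own round estimates.
greek_avg = 1000                 # Greek manuscripts reading at a typical passage
                                 # (roughly 20% of the 5000+ Greek manuscripts)

versional_total = 15000          # conservative count of versional manuscripts
reading_share = 0.20             # share assumed to read at any given passage
translatable_share = 0.20        # share of Greek textual problems that survive translation

versional_avg = versional_total * reading_share          # 3000 read at a given passage
versional_weighted = versional_avg * translatable_share  # 600 on average (3000 x 0.20)

combined_avg = greek_avg + versional_weighted            # 1000 + 600
print(int(combined_avg))  # 1600
```

The weighting step is the key move: 3000 versional witnesses are available only for the 20% of problems that translate, so the overall average is 600, not 3000.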

Now, assume that we start with the modern critical text of the Greek New Testament (the Nestle-Aland, 27th edition). Most today would say that that text is based largely on a minority of manuscripts constituting no more than 20% (a generous estimate) of all manuscripts. So, on average, if 1600 manuscripts have a particular verse, the Nestle-Aland text is supported by 320 of them. This would mean that for every textual problem, the variant reading or readings are found in an average of 1280 manuscripts. But in reality, the wording of the Nestle-Aland text is often found in the majority of manuscripts. So we need a more precise way to define things. That has been provided for us in The Greek New Testament according to the Majority Text by Hodges and Farstad. They listed in the footnotes all the places where the majority of manuscripts disagreed with the Nestle-Aland text. The total came to 6577.

OK, so now we have enough data to make some general estimates. Even if we assumed that these 6577 places were the only textual problems in the New Testament (a rather ridiculous assumption, by the way), the definition of Lightfoot could be shown to be palpably false. 6577 x 1280 = 8,418,560. That’s eight million, just in case you didn’t notice all the commas. Based on Lightfoot’s definition of textual variants, this is how many we would actually have, conservatively estimated. Obviously, that’s a far cry from 200,000!
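The multiplication is easy to check; here is a minimal sketch with the article’s figures:

```python
avg_witnesses = 1600        # average manuscripts extant at any textual problem
na_supported = int(avg_witnesses * 0.20)    # ~320 support the Nestle-Aland reading
disagreeing = avg_witnesses - na_supported  # 1280 do not
problems = 6577             # Hodges-Farstad majority-text disagreements

variants_lightfoot_style = problems * disagreeing
print(variants_lightfoot_style)  # 8418560 -- over eight million
```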

Or, to put this another way: this errant definition requires that there be no more than about 150-160 textual problems in the whole New Testament (200,000 ÷ 1280 ≈ 156; e.g., 150 textual problems x 1280 manuscripts that disagree with the printed text = 192,000). If that is the case, how can the United Bible Societies’ Greek New Testament list over 1400 textual problems? And how can the Nestle-Aland text list over 10,000?
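Running the definition backward shows how few textual problems it would permit; again, a sketch with the article’s figures:

```python
claimed_variants = 200_000   # the traditional estimate of textual variants
disagreeing = 1280           # average manuscripts disagreeing with the printed text

max_problems = claimed_variants / disagreeing
print(max_problems)          # 156.25 -- i.e., only about 150-160 textual problems
print(150 * disagreeing)     # 192000 -- the round-number version of the same check
```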

And again, this eight million is not even close to the actual number. I took a very conservative approach by looking only at the differences from the majority of manuscripts. But if one started with Codex Bezae as the base text for the Gospels and Acts and Codex Claromontanus for Paul’s letters, the number of variants (counted the wrong way, of course) from these two would be astronomical. My guess is that it would be well over 20 million. Or if one started with Codex Sinaiticus, the only complete New Testament written with capital (or uncial) letters, the number would probably exceed 30 million—largely because Sinaiticus spells words in some strange ways not shared by many other manuscripts. You can see that defining a textual variant as wording differences multiplied by manuscripts is rather faulty. Counting this way results in tens of millions of textual variants, when the actual number is minuscule by comparison. And that’s because we count only differences in wording, regardless of how many manuscripts attest to them.

All this is to say: a variant is simply the difference in wording found in a single manuscript or a group of manuscripts (either way, it’s still only one variant) that disagrees with a base text. Further, there aren’t only 200,000. That may have been an accurate estimate in 1963, when we knew of far fewer manuscripts. But with the work done on Luke’s Gospel by the International Greek New Testament Project, Tommy Wasserman’s work on Jude, and Muenster’s work on James and 1-2 Peter, the estimates today are closer to 400,000.

Although this may leave us feeling uneasy, we absolutely must be honest with the data. I would urge those of you who have used Lightfoot’s errant definition to abandon it. It’s demonstrably wrong, and citing it reveals an ignorance about textual criticism. I would hope that the publishers of numerous apologetics books would get the data right. The last thing that Christians need to do is to latch on to some spurious ‘fact’ in defense of the faith. Instead, we should pursue truth at all costs, even at the risk of making us feel uncomfortable.

Daniel B. Wallace has taught Greek and New Testament courses on a graduate school level since 1979. He has a Ph.D. from Dallas Theological Seminary, and is currently professor of New Testament Studies at his alma mater.