Tag: biblical inerrancy

In a nutshell, the book is a brief on alleged problems with the New Testament—how transcription errors, going back even to the first and second centuries, may have altered the original authors’ meaning.

If these errors exist, this is especially problematic for Christians today, as transcription errors compound over time. For example, if a second-century scribe of the Gospel of Mark made a small error when copying the original text, then further errors made on the same inaccurate copies theoretically may drive the text’s meaning further and further from the original. And when the text is then translated from the original Greek into Latin, and then into the language we speak today, the errors become fatal to our understanding of what the author intended to say.

Now, many copies of second- and third-century New Testament transcriptions exist, such that scholars are able to compare across versions to identify, and even correct, copyist errors. Trying to find the text’s original meaning is anything but a hopeless task. And much of what we read in study Bible footnotes today is a result of such research, helping to clarify the meaning of the text based on new developments in textual criticism of the Bible (some of which support Christians’ historical and accepted interpretations).

But Ehrman uses several examples to argue that errors do exist in the Bible, and that some of these errors alter the text’s meaning in arguably significant ways. Some of these errors may even have implications for how we interpret the Bible’s teachings about larger, central doctrines (the origin of Christ’s divinity, for example).

Whether Ehrman is right or wrong about these specific examples, his larger point about the possibility of copyist error is worth considering. It doesn’t have to undermine your belief in the Bible’s inerrancy, and it definitely does not serve, standing on its own, as a strong argument against the Bible’s historicity or overall accuracy, even in regards to its claims about Jesus as the Son of God. Again, the accuracy of almost the entire book is undisputed, and Ehrman’s examples (conceivably the best ones he knows) are few and far between, and do not necessarily or directly alter the authors’ obvious and oft-repeated points about central Christian doctrines.

That said, here are my own thoughts and questions on the topic—some of which I’ve alluded to above. Some oppose Ehrman’s general argument (“anti-skeptic” below), and others support it (“pro-skeptic”).

Anti-skeptic

Given the sheer word count of the New Testament, Ehrman’s handful of examples still leave almost the entire book untouched by error—at least given what we know from the set of ancient manuscripts we have today.

Was the possibility of copyist error not well-known to ancient Christians? Did they have no idea that their version of a text might be slightly altered? The third- and fourth-century Church Fathers discussed this heavily—wouldn’t such concerns have existed even in the first and second centuries, when extant copies were only one or two generations from the original? Would this influence the copies that Christians chose to keep, and would they not consider correcting errors they uncovered? Ehrman does note that early Christians were likely to come from lower, illiterate classes of people, but he also argues that it was likely wealthier, educated Christians who oversaw the texts’ transcriptions. Even if the first copyists weren’t “professionals,” how was error understood by educated people? Did they expect it from copies of other texts?

Pro-skeptic

Words do not speak for themselves. Our understanding of language—even our attempt to understand language in its original historical context—is shaped by our cultural and intellectual setting. Case in point: the impossibility of perfect translation across languages, and the inevitable shortcomings of our attempts even to convey these differences. This is an argument against the notion of an infallible text, in general, as a sensible and/or useful concept. What does it matter if a text is infallible if we can never be sure of the author’s true meaning?

Why “the Word of God”? From where does this idea come? Why inerrant? Why infallible? I’m sure there’s a specific answer to this—the origins of the notion (doctrine?) of Biblical inerrancy—but I don’t know it. My first guess, since I see no claim to inerrancy in the text itself, is that the concept arises out of necessity—that without it, we have nothing. Being 2,000 years removed from the New Testament’s writing, we have no way to argue against those who level new claims about Christ and his teaching. If we have not Scripture, have we nothing? In general, the concept of inerrancy confuses me.

Random

If God’s hand was over the writing of the New Testament, why not over the translation(s) thereof? I make no claim here about what I actually believe. I’m simply raising the question both to those who claim the Bible is God’s Word and those who claim it is not.

When Constantine cemented the role of Christianity into Roman political life, beginning in 313 AD with the Edict of Milan, did that not create a huge incentive on the part of state leaders to perfectly define Christian doctrines and eliminate ambiguity about the text? Did not the Roman Church-State want to put down serious inquiries about the veracity of Scripture or the doctrines even the emperor himself believed? Did not transcribing the text then become an act with potentially severe political consequences, thereby encouraging nefarious copyists to alter or distort (or clarify, for that matter) the text where it helped the cause of one powerful force or another? I’m not a historian, and maybe someone will correct me on this point.