# The Majority Text — Statistical Model



## Robert Truelove (Dec 1, 2015)

This is a simplified presentation of the statistical model for the Majority Text of the Greek New Testament.

https://www.youtube.com/watch?v=Lsl9XOFZao4


----------



## Captain Picard (Dec 1, 2015)

Some questions on my first walkthrough of the video.

1) Do you believe the theological position of men such as Zane Hodges has any impact on their influence of the position described?

2) Would you recommend the book at 2:00-ish as representative of your position?

3) What is the number one source for a beginner that you would cite regarding the theory of the Lucian Recension?

4) Disregarding a "deliberate" recension, can we make room for a "persecutorial recension" regarding the Islamic conquest of Egypt? Dr. White has advanced this explicitly. I hope you will address this in your next video.

5) Can you walk through our real, extant, manuscript evidence on the LEM (for example) and show that it holds to your statistical model? Why or why not?

Thank you so much for your zeal for God's Word, brother.


----------



## Bill The Baptist (Dec 1, 2015)

Good stuff as always Robert. Thank you for sharing.


----------



## Logan (Dec 1, 2015)

Thanks! Good things to think about.

I want to be careful though of making claims based on a model that itself is based off certain assumptions. Using this model to critique the Critical Text seems only marginally helpful, as it requires certain assumptions, and this model only applies to certain scribal errors. Under other models perhaps the Critical Text is "vindicated" against the Majority, but it depends on the assumptions made.

For example, if an error commonly occurs but is such that no one corrects it, by this model you would end up with far more copies with errors than correct copies. 
Or if more errors are introduced in subsequent generations, it might be harder to catch earlier errors.

The model can be helpful but is heavily reliant upon its assumptions, and cannot bear more than the assumptions will allow (i.e., words like "vindicates" or "proves" are beyond the realm of its capabilities).
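
Logan's uncorrected-error scenario can be made concrete with a toy closed form. This is only a sketch under one simplifying assumption (not the model from the video): every act of copying independently introduces the error with a fixed probability, and no scribe ever removes it.

```python
def error_share(n, p_intro, start=0.0):
    """Expected share of copies carrying an uncorrected error after n
    generations, assuming each act of copying independently introduces it
    with probability p_intro and no scribe ever removes it.

    Closed form: share_n = 1 - (1 - start) * (1 - p_intro) ** n
    """
    return 1 - (1 - start) * (1 - p_intro) ** n

# With a 10% chance of introduction per copy and zero correction, flawed
# copies outnumber clean ones within 10 generations:
print(error_share(10, 0.10))  # ~0.65
```

The 10% rate is purely illustrative; the point is that with no correction pressure the error's share only ever grows, which is exactly the case Logan says the model does not cover.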


----------



## Bill The Baptist (Dec 1, 2015)

Logan said:


> Thanks! Good things to think about.
> 
> I want to be careful though of making claims based on a model that itself is based off certain assumptions. Using this model to critique the Critical Text seems only marginally helpful, as it requires certain assumptions, and this model only applies to certain scribal errors. Under other models perhaps the Critical Text is "vindicated" against the Majority, but it depends on the assumptions made.
> 
> ...



I think you make a good point as it pertains to the concept of "proof". This model cannot really prove anything, but I think its power lies in showing how exceedingly unlikely it is for a reading that is found in only 5-10% of manuscripts to be the original reading. As demonstrated in the model, a reading found in just 33% of the first-generation copies would increase to almost 50% within 12 generations. That means a variant found in only 10% of manuscripts today would have been present in far less than 1% of the first-generation copies, and it seems exceedingly unlikely that a reading present in less than 1% of the first-generation copies could have been the correct reading. Of course this assumes the model is correct, but that assumption is based on what we know about how manuscripts would have been reproduced.

One of the key weaknesses of the critical text, as pointed out by Dr. Robinson and others, is that it utterly fails to provide any theory for how the text came to be as it exists today, nor does it even attempt to. Perhaps we should ask ourselves why this is the case.
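
Bill's generation figures can be sanity-checked with a one-line recursion. This is a toy sketch, not the video's actual model; the 2% per-copy introduction rate below is an assumption chosen purely to illustrate the kind of dynamic he describes.

```python
def variant_share(start, generations, p_intro, p_correct):
    """Expected share of manuscripts carrying a variant, assuming each act
    of copying independently introduces it with probability p_intro and
    removes (corrects) it with probability p_correct."""
    s = start
    for _ in range(generations):
        s = s * (1 - p_correct) + (1 - s) * p_intro
    return s

# An uncorrected variant introduced in 2% of copies climbs from 33% to
# roughly 47% of copies within 12 generations:
print(variant_share(0.33, 12, 0.02, 0.0))

# With 2% correction pressure and no fresh introduction, the same 33%
# minority shrinks to roughly 26% instead:
print(variant_share(0.33, 12, 0.0, 0.02))
```

The direction of drift depends entirely on the assumed balance between introduction and correction, which is Logan's caution about assumptions in miniature.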


----------



## Robert Truelove (Dec 1, 2015)

Logan said:


> Thanks! Good things to think about.
> 
> I want to be careful though of making claims based on a model that itself is based off certain assumptions. Using this model to critique the Critical Text seems only marginally helpful, as it requires certain assumptions, and this model only applies to certain scribal errors. Under other models perhaps the Critical Text is "vindicated" against the Majority, but it depends on the assumptions made.
> 
> ...



Bill nailed it!

The idea that there are errors that commonly occur, that no one corrects, and that end up in the 80%+ majority is highly improbable and very difficult to square with "reasoned transmissionalism". The text was copied by thousands of scribes over a broad geographical area and across 1,500 years. It's not that it would be absolutely impossible for this to happen; it's that it is highly improbable considering how the New Testament was copied.

Also...this is why the model is adjusted to treat 80%+ attestation as determinative for original readings and 20% or less as marking minority readings. The reality is, most of them are at 90%+ for the MT and 5% or less for the CT. The model demonstrates the mathematical probability of an overwhelming majority and an overwhelming minority in relation to the original. Where there is not 80%+ attestation among the extant manuscripts, the model doesn't factor in and one has to use another form of criticism (I prefer Robinson's Byzantine Priority), but MOST cases fall into the 80%+ category.

This is what virtually every argument that has been made against the statistical model for the MT fails to comprehend. It's not simply "counting noses". If there was a historical situation that "breaks the model" (Lucian Recension Theory?), it must be demonstrated. Rather, what I see the CT guys doing is simply asserting "historical problems" with the Byzantine majority without presenting any solid, historical transmissional model to account for the Byzantine Text.


----------



## Robert Truelove (Dec 1, 2015)

Captain Picard said:


> Some questions on my first walkthrough of the video.
> 
> 1) Do you believe the theological position of men such as Zane Hodges has any impact on their influence of the position described?
> 
> ...



*1) Do you believe the theological position of men such as Zane Hodges has any impact on their influence of the position described?*

I didn’t know Zane Hodges (and I know I have sharp differences with what he taught concerning “free grace/easy believism”). 

However, I do know Dr. Maurice Robinson, and have had quite a bit of correspondence with him and even had lunch with him a couple weeks ago. For his part, I can tell you he is thoroughly committed to the doctrine of preservation. 

However, Dr. Robinson’s concern is that his position not be taken for a “faith argument”. He believes the science is so sound, reasonable and convincing on its own for the Majority Text/Byzantine Priority that the theological issue is not necessary to prove it. 

Rather, for Robinson, the fact that the Majority Text/Byzantine Priority approach happens to line up stunningly with the Bible’s own teaching on textual preservation is “icing on the cake”.

I think Dr. Robinson’s work in textual criticism represents the next step in this scholarship, carrying forward the work of those who have come before him, and I personally find it to be, in general, thoroughly Biblical.

*2) Would you recommend the book at 2:00-ish as representative of your position?*

I think much of Pickering’s book is good. His ultimate conclusion is that manuscript Family 35 within the Byzantine tradition is THE text (settling all textual issues). I think that is a bit too optimistic as an absolutist position, BUT it presents a text that differs by less than 0.5% from both the Majority Text of Hodges and Farstad and the Byzantine Text of Robinson and Pierpont.

Once you narrow it down to the Traditional Text, the varying positions within this camp all produce editions very close to one another because the Byzantine Text is so well attested.

For me, I’d say I find Dr. Robinson’s position to be the most compelling in part because he utilizes some of the better observations from the broader school of textual criticism to deal with variants within the Byzantine Text where a clear majority does not settle the matter.

Dr. Robinson’s Case for the Byzantine Priority is a must read…

http://rosetta.reltech.org/TC/v06/Robinson2001.html

…However, it suffers from a lack of accessibility for the beginner, as it presupposes one has already attained a solid grasp of the field of textual criticism (it’s a scholarly article for scholars). It will be over the heads of even most pastors unless they have invested the time in learning the principles and nomenclature of the field.

One of the things I am trying to do is boil such scholarship down into simple, bite-sized pieces for a popular audience.

*3) What is the number one source for a beginner that you would cite regarding the theory of the Lucian Recension?*

Here is a rather detailed article from Bruce Metzger on the Lucian Recension Theory…

http://textualcriticism.scienceontheweb.net/AA/Metzger-Lucianic.html

*4) Disregarding a "deliberate" recension, can we make room for a "persecutorial recension" regarding the Islamic conquest of Egypt? Dr. White has advanced this explicitly. I hope you will address this in your next video.*

This is precisely the sort of thing I’m addressing in my next video dealing with textual transmission. It is honestly absurd to speak of the Islamic expansion as having interrupted the normal transmission of the Greek New Testament (at least before the 15th century), principally because the eradication of a regional text (upper Egypt, a place where Greek was not even the native language!) had virtually no effect upon the broader transmission of the text. The same goes for the issue of the Latin tradition dominating the West. The Greek text was always principally preserved in the Greek-speaking church, where we have a consistent, 1,500-year history of copying.

*5) Can you walk through our real, extant, manuscript evidence on the LEM (for example) and show that it holds to your statistical model? Why or why not?*

While no one can present an actual transmissional history showing a parent/child family tree, we can demonstrate a theory of transmissional history that lines up very well with the evidence found in our extant manuscripts. Dr. Robinson refers to this as “reasoned transmissionalism” and sees the lack of this within the popular approach to textual criticism (that behind the Critical Text) as its weakest link.

I’ll try to demonstrate this somewhat in the next video in the series.


----------



## Logan (Dec 1, 2015)

I'm simply trying to warn, based on one of my areas of (partial) expertise (mathematical modeling) that predictions and conclusions are always limited by the assumptions: the model cannot show more than it is designed to show, and words like "validate" simply won't apply. Models can be helpful in building confidence or understanding but there are limitations. A model is a tool, not a fact.

And even then the probabilities of certainty are dependent on these assumptions. What is the probability of a false positive? A false negative? What is a statistically significant sample size? I don't have any answers but I'd just be cautious without thoroughly understanding the limitations of the model.


----------



## Robert Truelove (Dec 1, 2015)

Logan said:


> I'm simply trying to warn, based on one of my areas of (partial) expertise (mathematical modeling) that predictions and conclusions are always limited by the assumptions: the model cannot show more than it is designed to show, and words like "validate" simply won't apply. Models can be helpful in building confidence or understanding but there are limitations. A model is a tool, not a fact.
> 
> And even then the probabilities of certainty are dependent on these assumptions. What is the probability of a false positive? A false negative? What is a statistically significant sample size? I don't have any answers but I'd just be cautious without thoroughly understanding the limitations of the model.



Agreed. "Validate" in its technical sense would indeed be incorrect when we are discussing models demonstrating probability. 

The full statistical model is far more advanced than my presentation (which is why most people don't even understand the basic premise). That's why I broke it down in such a simple fashion...so people can get a handle on the basics of the argument.

Most still seem to think the MT is simply about "counting noses". When you hear that phrase in relation to the MT, you can rest assured whoever just said it doesn't really understand the position.


----------



## Robert Truelove (Dec 1, 2015)

Dr. Maurice Robinson emailed just now reminding me of the relevance of this quote from Dr. Hodges to this discussion...

"No one has yet explained how a long, slow process spread out over many centuries as well as over a wide geographical area, and involving a multitude of copyists, who often knew nothing of the state of the text outside of their own monasteries or scriptoria, could achieve this widespread uniformity out of the diversity presented by the earlier forms of text. Even an official edition of the New Testament -- promoted with ecclesiastical sanction throughout the known world -- would have had great difficulty achieving this result as the history of Jerome's Vulgate amply demonstrates. But an unguided process achieving relative stability and uniformity in the diversified textual, historical, and cultural circumstances in which the New Testament was copied, imposes impossible strains on our imagination. (Hodges,Appendix C, in Pickering _Identity of the NT Text_, 166).


----------



## Captain Picard (Dec 1, 2015)

The videos are always interesting Pastor Truelove. Thanks again!


----------



## MW (Dec 1, 2015)

Bill The Baptist said:


> but I think its power lies in showing how exceedingly unlikely it is for a reading that is found in only 5-10% of manuscripts to be the original reading.



This demonstrates Logan's original point about "assumption." One has to assume something concerning the mss. which have come down to us.

Once again my criticism of this theory is its claim to be empirically based when its starting-point has no empirical basis.

It seems to me that the first statistical analysis pertaining to a "majority" text should concentrate on determining what the "majority" is supposed to be the "majority" of. Invoking the "New Testament" as a canonical concept only serves to hide the fact that this theory is attempting to reconstruct the text while it presupposes its preservation.


----------



## Bill The Baptist (Dec 1, 2015)

MW said:


> Bill The Baptist said:
> 
> 
> > but I think its power lies in showing how exceedingly unlikely it is for a reading that is found in only 5-10% of manuscripts to be the original reading.
> ...



Indeed you are correct in pointing out that this theory does not prove anything and relies on presuppositions that may or may not be true. Despite this, it is still helpful in the same way in which philosophical arguments for God are helpful. They do not prove anything, but they do help us to understand and to think clearly about the evidence that we see all around us.

Regarding biblical manuscripts, the evidence we see is that there is an exceedingly large number of manuscripts that are largely in agreement, and an exceedingly small number that disagree with the majority to a much greater extent. This reality did not occur by accident but is the result of a process of transmission that has descended down through the centuries.

Again, this proves nothing in and of itself, but it does demonstrate the greater likelihood of the Byzantine readings being the correct readings, and it further helps us to see that the Ecclesiastical text is reliable and that God was indeed faithful in preserving the text. When viewed in this fashion, theories such as the one Robert has presented do not serve to undermine or replace the Ecclesiastical position, but rather to establish it.


----------

