Post by Admin on Aug 11, 2016 20:28:23 GMT
When discussing any academic discipline and the allegedly "scientific" methodology it employs, it is instructive to know the following....
In Statistics vs. Judgment at DelanceyPlace.com, August 11, 2016, Richard Vague focuses on the book Thinking, Fast and Slow by Daniel Kahneman,
summarizing an excerpted section of that book that deals with the work of Paul Meehl.
Vague writes:
"In today's encore selection -- from Thinking, Fast and Slow by Daniel Kahneman. Statistics versus judgment. In his book Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, psychoanalyst Paul Meehl gave evidence that statistical models almost always yield better predictions and diagnoses than the judgment of trained professionals. In fact, experts frequently give different answers when presented with the same information within a matter of a few minutes...."
Meehl's findings have been repeatedly confirmed over the decades. This knowledge is important when assessing the conclusions about man's history reached by academic disciplines that have traditionally followed the "authoritarian" model, best exemplified by one professor quoting another professor quoting yet another, i.e. the assessment of evidence based on academic "authority".
Empirical evidence shows CLEARLY that such an "authority"-based model of the allegedly "scientific method", resting on the "opinions" of the "experts", is in fact inferior to simpler algorithms that use fewer variables and rest on direct analysis of the probative evidence, viz. statistical data. In other words: LOOK at the things themselves; do not defer to complexity.
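To make the "simpler algorithms using fewer variables" idea concrete, here is a minimal sketch of one such algorithm in the spirit of Meehl's findings: a unit-weighted linear model that standardizes each predictor and simply sums them, with no expert-tuned weights at all. The data below are invented purely for illustration.

```python
# A minimal sketch of a "simple algorithm": a unit-weighted linear model.
# Each predictor is z-scored and the z-scores are summed; no expert
# judgment and no fitted weights are involved. Illustrative data only.

from statistics import mean, pstdev

def unit_weighted_scores(rows):
    """rows: list of equal-length lists of numeric predictors.
    Returns one score per row: the sum of z-scored predictors."""
    cols = list(zip(*rows))
    mus = [mean(c) for c in cols]
    sds = [pstdev(c) or 1.0 for c in cols]  # guard against zero variance
    return [sum((x - m) / s for x, m, s in zip(row, mus, sds))
            for row in rows]

# Three hypothetical cases, each described by three predictors.
cases = [
    [10.0, 2.0, 5.0],
    [12.0, 3.0, 7.0],
    [8.0, 1.0, 3.0],
]
scores = unit_weighted_scores(cases)
# The second case dominates on every predictor, so it scores highest.
best = max(range(len(cases)), key=lambda i: scores[i])
```

The point of the sketch is not this particular formula but the principle behind Meehl's result: a transparent rule applied consistently to the raw data, unlike expert judgment, gives the same answer every time it sees the same evidence.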
One of the things that we emphasize in our own research work is that it is often the SIMPLER solutions that are correct in resolving mysteries of the ancient world.
Put simply: the "experts" will tell you all kinds of complex creative things about what they claim to have discovered, whereas we demand, "just the facts please", as far as possible.
That is why we are relentless and unforgiving about the countless erroneous conclusions found in archaeologically-related professional publications and in the humanities in general.
Time and again we find megalithic sites that are not fully documented, indeed poorly documented, a situation which nevertheless does not stop the "experts" from casting their opinions prematurely and propagating erroneous conclusions to the next generation of spoon-fed academics, who then do the same. Avebury Henge is a good example: there are many books about Avebury, but not a single work that properly documents the evidence, which in the last analysis consists of the raw stones themselves. Simple, really.
Do that FIRST, folks.
This does not mean, of course, that experts are of no value. They can be quite valuable, if they do their work right.
In that spirit, it is now useful to read Pauketat in the previous thread on this board, whose statements on the state of archaeological research may now make more sense.