Thirty-five years ago, J.M. Adovasio, an archaeologist from the University of Pittsburgh, was doing fieldwork at a rock shelter in southwestern Pennsylvania. The first 10 feet of the excavation at Meadowcroft, as the property was known, had revealed evidence, mostly in the form of prehistoric fireplaces, of 20 episodes of human occupation going back thousands of years. By the time the team hit a layer of rocks—the product of a rock fall, what archaeologists call a “spall”—the artifacts were more than 10,000 years old.
Then, in a move that would change their lives forever, Adovasio and his team broke through the layer of rocks to see what was underneath. What they found were “decidedly unfamiliar but undeniably human” artifacts. At a level corresponding to 12,000 years ago, they found a spear point. At a level corresponding to 15,000 years ago, they found firepits and remnants of basketry and cordage.
Field archaeology 101: the fact that an artifact is found in a stratum that is 15,000 years old doesn’t necessarily mean that the artifact itself is 15,000 years old. To establish their age, Adovasio, a careful and meticulous archaeologist, sent charcoal samples from the firepits for radiocarbon testing. As he wrote in The First Americans: In Pursuit of Archaeology’s Greatest Mystery, his reaction to the lab reports wasn’t joy or excitement; it was an expletive. According to the radiocarbon dating, people were living in southwestern Pennsylvania at least 15,000 years ago. As Adovasio knew, as soon as these findings became public, the most important rocks in his life would no longer be the ones strewn on the floor at Meadowcroft—they would be the ones his colleagues would be throwing at him.
That’s because in 1974 the strong consensus among American archaeologists was that the ancestors of American Indians had arrived in the Americas no earlier than approximately 13,000 years ago, at the end of the last Ice Age. These people, known as Clovis after their distinctive spear points first found near Clovis, New Mexico, then spread from Canada to Tierra del Fuego. Along the way, they killed off many of the great beasts of the Americas, such as the Columbian mammoth, the mastodon, and the western camel. They did all of this and a lot more in, at most, a few thousand (perhaps as few as 1,000) years.
The consensus was strong but not without its critics. Linguists argued that the timeline provided by what came to be known as the “Clovis First” theory wasn’t long enough for the hundreds, even thousands, of languages spoken in the Americas to evolve. Geologists and glaciologists noted that the consensus assumed things about the extent of glaciation in late-Pleistocene North America that were, at best, unproven and possibly wrong.
And then there were findings like Adovasio’s. In both North and South America, archaeologists kept finding evidence of human occupation that couldn’t be reconciled with the dates provided by the consensus.
The response to these criticisms was a mixture of closed-mindedness, nitpicking, and, when these didn’t suffice, personal attacks. (Ironically, less than a century ago, suggestions that humans had occupied the Americas as far back as 13,000 years ago were subjected to much the same treatment. The consensus then was that humans had migrated to the Americas approximately 4,000 years ago.) Adovasio experienced all three: when he wasn’t being called a sloppy archaeologist, he had to fend off a series of increasingly implausible objections. When these failed, he was simply accused of fraud.
The experience, as you can tell from the tone of his book, left him more than a little angry. Eventually, vindication came from a finding more than 5,000 miles away: Monte Verde, Chile. There, archaeologists found evidence of human settlement that predated Clovis by at least 1,000 years. While there are still a few “Clovis First” die-hards (there are people who still hold to a geocentric theory of the universe, after all), the consensus has been demolished. And while there is no shortage of hypotheses about how and when modern humans first arrived in the Americas, the answer remains, as Adovasio put it, “archaeology’s greatest mystery.” In the absence of a strictly enforced consensus, archaeologists are free to say “we don’t know.”
Those are three words you don’t hear very often, at least not in public discourse. Much of modern life, from science to politics to entertainment, requires a claim to knowledge far in excess of our actual knowledge.
An obvious example is the debate over anthropogenic global warming. (“Climate change” is a weasel term: the climate has changed throughout Earth’s 4.5-billion-year history.) At the heart of the debate is the assertion that we understand the geophysical processes that drive climate well enough to construct a computer model that will accurately mimic (and thus predict) the real thing.
It’s not “anti-science” or “anti-environment” to question that claim to a kind of omniscience. Physicist Freeman Dyson, who cannot seriously be called either, has written that the models “do not begin to describe the real world that we live in. The real world is muddy and messy and full of things that we do not yet understand.”
Nobel laureate Steven Weinberg has said that he has “the sense that when consensus is forming like ice hardening on a lake, Dyson will do his best to chip at the ice.” That’s because Dyson knows that “in the history of science it has often happened that the majority was wrong and refused to listen to a minority that later turned out to be right.” Whether or not this is such a time, talk of “consensus” is little more than an argument from authority—a far cry from the motto of the Royal Society, Nullius in verba, “on the word of no one.”
It isn’t only “science” as the word is popularly understood—the claim to this kind of knowledge (scientia) is everywhere. In Wired, Felix Salmon told the story of “The Formula That Killed Wall Street.” A single equation, a “Gaussian copula function,” purported to calculate the cumulative risk involved in countless financial transactions involving countless actors facing countless circumstances, any of which could affect their ability to hold up their end of the bargain. The formula “made it possible for traders to sell vast quantities of new securities, expanding financial markets to unimaginable levels” and set the stage for the near-death experience of the global economy. As Salmon noted, the damage was foreseeable and, in fact, foreseen. It didn’t matter; the consensus had solidified thanks, in no small part, to the unprecedented amounts of money people were making.
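For the curious, the equation at the center of Salmon’s story comes from David X. Li’s 2000 paper “On Default Correlation: A Copula Function Approach.” In rough sketch, it gives the joint probability that two parties, A and B, both default within a year:

\[
\Pr[T_A < 1,\ T_B < 1] \;=\; \Phi_2\!\left(\Phi^{-1}(F_A(1)),\ \Phi^{-1}(F_B(1)),\ \gamma\right)
\]

Here T_A and T_B are the times until A and B default, F_A and F_B are their individual default-probability curves, \Phi^{-1} is the inverse of the standard normal distribution, and \Phi_2 is the bivariate normal distribution that joins the two. Notice what the formula claims to know: everything that could bind the fates of A and B together is compressed into a single correlation number, \gamma—and, as Salmon recounts, \gamma was estimated from a few years of market prices gathered in unusually calm times. That one number is where the formula’s claim to knowledge outran the knowledge itself.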
This unwarranted claim to knowledge is a (perhaps the) defining characteristic of modernity. What makes us modern, as John Milbank and others have argued, is the belief that the world and the fullness thereof are ours to dominate, to consume, and, most importantly in this context, to understand. This, of course, requires believing that we can know and describe the universe without reference to God. As Ivan Illich put it, the world has fallen from the hands of God into the hands of humanity.
The problem is that we are not nearly as smart as we think we are, and even if we were, that wouldn’t be nearly smart enough for our knowledge to match our vanity. So we drop the world time and time again. Sometimes the results are rancor and hurt feelings, and sometimes the result is a global economic meltdown that leaves hundreds of millions of people even more desperate than they were before. Who knows? Maybe one day we will really go too far. No doubt there will be a strong consensus in favor of that bit of folly as well.
Roberto Rivera is a senior writer for BreakPoint.
Articles on the BreakPoint website are the responsibility of the authors and do not necessarily represent the opinions of Chuck Colson or Prison Fellowship. Outside links are for informational purposes and do not necessarily imply endorsement of their content.