Both the accomplishments and the approach of science are revered as gospel in Western society. If something can be legitimately justified in the name of science, it is respected. As a result, science formulates our view of the world, and the scientific methodology is a gold standard. Apparently, to find what we call “truth,” there is no need to look further; according to the familiar doctrine, if the ways of science have not brought us there yet they surely will do so soon.
I do not aim to provide an alternative vision of “truth” or to parry science with capricious and yet possibly more poignant and edifying disciplines such as English, philosophy, or my own major, political “science” (I’d sooner call it “politics” or “political theory” than pretend such fuzzy matters can be quantified as effectively as the periodic table of elements, but, hey, I don’t make the rules). I wish only to offer grounds for serious introspection on science, its history, and the way we see it. For all we want to believe about the boundless and incontrovertible results science can give, it is imperfect, and it would be beneficial to keep that in mind.
Too often science is seen as a repository to which an additional piece is added each time a novel discovery is made, like strands of thread woven into a blanket. This “process of accretion” is, according to philosopher and historian Thomas Kuhn, a highly misleading way to view the subject. He writes in The Structure of Scientific Revolutions that textbooks often espouse such a view and that “a concept of science drawn from them is no more likely to fit the enterprise that produced them than an image of national culture drawn from a tourist brochure.”
Kuhn asserts that a more appropriate way to view science and its development involves recognizing its basis: each of the subject areas under the broad umbrella of “science” operates from a set of assumptions which must be true in order for anything that follows from them to be true. Much of the success of normal science, according to Kuhn, “derives from the [intellectual] community’s willingness to defend that assumption, if necessary at considerable cost. Normal science, for example, often suppresses fundamental novelties because they are necessarily subversive of its basic commitments.”
At any given period in the progression of a scientific discipline, the community that forms that discipline is operating from a certain set of core beliefs. A shift in those beliefs is indeed revolutionary, for it “requires the reconstruction of prior theory and the re-evaluation of prior fact.” Newton’s laws of motion were an earth-shattering shift in physics, at once crowding out and capitalizing on much previous achievement in the subject. They involved a complete paradigm switch, a profound change in the very basis of the way scholars studied movement. Kuhn writes that Newton’s discovery and others like it are scientific revolutions, and they completely excavate the discipline. Rather than simply adding on, they scrape away the foundation and build a new one. Pre-Newtonian physics is incompatible with the classical mechanics we know today.
Do we, then, continue to call pre-Newtonian physics “science”? Kuhn argues that we should: “Out-of-date theories are not in principle unscientific because they have been discarded.” But including in “science” such theories as geocentrism, which was a widely held view before Copernicus’ day, seems to necessitate a change in what we mean when we talk about science. We see that it is not an accumulation of fact but rather an ongoing conglomeration of theories that develop and give rise, often by accident, to new, better theories that grant scientists greater explanatory power. And we see also, and most importantly, that what scientists study and develop today may well contribute to future theories and ideas but is not necessarily or absolutely “true” in the sense that we want it to be. It is important, it is useful, and it is a contribution – but current scientific developments may very well be (and, thinking historically, probably are) operating under a paradigm that will eventually be called into question and cast aside.
None of this is meant as a slight against the hard sciences, only to point out that science has, throughout history, contradicted itself and will inevitably continue to do so, meaning that the degree to which current scientific judgments may be called true can and should be constantly called into question. Havelock Ellis, a writer and physician of the early twentieth century, noted, “we make our own world; when we have made it awry, we can remake it, approximately truer, though it cannot be absolutely true, to the facts.” The paradigms and methods we use across disciplines to penetrate the mystery of knowledge will never be impregnable, and the best we can do is to “remake” them to fit the world we are constantly discovering. Doers of science and the people who make it into something it is not and broadcast it (I’m looking at you, new media) should bear this in mind.
All of this is simply a call for the uncertain. Being more uncertain, and calling into question the pillars of belief and knowledge we think we have, will allow us to advance further still. Admitting that we are all on fairly shaky ground with regard to “The Truth,” that elusive beast that every discipline seems to believe it has come close to, takes us one step closer to opening our minds and releasing ourselves from the prison of dogma.