The Use Of Uncertainty
Swift
Written by Tauriq Moosa   
Chase after truth like hell and you'll free yourself, even though you never touch its coat-tails. - CLARENCE DARROW

All action must, to a certain extent, be planned in a mere twilight, which in addition not infrequently — like the effect of a fog or moonshine — gives to things exaggerated dimensions and unnatural appearance. - CARL VON CLAUSEWITZ, SPEAKING ABOUT WAR DATA

 

Certainty is one of the oldest problems in philosophy. How do we know what is “true”? Many people, too many in fact, use science as the benchmark for absolute truth. Yet what better opposition to science exists than absolute thinking? We must carefully reflect on science itself as an arbiter of truth and falsehood, carefully outlining why we do not need absolute truth to show that something is almost certainly absolute nonsense.

Strange as it may seem, my medical colleagues openly state that vaccines and treatments for polio, smallpox and so on are not one-hundred-percent effective. That is, it is not absolutely true that taking this or that medication will have the desired effect on the patient. Yet we don’t let uncertainty undermine our investment in the medical profession; indeed, as patients, we are usually told the chances of success with surgery, treatment and so on. Nothing is absolutely guaranteed, except that there is no absolute certainty. Should the mere admission of uncertainty prevent the medical profession from continuing? Should we never partake of life-saving surgery because there is no absolute guarantee of success? Of course not. Nearly everyone who wants treatment partakes of it if the odds are in their favour; that is, fully aware that success is not absolutely guaranteed.

We can’t be absolutely certain there is no god, no “memory retention” in water, no fairies, no Loch Ness monster, no psychic abilities. But our admission of uncertainty must flow both ways: our opponents need to admit that there is uncertainty within their beliefs, too, and they need to be open about the likelihood of their claims’ veracity. If we are open about ours, should it not be the same for them? This is where the gap of faith or belief fits in nicely: because there is a very small likelihood of truth to the claims of, say, homeopathy, its proponents fill in such gaps with appeals to mystery and uncertainty. Whilst science does not deal with mystery – it is useless as a scientific appeal, though perhaps excellent as a catalyst for further research – it does deal with uncertainty. Uncertainty is built into the methodology of science. Uncertainty is why Einstein could replace Newton. Uncertainty is why we cure with drugs, not prayer. Science is the powerful discipline that can send us to space and cure diseases, yet cannot prove, with absolute certainty, that there are no invisible, dancing fairies on my fingertips.

The admission of doubt is the first step toward success. But this doubt should itself be recognised as an admission of mere human fallibility, which is built into us as a species. We are the half-blind man looking for his spectacles in the dark; the deaf composer with sonatas performed in his head. We try to impose some rationality onto the world all the while being imperfectly rational, and wonderfully fallible. An admission of being wrong is the first step toward being right.

I am reminded here of my favourite literary character, Sherlock Holmes. In The Sign of the Four, Holmes says: “How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?”

Science works with the statistical likelihood of causal phenomena. If we do x, what are the chances that y will occur? When considering the claims of science as an arbiter of certainty, we must be aware that striving for certainty is actually unhelpful. Instead, we need to think of the statistical likelihood of an event or action occurring – “however improbable”. This is the best we can do as a species of imperfect and fallible beings. Being okay with uncertainty is the first step to being okay as a fallible being; thus being okay with the fact that the universe doesn’t bend to our concerns, that bad things happen to good people, and so on. If you deny your fallibility, then somehow you claim access to some source of information beyond us mere mortals. (Hence the claims of the religious.)

Admissions of uncertainty also rein in our opponents. Since skeptics openly state that we operate with uncertainty, no claims are absolutely dismissed. There can be no shouts of close-mindedness because skeptics “disbelieve” homeopaths: we are operating on statistics, not on “beliefs”.

And whilst we are on belief, we should begin eradicating such terms from our discussions. Evolution, medical science and so on do not rest on belief, but on evidence and statistics. Belief should not apply to physical phenomena, which obey their own natural laws. You can believe gravity is a myth or that prayer works better than medicine, but you will still hit the ground if you leap off a building, and children will still be spared a debilitating disease after treatment – all despite your beliefs. As Winston Churchill said, “The truth is incontrovertible: malice may attack it, ignorance may deride it, but in the end, there it is.” The world does not care for our imposing belief-systems, which is why beliefs are suited more to ideologies than to physical phenomena.

Beliefs are undermined by uncertainty. Uncertainty, though, seems counter-intuitive. We automatically associate uncertainty with resignation. Someone says he’s afraid to enter a darkened cave because he’s “uncertain of the dangers”. Yet such uncertainty is different to the essential – yes, essential – property of uncertainty within science and skepticism. The cave-fearing person is afraid because he has no assurance that it will be safe: he is unaware of the statistical likelihood of harm. Here the term “uncertain” is defined as “unaware of the likelihood”.

This isn’t uncertainty in the sense we are talking of; indeed, he is basing his decision on weighing up the likelihood of harm. The uncertainty I’m talking about is the idea that, no matter what, we still cannot be absolutely certain the cave is safe. Even if we explored every part of the cave, there is still no absolute guarantee of safety: this is the uncertainty we should admit. Here uncertainty is defined as “the essential property of reality that we can never be absolutely guaranteed of the consequences, events or likelihood of a particular outcome.” And this applies to everything, whether caves or cures. Embracing it is important.

Fear and uncertainty usually go together, so perhaps we should look further into the connection. Dan Gardner’s book Risk: The Science and Politics of Fear deals with all sorts of scaremongering gone wrong (as if there’s any other kind!). He identifies what he calls the “conscious rationalization of an unconscious judgment”: a viewpoint arrived at through non-rational mechanisms, such as emotional advertising, and then justified after the fact. For example, most advertisements rest on emotion because of the powerful, knee-jerk responses they provoke in potential customers. Think of the music and actors in life-insurance ads; hints of coffins and the macabre are kept at the borders, but the implication screams at the viewer. This echoes Michael Shermer’s article in Scientific American, “Smart People Believe Weird Things”: smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.

Gardner's goal is to identify the phenomena we should actually be concerned about, given their higher statistical likelihood of harm (death, disease, suffering). What is more likely: being killed by a shark attack or by a poorly wired Christmas tree? That legalizing drugs is always a bad thing for a country, or that in some instances it might help, as in Portugal? By identifying trends, i.e. the statistical likelihood of an event occurring, we can more clearly devote resources to the areas that require them. For example: considering that shark deaths usually number fewer than a hundred worldwide annually, we should be more concerned about protecting ourselves from bad wiring in Christmas trees.
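The arithmetic behind such comparisons is simple enough to sketch. The snippet below compares rough per-person annual risks for the two hazards just mentioned; the death counts and population figure are purely illustrative placeholders, not real statistics – the point is the comparison, not the numbers.

```python
def annual_risk(deaths_per_year: float, population: float) -> float:
    """Rough per-person annual probability of dying from a given cause."""
    return deaths_per_year / population

# Illustrative figures only -- not actual mortality data.
WORLD_POPULATION = 7_000_000_000

shark_risk = annual_risk(100, WORLD_POPULATION)       # "fewer than a hundred" shark deaths
wiring_risk = annual_risk(10_000, WORLD_POPULATION)   # hypothetical count for faulty wiring

# The ratio, not either absolute figure, tells us where to direct our concern.
print(wiring_risk / shark_risk)  # roughly 100: the mundane hazard dominates
```

Even with made-up inputs, the structure of the reasoning is clear: once both hazards are expressed on the same scale (deaths per person per year), the emotionally vivid one loses its apparent urgency.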

But in order to make such statements, we need to be clear about what we are looking for. If we are concerned about, for example, things that lead to death, we should realistically look at the highest causes of death, harm, or suffering and focus on combating them. Unfortunately, common causes of death make uninteresting news so media outlets won’t push for it. Yet, the data exists. We should constantly keep that in mind.

This is why uncertainty matters. Uncertainty demands: (1) an awareness of non-absolute thinking and (2) an appraisal of the likelihood of phenomena occurring. If we realize that we are not certain, we should find out whether, for example, mass shark-hunting is beneficial to reducing unnecessary death and suffering. To a small degree, perhaps, but not compared to the effort of being careful, buying protection, etc., when putting up Christmas lights. Here we see that uncertainty helps us orient ourselves, like a compass needle pointing north no matter which way we face.

Uncertainty reminds us that we cannot be certain about outcomes, but we might have good reasons for expecting them. Counter-intuitive as it may be, we must come to grips with uncertainty. Science itself is counter-intuitive: Who would think that an elephant and a feather fall at the same speed? Who would think that the earth revolves around the sun and not the other way around? Does it seem obvious that we share an ancestor with fish? Something being counter-intuitive does not make it wrong; it only makes it harder to accept. Science is used to this.

But if we want to defend science, we cannot hold it up as a pillar of perfect truth and flawlessness. We should discard such terms as not only unhelpful but dangerous. “Absolute” implies no change in the future. And what would science be if it fossilised into catechisms of equations and mantras of laws? Science would then be swathed in robes, hooded and ignorant of anything other than its own absolutist thinking, its fingers tracing the lines of the Principia Mathematica, careful never to amend or distort. It would no longer be the wonderful discipline that saves lives and the best method for engaging with the world. By removing uncertainty, by purging doubt, science loses its essential features. What makes science the best way of engaging with the world is the realisation that we cannot be certain, that we operate through fallibility, but that we are doing the best we can with the current, available data.

Let us admit, then, uncertainty. By admitting it in our greatest accomplishments, it becomes easier to admit it in our everyday lives and in ourselves. And that is a step toward a more rational approach to a universe that doesn’t, and never will, care about us, despite our brilliant methods of understanding it.

 

Tauriq is currently an M.Phil (Masters in Philosophy) Student, specialising in Applied Ethics, with a further specialisation in Biomedical Ethics, at the Centre for Applied Ethics, Stellenbosch University, Cape Town, South Africa. He is also a contributing editor to Secular Humanist Bulletin and has written for Free Inquiry, Skeptic Magazine, and numerous European humanist magazines: including original articles for Fritanke, in Sweden. Translations of his work have appeared in the magazine for the Polish Rationalist Association. Tauriq's more focused writings on practical ethics can be found at 3quarksdaily.com.