
The Anti-Authority Authority [A Doctor’s Disclaimer]

http://tinyurl.com/vaccinebook

I’ve begun reading Robert “Bob” Sears’ The Vaccine Book. Despite what you might be thinking, the impetus to start learning more about baby (and child) vaccines actually came from my wife. I’m not always the one obsessed with needing more info!

Having just started the book (I just finished reading Stumbling on Happiness), I was immediately struck by a paragraph in the Preface that not only plays the skeptic toward doctors generally, but also disclaims the authority of the writer. It’s this anti-authority slant, where someone who is perceived as an authority casts doubt on himself and other perceived experts, that I find so important*. In the field of medicine, where the egos of doctors are bigger than the size-XL scrubs they so frequently don, this sort of disclaimer strikes me as particularly unusual, but nice to see!

Some people feel that vaccine books aren’t necessary; after all, why not just ask your doctor if vaccines are absolutely necessary and safe and leave it at that? It takes all of one minute, then you’re done. No research or effort on your part is needed. Here’s the problem with that approach. Doctors, myself included, learn a lot about diseases in medical school, but we learn very little about vaccines, other than the fact that the FDA and pharmaceutical companies do extensive research on vaccines to make sure they are safe and effective. We don’t review the research ourselves. We never learn what goes into making vaccines or how their safety is studied. We trust and take it for granted that the proper researchers are doing their jobs. So, when patients want a little more information about shots, all we can really say as doctors is that the diseases are bad and the shots are good. But we don’t know enough to answer all of your detailed questions about vaccines. …

Even though vaccines are important, you as a parent are still entitled to know what you are giving your child. You have a responsibility (and a desire) to make informed health care decisions for your family.

*Notably, a really slick charlatan knows the importance of this disclaimer, too, so it’s by no means an “all clear” indicator that the disclaiming expert actually knows anything.


Nassim Taleb on Experts and Negative Advice

Nassim Taleb’s latest from Opacity, #113, titled “Negative Advice; Why We Need Religion,” makes the brief case that human beings are “suckers for charlatans who provide positive advice (what to do), instead of negative advice (what not to do).” Below is the entirety of his post; take a read (emphasis mine):

At the core of the expert problem is that people are suckers for charlatans who provide positive advice (what to do), instead of negative advice (what not to do), (tell them how to get rich, become thin in 42 days, be transformed into a better lover in ten steps, reach happiness, make new influential friends), particularly when the charlatan is invested with some institutional authority & the typical garb of the expert (say, tenured professorship). This is why my advice against measuring small probabilities fell on deaf ears: I was telling them to avoid Value-at-Risk and the incomputable rare event and they wanted ANOTHER measure, the idiots, as if there was one. Yet I keep seeing from the history of religions that survival and stability of belief systems correlates with the amount of negative advice and interdicts — the ten commandments are almost all negative; the same with Islam. Do we need religions for the stickiness of the interdicts?

Telling people NOT to smoke seems to be the greatest medical contribution of the last 60 years. Druin Burch, in the recently published Taking the Medicine:

“The harmful effects of smoking are roughly equivalent to the combined good ones of EVERY medical intervention developed since the war. (…) Getting rid of smoking provides more benefit than being able to cure people of every possible type of cancer.”

It is easy to read Taleb’s argument as meaning that negative advice is both more routinely followed and better than positive advice. However, this is clearly not the case, as there are countless examples of bad negative advice. Take the “Don’t eat fat” mantra that developed over the past few decades. This is negative advice that I believe Taleb has personally acknowledged as poor (Taleb is a friend of Art De Vany’s and an adherent, on some level, of the low-carb evolutionary nutrition/fitness theory). The low-fat, or lipid, hypothesis that has driven public health policy over the past few decades may ultimately be proven to have caused the premature deaths of millions of human beings (via cancer, cardiovascular disease, Alzheimer’s, obesity, diabetes, etc.). Clearly, not all negative advice is good to follow.

However, negative advice, or bright-line rules, seems to take hold more strongly than positive advice. Christianity and Islam are the two most dominant religions in the world, and both contain prescriptive, bright-line rules. In the case of Christianity, the prominence of rules is particularly ironic: Jesus openly argued for the destruction or irrelevance of the law (the bright-line rules of Judaism at the time). Regardless, the dominant sects of both Islam and Christianity appear to have more negative advice (what not to eat, drink, or do) than positive advice (love your neighbor), and the negative advice tends to be much more concrete: “Do not commit adultery” is far more cut-and-dry than “Love everyone.” It’s the time-tested success of hard-line, negative-advice-based religions that lends the most support to Nassim Taleb’s argument.

While I agree somewhat with Taleb’s theory, I think it is too limited in scope and should be expanded and clarified. Simply put: human beings are suckers for bright-line rules, be they positive or negative; adherence to and success of these bright-line rules depend on their prescriptive strength. Based on conclusions drawn from observing health and religious ideologies, it seems that negative advice promotes the greatest adherence and zealotry, both of which lead to ideological success**.

That it is human nature to want others to tell us what to do seems hard to deny. Why are we this way?

I just finished reading Daniel Gilbert’s Stumbling on Happiness (SoH), which discusses how we perceive things and how that affects our happiness. One argument Gilbert makes is that it is human nature to prefer action over inaction: it is easier to justify our action-based decisions after the fact because they have clear-cut consequences, whereas inaction does not, making inaction difficult to imagine and thereby difficult to justify. I would add that I believe it is human nature to put too much faith in our ability to control outcomes; we therefore act out of the misguided belief that our actions can elicit the responses we want.

Regardless of the source of our preference for action, I believe it is from this bias that the need for bright-line positive advice springs. For proof, look no further than the pervasive mentality that “We must do something to mitigate the economic crisis!” Charlatans and politicians fully exploit the bias toward action over inaction to advance their own agendas.

On the other hand, there is a second contention in SoH that seems an extension of the preference for action over inaction: the elimination of choice can trigger our psychological immune systems. Once triggered, these systems work to make us happy, or at least content, with a more restricted existence. Imagine this: having bought the farm, you’re quick to articulate the benefits of the purchase and figure out a way to love the cows. In keeping with this understanding, we can readily explain the human preference for ideologies that drastically reduce choice via negative, bright-line rules.

Thus, here we have two psychological explanations for why humans crave bright-line rules, both positive and negative.

I’d imagine Taleb would agree: life is vastly more complex and uncertain than our bright-line rules, positive or negative, allow. We should be aware of our tendency toward dogmatic over-simplification and be wary of overly prescriptive, bright-line advice.

* It’s always interesting how Jesus is written to have claimed he came to free man from the law. Yet Christian denominations, from Catholicism to Protestantism, all adhere to stringent rules and edicts.

** I can’t help but wonder if it’s just easier to prescribe negative advice than positive advice, even though both are likely to instill dogmatic behaviors.



Transcending the Authority Complex


In researching Erwan Le Corre’s MovNat (Ref: Le Corre Link Repository), I keep circling back to two related concepts. The first is the idea of the “guru”; the second is the human tendency to defer to authority, a problem I’m calling the authority complex.

We Homo sapiens—enlightened apes—face a dilemma of awareness. The more we know about the world, the more we realize that we are little more than the by-products of our DNA’s self-perpetuating existence on a tiny planet that could disappear tomorrow without any noticeable impact on our galaxy (to say nothing of the Universe). There’s a sense of futility that arises from this awareness, our existential angst, which is probably why we so rarely think about it.

So we shelve our angst and continue living. It is our biological imperative, after all. It is in this living that we seek answers to all sorts of questions to improve our lives. Do I have kids? How do I best support my family? How can I be a better parent, friend, spouse? How do I increase my wealth (Some ideas)? What should I do with my career? What should I eat? How do I find happiness? What is my purpose? How should I live?

Our hunger for “the” answer to any particular question leads us to seek out gurus. A guru need not be a spiritual leader (even if many “experts” often have a distinct “spiritual” flair); today “guru” means something closer to “expert” or “authority” on any given subject. On the Internet alone, I have plenty of go-to gurus on health, fitness, politics, and economics, all of whom I “follow” on a regular basis via Google Reader. It seems that gurus like to blog.

To some extent, I play the role of guru (don’t we all?). People ask me about diet, the economy, and technology. It feels good to be considered an expert, even as I secretly confess how very little I really know.

Whether we get answers directly from observation of the world combined with introspection/reflection or we turn to others—the gurus, experts, or authorities—our questions will get answered, and this can sometimes be a problem.

“If you meet the Buddha on the way, kill him!” — Zen proverb

If it is answers we want, then it is answers we will receive. Of course, many of the answers we receive from consulting authority, which includes not just the gurus but also established traditions, religions, science, theories, etc., will be right. Unfortunately, many others will be wrong, and the trouble lies in telling the difference.

This tendency to defer to authority is what I’m calling the “authority complex.”* I think we are all affected by it. We’ve all drunk the “Guru-ade” from time to time, and our only assured defense against this problem is awareness that it exists. It reminds me of an idea (probably a bad one) for a bumper sticker stating the imperative to “Question Authority!”

Why? It always comes back to this.

As much as we all want to find truth, many of the most important questions are simply unanswerable with any certainty. Even when we think we’ve figured things out, it is often only a matter of time and testing before our understanding is refined, corrected, and improved. This unanswerable quality applies to all understanding, be it scientific queries or more philosophical questions such as ascribing meaning to our lives. Beyond many questions just being unavoidably open-ended, there is the sense that whatever answers you seek are intrinsically dependent on you and not things that can be prescribed by some one-size-fits-all authority. Even supposing truths are discovered, how likely is it that an authority will be able to convey clearly to others the knowledge they’ve acquired from a lifetime of experience and learning?

Question authority. That is the imperative that arises from awareness of the authority complex. More pointedly, we must be critical of gurus and authorities who claim to have the answers, because scarcely any claim more clearly reveals that these so-called experts are no such thing. If you meet the Buddha, kill him (Nietzsche said something similar in Thus Spake Zarathustra, as I recall). The point, as I take it, is that when you think you have all the answers, you most assuredly do not. Any philosophy, religion, or other authority that fails to account for the authority complex is at best incomplete.

Question authority! Question everything. Even if our questions remain forever unanswered, it is the asking that works to define our lives.

Finally, to bring these thoughts full circle, Erwan Le Corre is an emerging guru who seeks to rehabilitate humans suffering from modern day domestication, which is to say he seeks to set human beings free. I wonder if the authority complex is the fundamental barrier to human freedom. Perhaps if we can transcend the complex, even as we fail to find our answers, we might find a freedom that brings us peace.

* My first blog was “autodogmatic,” a made-up word that essentially captures the problem of the human tendency to defer to authority.