Don’t Believe Everything You Think

Posted by Bryan Minogue from Ogilvy Health on April 1, 2019

 

Desmond has a rogue eyebrow hair. White, wiry, sticking straight out like an old man’s, not belonging on his 13-year-old face. It doesn't bother him. Because he can't see it. But me? I’m going freakin’ mental each time the sunlight catches it. More than once, I’ve set upon him with tweezers.

 

“Stop it, daddy!”

“It won’t hurt.”

“I know. But if I know it’s coming, it makes it so much worse.”

 

His was a precise example of impact bias: an overestimation of the intensity of our reaction to a future event. It’s not going to hurt. Much. But he runs away. A decision driven by an emotional reaction to something that hadn’t even happened.

 

Impact bias is just one of the many cognitive biases known as heuristics.1 Mental shortcuts—wired into our brain’s decision-making algorithm—help us make judgments in moments of uncertainty. When faced with complex decisions, our brains rely on a vast array of memories, values, fears, experiences, likes, dislikes, rumors, moods, and even the latest news.

 

“When faced with complex decisions, our brains rely on a vast array of memories, values, fears, experiences, likes, dislikes, rumors, moods, and even the latest news.”

 

Hard-wired remnants of days when thinking fast could save lives (eg, saber-toothed tiger = run!), heuristics force our brains to deviate from logic and reason. Ever have a hunch? Ever go with your gut? A spur-of-the-moment thought? Heuristics!

 

Our brains employ shortcuts to think fast. So?

 

Do you really want your doctor making decisions on a hunch? An analysis2 of more than 580 physician-reported errors showed that approximately 75% of diagnostic errors had a heuristic component. The two overarching tendencies were:

 

1. Tendency of HCPs to seek only as much information as is needed to form an initial clinical impression

 

2. Tendency of HCPs to stick with their initial impression, even as new information becomes available

 

The most common heuristics in clinical decision-making

 

Two systematic reviews3 (comprising more than 230 studies) found the following to dominate the medical field:

 

Anchoring bias

Reliance on the first piece of information offered when making decisions (eg, “50% off” turns wildly overpriced merchandise into a steal)

 

Availability bias

Reliance on immediate examples that come to mind when evaluating a specific decision (eg, getting burned on a high-efficacy drug in the past spoils all high-efficacy drugs thereafter)

 

Loss aversion

Tendency to prefer avoiding losses over acquiring equivalent gains (eg, playing not to lose)

 

Omission bias

Tendency to judge harmful actions as worse than equally harmful inactions (eg, do no harm)

 

Overconfidence effect

Subjective belief that one’s own judgment is more reliable than the objective evidence warrants (eg, back off man, I’m a scientist)

 

Risk aversion

Preference for a sure outcome over a gamble with higher or equal expected value (eg, the devil you know…)

 

These and more than 180 other heuristics4 (alone or in combination) create a phenomenon that directly counters our daily efforts here at Ogilvy Health: therapeutic inertia.5 Therapeutic inertia, the resistance to adopting a new treatment option, greatly hinders the consideration and evaluation of newly approved therapies despite well-recognized unmet need and clinical benefit supported by data.

 

“Therapeutic inertia is especially profound in clinicians who have been burned by past experiences.”

 

Therapeutic inertia is especially profound in clinicians who have been burned by past experiences. Think Tysabri 2004, Vioxx 2004, Avastin 2009. Those HCPs are going to have serious trust issues (a truckload of associative heuristics) with whatever similar drug you launch, no matter how different your MOA, how clever your headline, or how solid your strategy.

 

It is vital in our profession to slow down this automatic and biased decision-making if we want your brands to enjoy a fair evaluation of the merits of data and recognition of the benefit/risk profile.

 

What can we do as behavior change experts?

Never stop testing

 

Before we test the stopping power of a concept or the motivational effect of messaging, we must test the doctor. Discover the heuristics. What do physicians fear? How will physicians react to a newly launched drug in a class full of legacies? What makes HCPs avoid acting?

 

Ogilvy Health (on behalf of a client in the multiple sclerosis category) led extensive market research into heuristics, which revealed how individual doctors, physician segments, and even an entire specialty were affected. The desired outcome of heuristics research is to understand, for each heuristic, its specific characteristics, the context in which it manifests, and its prevalence. This research led to a complete revamp of how the client now engages with their audience—based on mitigating the most prevalent heuristics among neurologists.

 

Good news: every heuristic has a mitigation strategy

 

We can change our minds. We just need a good reason to do so. Same with HCPs. When we determine why doctors think the way they do, we can present our brands in a manner that best challenges their preconceived ideas, biases, fears, and beliefs. But first, it is up to us to convince our clients that adding heuristics testing into our SOWs and timelines is of vital importance. And if the client hesitates? Don’t worry. That’s just another heuristic to tackle.

 

Now that didn't hurt much, did it?

 

References

1. Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. https://journals.sagepub.com/doi/abs/10.1177/0272989X14547740?journalCod.... Accessed March 26, 2019. doi:10.1177/0272989X14547740.

 

2. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009;169(20):1881-1887. https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/1108559. Accessed March 26, 2019. doi:10.1001/archinternmed.2009.333.

 

3. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16:138. https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-0.... Accessed March 26, 2019. doi:10.1186/s12911-016-0377-1.

 

4. Better Humans. Cognitive bias cheat sheet. https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18. Accessed March 26, 2019.

 

5. Saposnik G, Sempere AP, Raptis R, Prefasi D, Selchen D, Maurino J. Decision making under uncertainty, therapeutic inertia, and physicians' risk preferences in the management of multiple sclerosis (DIScUTIR MS). BMC Neurol. 2016;16:58. https://bmcneurol.biomedcentral.com/articles/10.1186/s12883-016-0577-4. Accessed March 26, 2019. doi:10.1186/s12883-016-0577-4.