How we interpret words differently — Reddit user zonination’s depiction of Kent’s research.

PART 2 — LEARN HOW TO COMMUNICATE UNCERTAINTY — WHY IT IS GOING TO BE A CRITICAL SKILL IN THE FUTURE

Frederick Fladmark

--

In the previous article I talked about why learning how to communicate uncertainty is going to be a critical skill in the future. In this part (part 2) I explain how you can learn to communicate probability and uncertainty.

Imagine a man and woman in their twenties

Imagine a man and a woman in their twenties. If the man and the woman interpret the phrase “in a relationship” differently, it might cause some arguments or even a break-up. But the consequences for society are hardly noteworthy.

When the chief of a national intelligence agency communicates with a president and both interpret phrases differently, then the consequences can be vast.

As a result, people in the world of intelligence have spent a lot of time and effort thinking about this problem. To combat miscommunication in the area of probability and certainty, they developed a framework called WEPs: Words of Estimative Probability.

The best way to explain what Words of Estimative Probability are is to tell a story. I will repeat a story told by Sherman Kent, who is widely recognised as the father of modern intelligence analysis, responsible for making it a more scientific endeavour.

So this time let’s rewind to 1951…

Jazz artists like Nat King Cole and Tony Bennett play on the radio. Shows like The Lone Ranger play on TV. The first oral contraceptive is synthesised.

Joseph Stalin proclaims that the Soviet Union has the atomic bomb. The Cold War is well underway. NATO and the USA are petrified about Soviet expansion into Eastern Europe.

So, around March 1951 a CIA paper titled “Probability of an Invasion of Yugoslavia” appears. If we scroll down to the conclusion in the report we find a judgment of serious strategic importance.

“Although it is impossible to determine which course the Kremlin is likely to adopt, we believe that the extent of Satellite military and propaganda preparations indicates that an attack on Yugoslavia in 1951 should be considered a serious possibility.”

The authors were trying to convey a judgement about the probability and certainty of a Soviet invasion of Yugoslavia. In 1951, with the Cold War in full swing, this judgement was hugely important.

A few days after the report was briefed, a government official quizzed Sherman Kent, one of its authors: “What did you mean by a serious possibility? What kind of odds?”

Kent replied, “about 65%”.

The official jumped!

He and his colleagues had interpreted the odds to mean something “considerably lower”. Kent was worried and went back to the other authors of the report. He quizzed the various experts who had contributed to its phrasing and discovered that they interpreted the odds at anywhere from 20% to 80%!

Kent was truly shocked that he and the other CIA authors had totally failed to communicate effectively with each other, let alone with the policy committee receiving the report.

This extremely critical report had essentially used weasel words to communicate. The authors sounded like they were making an assessment, but really they weren’t: people could interpret the report any way they wanted to.

Think of all the risks taken by human spies. The effort to launch satellites and to take reconnaissance photos. The hundreds of millions of dollars spent on intelligence collection. All that effort boiled down to one sentence! And for that sentence to be so vague and open to interpretation!

What was Kent’s solution?

What odds come to mind when I say something is possible? Or likely? Or almost certain? Are we thinking of the same number?

Kent decided to research how we interpret “odds” associated with words. A user on Reddit has recently recreated his experiment and visualised the results.

When we try to communicate probability or uncertainty with words alone, the message is highly open to interpretation.

Just look at the word “improbable”. If you say “improbable”, someone might interpret the odds as anywhere from 5% to 50%!

What is the Solution?

Kent proposed a system of word standardisation.

Kent proposed a system in which the communication of probability and certainty was standardised. The aim was to make language less open to interpretation.

Although Kent’s ideas were developed and taught, they were never fully implemented. The result, one could argue, was the intelligence failures of 9/11 and of WMD in Iraq.

Although this approach has not been implemented in strategic intelligence support, it has been implemented in military intelligence doctrine, and I would argue with great success.

It is by no means an ideal system, and one must remember its weaknesses when using it. However, it is much better than the alternative: no system at all, just weasel words. (Check my last article for a description of weasel words.)

What communicating uncertainty should look like –

Ok so here are two models that are pretty easy to implement.

The basic model: The first is inspired by the NATO model. It is taken from the Norwegian Police Intelligence doctrine. It looks like this:

What to do? Well, imagine you are communicating an assessment on something like Brexit: you have calculated there is a 50% chance that the UK will vote to leave the EU.

Then, if you are using the model above, you should say: “I assess there is an even chance the UK will vote to leave the EU”.

Assuming the person receiving the information is used to the same system, they will understand the odds to be between 40% and 60%.
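Once both sides share a table like this, translating a numeric probability into a standardised phrase is mechanical. Here is a minimal sketch in Python; only the “even chance” (40–60%) and “highly likely” (71–85%) bands are taken from this article, and the remaining bands and cut-offs are illustrative placeholders, not the actual NATO or Norwegian doctrine values.

```python
# Minimal sketch: map a numeric probability (in percent) to a
# standardised phrase. The "even chance" (40-60%) and "highly
# likely" (71-85%) bands come from this article; the other
# bands are illustrative placeholders, not doctrine values.
BANDS = [
    (15, "highly unlikely"),
    (40, "unlikely"),
    (60, "even chance"),
    (71, "likely"),
    (85, "highly likely"),
    (100, "almost certain"),
]

def to_wep(pct: float) -> str:
    """Return the standardised phrase for a probability in percent."""
    if not 0 <= pct <= 100:
        raise ValueError("probability must be between 0 and 100")
    for upper, phrase in BANDS:
        if pct <= upper:
            return phrase

print(to_wep(50))  # -> even chance
```

So a calculated 50% chance is reported as “even chance”, and a reader trained on the same table reads that back as 40–60%.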

More nuanced model: Rachel F. Kesselman proposed the following system in her thesis while studying applied intelligence at Mercyhurst College. It can be found here.

To be honest it doesn’t really matter what model you use. So long as the communicator and audience have the same model in mind!

These words are used specifically for communicating the findings from the analysis process in order to support decision makers. This is by no means a perfect solution, but better than nothing.

Uncertainty

Ok so we have a model for communicating the odds of something happening.

But what about communicating certainty?

Imagine you are an analyst working at an investment fund. You are explaining an investment idea to a fund manager.

You have spent weeks studying a company to invest in. Your analysis leads you to conclude that a merger is highly probable, and you communicate to the fund manager that it is highly likely (a 71–85% chance) that ABC corp will buy out their rival.

Great! You have massively reduced the chances of the fund manager misinterpreting your assessment, and he now has a solid assessment he can communicate to others: “highly likely”.

“But how certain are you?” he replies…

How do you answer?

Why we should communicate uncertainty –

Communicating certainty poses similar challenges to communicating probability, and it is perhaps even more difficult.

This is because there are huge disincentives to appearing uncertain in the workplace. Professionals often fear that uncertainty is a bad thing. We worry that we will be perceived as unknowledgeable or unprofessional.

Certainty is also persuasive. We have learnt from a young age that appearing certain will often help us influence others.

However, if an event is fundamentally uncertain, then shouldn’t we communicate this? Think of the Trump–Clinton election. The result was uncertain all the way through the campaign and even after polling closed.

Is it better to fake certainty and say –

“Clinton will win. I am pretty certain because of the latest polling”

or is it better to say:

“We believe Clinton will win, but we are uncertain in this prediction because of the quality of the information available”.

Of course leadership needs to set an example and create incentives for people to work, act and talk this way.

Towards a solution

So how should we communicate confidence? Here is one simple model.

It is given by the US Office of the Director of National Intelligence. An example is outlined here.

  • High confidence: generally indicates judgments based on high-quality information, and/or the nature of the issue makes it possible to render a solid judgment. A “high confidence” judgment is not a fact or a certainty, however, and still carries a risk of being wrong.
  • Moderate confidence: generally means credibly sourced and plausible information, but not of sufficient quality or corroboration to warrant a higher level of confidence.
  • Low confidence: generally means questionable or implausible information was used, the information is too fragmented or poorly corroborated to make solid analytic inferences, or significant concerns or problems with sources existed.

How to calculate your confidence

Confidence is very much based on feeling. It is highly susceptible to personality and emotions. So how can we calculate how confident we should be in our assessment?

Imagine you are an analyst putting together a presentation. Perhaps you are going to present an analysis of an oil well and the chance of striking oil.

You calculate that there is an 80% probability. But you know that if your bosses hear this they are going to start drilling, at a cost of tens of millions. So the consequences are huge if you are wrong!

You suddenly feel that you aren’t so confident in your assessment — what do you do?

Mercyhurst College, the CIA’s own intelligence university, gets student analysts to determine confidence by considering the following variables:

  • Quality of analysis done
  • Reliability of information
  • Reliability of the information source
  • Number of information sources
  • Number of analysts tasked
  • Task complexity
  • Time available

One can use a thorough quantitative approach using the following method, available here.

Or one can simply glance at the above checklist or the photo on the right and form an intuitive feeling of confidence. Ask the following: given the variables listed above, how confident am I?

This should give you an idea of your confidence in your assessment. You can group this into high, medium or low confidence.
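One way to turn the checklist into a low, moderate or high grouping is to rate each variable and average the ratings. This is only a sketch under my own assumptions; the 1–5 scale, the variable names and the thresholds are illustrative, not the actual Mercyhurst method.

```python
# Sketch: rate each checklist variable from 1 (poor) to 5 (excellent),
# average the ratings and bucket the result. The scale and thresholds
# are illustrative assumptions, not Mercyhurst's actual method.
def confidence_level(ratings: dict) -> str:
    avg = sum(ratings.values()) / len(ratings)
    if avg >= 4:
        return "high"
    if avg >= 2.5:
        return "moderate"
    return "low"

ratings = {
    "quality_of_analysis": 4,
    "information_reliability": 3,
    "source_reliability": 3,
    "number_of_sources": 2,
    "analysts_tasked": 3,
    "task_complexity": 3,  # rate higher when the task is simpler
    "time_available": 3,
}
print(confidence_level(ratings))  # -> moderate
```

The point is not the exact arithmetic but that the same checklist, scored the same way, produces the same confidence grouping for every analyst.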

What does it look like in practice?

Ok so you have done the following:

  • Completed your analysis
  • You have estimated the probability of something happening (say 80%)
  • You have calculated how confident you should be (say moderately confident)

What does it actually look like to communicate this? Well, the answer looks something like this…
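In its simplest form, the final sentence just combines the probability phrase with the confidence level. A trivial sketch, where the exact wording is illustrative rather than any official template:

```python
# Sketch: combine the probability phrase and the confidence level
# into a single assessment sentence. The wording is illustrative.
def assessment(claim: str, phrase: str, confidence: str) -> str:
    return f"We assess it is {phrase} that {claim} ({confidence} confidence)."

print(assessment("the well will strike oil", "highly likely", "moderate"))
# -> We assess it is highly likely that the well will strike oil (moderate confidence).
```

The reader now gets both pieces of information at once: the odds of the event and how much faith to place in the analysis behind them.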

Is all this really worth the effort?

Well, in reality we often have very little time to do these things, and it will also depend on your job and industry. However, I would argue that if your business is making any key decisions, especially when they are based on the analysis of others, then you should ensure that assessments are communicated in the way outlined above. What is the alternative?

Facing the brutal facts

I quoted Ray Dalio in the last article and I would like to do it again here.

…[my] most fundamental principle is: Truth — more precisely, an accurate understanding of reality — is the essential foundation for producing good outcomes. — Ray Dalio

By learning how to communicate probability and uncertainty we are taking a step towards knowing the truth and facing the brutal hard facts of reality.

So remember your job as the analyst is not to artificially remove uncertainty but to accurately communicate uncertainty to the decision maker!

--

Frederick Fladmark

Writing on performance in business, health & life