Could you be a 'super-forecaster'?

Political forecasting is among the most vital roles played by the intelligence services: determining which country's government is most likely to collapse in the next few months, or whether a given nation has weapons of mass destruction that render it a threat. But what happens when there's no way to assess the quality of those forecasts – or the people making them?

In 2004, the Butler Review on the events leading up to the 2003 Iraq invasion found that the British Government's decision to invade – based on the premise that Saddam Hussein had WMDs – was the result of a major intelligence failure. It is just one example of how the predictions that go on behind closed doors can often be fallible.

But the work of Philip Tetlock and his team at the Good Judgment Project – funded by the US government's Intelligence Advanced Research Projects Activity (Iarpa) – points to new ways of thinking about geopolitical forecasting, and the question of what makes a person better equipped to predict world events. A few people, the project has revealed, have extraordinary talents for seeing the future – might you be one of them?

Skilled ‘supers’

The Good Judgment Project is one of several efforts funded by Iarpa to compete in a tournament-style forecasting challenge, and by far the most successful. It recruited over 2,000 forecasters to assess the likelihood of various world events, using approaches ranging from soliciting individuals' predictions to assigning forecasters to collaborative teams.

Tetlock found that the most successful predictions were made by a small group of skilled “super-forecasters”. Their personality traits, rather than any specialised knowledge, allowed them to make predictions that, according to NPR, outstripped the accuracy of several of the world's intelligence services – despite the fact that the forecasters had no classified data to draw on, nothing beyond what they could find with a Google search.

“Most people would expect to find domain experts doing well in their domain,” says Nick Hare, one of the super-forecasters (informally, they go by “supers”) whose performance in the project landed him an invitation to the Good Judgment Project's annual summer conference. But, in fact, “there are people who are good at all domains” – outperforming even specialists. And they could hold the key to reconfiguring the way intelligence services think about making predictions in the future.

Hare's interest in discovering a basis for good political forecasting predates the Good Judgment Project. For over five years, he served as head of futures and analytical methods at the UK’s Ministry of Defence (MoD), seeking ways to improve intelligence analysts' performance and to create accountability in the wake of the Butler Review – “looking at how we can get intelligence analysts to approach their task to make them more likely to be right,” he says. It's a “dirty secret of the intelligence community,” he adds, that there are few formal structures in place to determine whether intelligence reports – which tend to be narrative in character – in fact prove accurate. “[If we say] 'such-and-such a country is unlikely to back down on this issue' – what does 'back down' look like? What does 'unlikely' look like?... If somebody is not being rigorous to the point of tedious pedantry – it's difficult to say whether a prediction is right or wrong.”
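
One way to make that rigour concrete is to state forecasts as probabilities against precisely defined outcomes, then score them once events resolve. Below is a minimal sketch – the forecasts and outcomes are made-up illustrations – using the Brier score, the standard accuracy metric in forecasting tournaments of this kind:

```python
# A minimal sketch (not from the article): scoring probability forecasts
# against yes/no outcomes with the Brier score. Lower is better.

def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome.
    0.0 is a perfect forecast; always answering 50% scores 0.25."""
    return (forecast - outcome) ** 2

# Hypothetical track record: (stated probability, what actually happened)
record = [(0.8, 1), (0.2, 0), (0.6, 0), (0.9, 1)]

mean_score = sum(brier_score(p, o) for p, o in record) / len(record)
print(f"Mean Brier score: {mean_score:.3f}")  # 0.113 here; lower is better
```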

Hare points to the intelligence failures leading up to the 2003 Iraq War – the subject of the Butler Review – as a turning point. “Traditionally, you got a bright person, you sat them down in front of a pile of intelligence, and then they wrote things. Nobody checked how good they were.” Now, however, it's more important than ever to ask how intelligence analysts can approach their task in a way that makes them more likely to be right – so that such failures become less likely.

‘Open-minded thinking’

Hare's interest in the Good Judgment Project was piqued by reading an article by Tetlock, who struck him as “one of the few people talking about futures who’s interested in getting it right, and not just guffing on”. He signed up as a forecaster, only to find that his predictions were accurate enough to earn him a place among the “supers”.

So, what makes Hare such a good forecaster? His success, he says, comes down not to knowledge but to his capacity for “active, open-minded thinking”: applying the scientific method to look rigorously at data, rather than seeking to impose a given narrative on a situation.

When asked to predict the likelihood of a nuclear test in North Korea in the next three months, for example, Hare didn't start by analysing the geopolitical situation there, or by investigating whether its new leader was more likely to run tests; the arguments on either side, he says, cancelled each other out. Instead, he looked for a base rate probability. Having concluded that there had been, on average, one test every 30 months, he put the likelihood over the coming three-month window at around 10%. He then adjusted that base rate in light of additional data: North Korea's threats to run a test had, numerically speaking, roughly doubled the likelihood of a test actually happening in the past, so he adjusted his prediction to 20%. “That's basically the sort of approach you take,” he says.
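
In code, that reasoning is only a few lines. The sketch below mirrors Hare's worked example; the doubling factor is his empirical estimate for this particular question, not a general rule:

```python
# A sketch of the base-rate approach Hare describes above.

months_per_test = 30      # historical average: one test every 30 months
window_months = 3         # the question covers the next three months

base_rate = window_months / months_per_test      # 3/30 = 0.10
threat_multiplier = 2.0   # past threats roughly doubled the odds of a test
adjusted_probability = base_rate * threat_multiplier   # 0.20

print(f"Base rate: {base_rate:.0%}; after adjustment: {adjusted_probability:.0%}")
```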

But super-forecasters need not have a background in the intelligence services to apply that kind of logic successfully. This year's crop of “supers” includes a number of finance workers, as well as an animator, an oil painter, and someone who made factory machinery.

‘Something stranger’

“I think the advantage I have is that I was a massive ignoramus,” jokes Reed Roberts, another “super”, who joined the Good Judgment Project after reading about it on a blog. He's finishing his PhD in chemistry, and was looking for a distraction from research and an impetus to follow the news more closely – only to find that he, too, had the skills necessary to become a super. He says he “didn't go into many of these questions with any particular attachment” or a viewpoint he was hoping to prove or disprove. Instead, he thought narrowly – sometimes too narrowly – about “what it would take to resolve the question”.

Roberts cites the Isaiah Berlin essay “The Hedgehog and the Fox” – a comparison often used by Tetlock himself – which divides thinkers into “hedgehogs”, narrowly invested in a single topic, and “foxes”, with a wider, if shallower, range of experience. “Foxes” like him, Roberts says, tend to be better forecasters. “They don't get attached to one particular narrative” and are able to adapt their viewpoints to incorporate new information, unlike “hedgehog” thinkers, who often force new information into a pre-existing mental framework, or discard it if it seems to contradict their initial view.

He did particularly well on one question – whether a military presence would be involved in a fatality in the South China Sea – precisely because of that specificity: he thought a “calamity” was unlikely but didn't exclude the possibility of “something stranger”. Ultimately, the shooting of an illegally present fisherman resolved the question in his favour.

It remains to be seen how international intelligence services will respond to the Good Judgment Project's findings. For now, however, many supers are finding ways to monetise their skills in the private sector. Hare left his position at the MoD a few months ago to start Aleph Insights: a consulting company specialising in “strategic decision-making”. The project, too, has evolved: a website for Good Judgment, LLC, now advertises its services in providing “independent geopolitical forecasts” in the wake of the project's success.

Hare and Roberts agree that an added benefit of the Good Judgment Project has been helping hyper-intelligent “supers” find each other and develop ways to collaborate. Hare's first super-forecaster conference, he says, was something of a revelation. “It's like that bit at the end of ET,” he says, “when all the other ETs come and get him. He's not an alien anymore.”
