Should You Consider Fate when Planning Ahead?

I was recently asked on Quora whether there is some kind of a grand scheme to things: a destiny that we all share, a guiding hand that acts according to some kind of moral rules.

This is a great question, and one that many of us worry about. While there’s no way to know for sure, the evidence points against this kind of fate-biased thinking – as a forecasting experiment funded by the US Department of Defense recently showed.

In 2011, the US Department of Defense began funding an unusual project: the Good Judgment Project. In this project, led by Philip E. Tetlock, Barbara Mellers and Don Moore, volunteers were asked to rate the chances of occurrence of certain events. Overall, thousands of people took part in the exercise and answered hundreds of questions over a period of two years. Their answers were scored continually, as soon as the events in question either occurred or failed to occur.

After two years, the directors of the project identified a subset of people they called Superforecasters. These top forecasters were doing so well that their predictions were 30% more accurate than those of intelligence officials who had access to highly classified information!

(And yes, for the statistics-lovers among us: the researchers absolutely did run statistical tests, which showed that the chances of those people being so accurate by accident were minuscule. The superforecasters kept doing well, over and over again.)
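For readers curious how forecast accuracy can be scored at all: the Good Judgment Project is widely reported to have used the Brier score – the mean squared difference between the probabilities a forecaster assigned and what actually happened. Here is a minimal sketch in Python; the forecasts and outcomes are invented for illustration.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and actual
    outcomes (1 = the event happened, 0 = it did not).
    Lower is better: 0.0 is perfect, and always answering 50%
    on binary questions earns exactly 0.25."""
    assert len(forecasts) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical example: a confident, well-calibrated forecaster
# versus one who hedges everything at 50%.
sharp = brier_score([0.9, 0.1, 0.8], [1, 0, 1])
hedger = brier_score([0.5, 0.5, 0.5], [1, 0, 1])
print(round(sharp, 3))   # 0.02
print(round(hedger, 3))  # 0.25
```

Note that the hedger is never badly wrong, yet still scores far worse than the sharp forecaster – the score rewards both calibration and decisiveness, which is why being 30% better than a baseline is such a strong result.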

Once the researchers identified this subset of people, they began analyzing their personalities and methods of thinking. You can read about it in some of the papers about the research (attached at the end of this answer), as well as in the great book – Superforecasting: The Art and Science of Prediction. For this answer, the important thing to note is that those superforecasters were also tested for what I call “the fate bias”.


The Fate Bias

There’s no denying that most people believe in fate of some sort: a guiding hand that makes everything happen for a reason, in accordance with some grand scheme or moral rules. This tendency seems to manifest most strongly in children and in believers in God (84.8 percent of whom believe in fate), but even 54.3 percent of atheists believe in fate.

It’s obvious why we want to believe in fate. It gives our woes, and the sufferings of others, a special meaning. It justifies our pains, and makes us think that “it’s all for a reason”. Our belief in fate helps us deal with bereavement and with physical and mental pain.

But it also makes us lousy forecasters.


Fate is Incompatible with Accurate Forecasting

In the Good Judgment Project, the researchers tested the participants for their belief in fate. They found that the superforecasters utterly rejected fate. Even more significantly, the better individuals were at forecasting, the more inclined they were to reject fate – and the more they rejected fate, the more accurate their forecasts turned out to be.


Fate is Incompatible with the Evidence

And so, it seems that fate is simply incompatible with the evidence. People who try to predict events in a ‘fateful’ way, as if they were obeying a certain guiding hand, are prone to failure. Those who believe there is no ‘higher order to things’, on the other hand, and plan accordingly, usually turn out to be right.

Does that mean there is no such thing as fate, or a grand scheme? Of course not. We can never disprove the existence of such a ‘grand plan’. What we can say with some certainty, however, is that human beings who claim to know what that plan actually is, seem to be constantly wrong – whereas those who don’t bother explaining things via fate, find out that reality agrees with them time and time again.

So there may be a grand plan. We may be in a movie, or God may be looking down on us from up above. But if that’s the case, it’s a god we don’t understand, and the plan – if there actually is one – is completely undecipherable to us. As Neil Gaiman and the late Terry Pratchett beautifully wrote –

God does not play dice with the universe; He plays an ineffable game of His own devising… an obscure and complex version of poker in a pitch-dark room, with blank cards, for infinite stakes, with a Dealer who won’t tell you the rules, and who smiles all the time.

And if that’s the case, I’d rather just say out loud – “I don’t believe in fate” – and plan and invest accordingly.

You’ll simply have better success that way. And when the universe is cheating at poker with blank cards, Heaven knows you need all the help you can get.

For further reading, here are links to some interesting papers about the Good Judgment Project and the insights derived from it –

Bringing probability judgments into policy debates via forecasting tournaments

Superforecasting: How to Upgrade Your Company’s Judgment

Identifying and Cultivating Superforecasters as a Method of Improving Probabilistic Predictions

Psychological Strategies for Winning a Geopolitical Forecasting Tournament

Rethinking the training of intelligence analysts


Failures in Foresight: The Failure of Nerve

Picture from Wikipedia, uploaded by the user Yerevanci

Today I would like to talk (write?) about the first of several different failures in foresight. This first failure – called the Failure of Nerve – was identified in 1962 by noted futurist and science fiction titan Sir Arthur C. Clarke. While Clarke mostly pinpointed this failure as a preface to his book about the future, I’ve identified several forces leading to the Failure of Nerve, and I discuss ways to circumvent it, in the hope that the astute reader will avoid similar failures when thinking about the future.

Failure of Nerve

The Failure of Nerve is one of the most frequent failures in talking or writing about the future, at least in my personal experience. When experts, or even laypeople, express an opinion about the future, you expect them to be knowledgeable enough to be aware of the facts and data of the present. And yet, all too often, this expectation is smashed on the hard rock of mankind’s arrogance. The Failure of Nerve occurs when people are too fearful to look for answers in the data that surrounds them, and instead focus on repeating their preconceived notions – which might have been true in the past, but are no longer relevant in the present.

Examples of Failures of Nerve are sadly abundant. Many quote Simon Newcomb, the famous American astronomer, who declared that flying machines were essentially impossible, a mere two years before the first flight of the Wright brothers –

“The demonstration that no possible combination of known substances, known forms of machinery and known forms of force, can be united in a practical machine by which man shall fly long distances through the air, seems to the writer as complete as it is possible for the demonstration of any physical fact to be.”

However, this is not a Failure of Nerve, since in Newcomb’s time, the data from the scientific labs themselves was incorrect. As the Wright brothers wrote about their experiments –

“Having set out with absolute faith in the existing scientific data, we were driven to doubt one thing after another, till finally, after two years of experiment, we cast it all aside, and decided to rely entirely upon our own investigations.”

Newcomb’s Failure of Nerve appeared later on, when he was confronted with reports of the Wright brothers’ success. Instead of withholding judgement and checking the data again, Newcomb conceded only that flying machines might have a slight chance of existing – but that they could certainly not carry any human being other than the pilot.

The first flight of the Wright brothers – against the better judgement of the scientific experts of the time. Source: Wikipedia

A similar Failure of Nerve can be found in the words of Napoleon Bonaparte from the year 1800, uttered in reply to news regarding Robert Fulton’s steamboat –

“What, sir, would you make a ship sail against the wind and currents by lighting a bonfire under her deck? I pray you, excuse me, I have not the time to listen to such nonsense.”

Had the up-and-coming emperor bothered to take a better look at the state of steamboats at the time, he would have learned that boats with “bonfires under their decks” were already carrying passengers in the United States, even though the venture was not a commercial success. Fulton went on to construct a steamboat (nicknamed “Fulton’s Folly”) that rose to fame, and in 1816 France finally came to its senses and purchased a steamboat from Great Britain. Knowing of Napoleon’s genius in warfare, it is an interesting thought exercise to consider how history might have changed had the emperor realized the potential of steamboats while the technology was still emergent.

Is it possible that steamboats like this one would’ve changed the course of history, had Napoleon not been affected by the Failure of Nerve? Source: Wikipedia

How do we deal with a Failure of Nerve? To find the answer to that question, we need to understand the forces that make this failure so common.

Behind the Curtains of the Nerve

There are at least three different forces that can contribute to a Failure of Nerve. These are: selective exposure to information, confirmation bias, and last but definitely not least – the conservation of reputation.

The Force of Selective Exposure

Selective exposure to information is something we all suffer from. In this day and age, we have an abundance of information. In the past, news would have taken weeks or months to reach us, and we had only the village elder’s opinion to interpret it for us. Today we’re flooded by information from multiple media sources, each with its own not-so-secret agenda. We’re also exposed to columns by social critics and other luminaries, and we can usually tell in advance how they look at things. If you read Tom Friedman’s column, you can be sure he’ll give you the leftist approach. If you turn on the TV to The Glenn Beck Program, on the other hand, you’ll get the right-wing view.

An abundance of information is all good and well, until you realize that human beings today suffer from a scarcity of attention. We can focus on only one article at a time, and as a result we must choose how to divide our time between competing pieces of information. The easiest choice? Obviously, to go with the news that supports your current view on life. That is indeed the choice many people make – and it understandably results in a Failure of Nerve. How can you be aware of any new information that contradicts your core beliefs, if you only listen to the people who repeat those same core beliefs?

Philip E. Tetlock, in his new book Superforecasting, tells of Doug Lorch, one of the top forecasters discovered in recent years, who has found a way to circumvent selective exposure, albeit with some effort. In the words of Tetlock (p. 126) –

“Doug knows that when people read for pleasure they naturally gravitate to the like-minded. So he created a database containing hundreds of information sources – from the New York Times to obscure blogs – that are tagged by their ideological orientation, subject matter, and geographical origin, then wrote a program that selects what he should read next using criteria that emphasize diversity. … Doug is not merely open-minded. He is actively open-minded.”

Of course, reading views opposite to the ones you adhere to can be annoying and vexing, to say the least. And yet, there is no other way to form a more nuanced and solid view of the future.
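Tetlock does not publish Lorch’s actual program, but its core idea – always pick the next thing to read from the viewpoint you have sampled least – is easy to sketch. All the source names and ideological tags below are invented for illustration; Lorch’s real database spanned hundreds of sources and more criteria than ideology alone.

```python
import random
from collections import Counter

# A hypothetical reading list: (source, ideological tag).
SOURCES = [
    ("New York Times",   "left"),
    ("National Review",  "right"),
    ("Foreign Affairs",  "centrist"),
    ("An obscure blog",  "right"),
    ("Another blog",     "left"),
]

def pick_next(read_log):
    """Pick the next source from whichever ideological category has
    been read the least so far, forcing exposure to under-sampled
    viewpoints (a crude stand-in for Lorch's diversity criteria)."""
    counts = Counter(tag for _, tag in read_log)
    all_tags = {tag for _, tag in SOURCES}
    # Categories never read yet count as zero, so they win first.
    least = min(all_tags, key=lambda t: counts.get(t, 0))
    return random.choice([s for s in SOURCES if s[1] == least])
```

After a reading log heavy on left-leaning pieces, `pick_next` will steer you toward a right-leaning source – the mild annoyance Tetlock describes, imposed by design rather than willpower.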

Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock and Dan Gardner

The Force of Confirmation Bias

Sadly, even when people choose to actively open their minds to different views, it does not mean they will be able to assimilate the lessons into their outlook. As human beings, we are wired to –

“…search for, interpret, prefer, and recall information in a way that confirms one’s beliefs or hypotheses while giving disproportionately less attention to information that contradicts it.” – Wikipedia

The confirmation bias is well known to any expectant parent. You walk around the city and find that the streets are chock-full of parents with strollers and babies. They are everywhere. You can’t avoid them in the streets or on the bus, and even at work you find that your co-worker has decided to bring her children to the workplace today. So what happened? Has the world’s birth rate suddenly doubled?

The obvious answer is that we are constantly influenced by confirmation bias. If our mind is constantly thinking about babies, then we’ll pay more attention to any dripping toddler crossing the road, and the memory will be etched much more firmly into our minds.

The confirmation bias does not influence only young parents; it has real importance in the way we view our world. A study from 2009 demonstrated that people spend 36 percent more time, on average, reading articles that they agree with. Another study from 2009 demonstrated that when conservatives watch The Colbert Report – in which Stephen Colbert satirizes a right-wing news pundit – they read extra meaning into his words. They claimed that Colbert only pretends to be joking, and actually means what he says on the show.

How does the confirmation bias relate to the Failure of Nerve? In a way, it excuses some of the bad reputation the Failure of Nerve has garnered since Clarke identified it. The confirmation bias basically means that unless we make a truly tremendous and conscious attempt to analyze the world around us, our mind will fool us. We’ll pay less attention to evidence that refutes our current outlook, and consider it of lesser importance than other pieces of evidence. Or as the pioneer of the scientific method, Francis Bacon, put it (I found this great quote in a highly recommended blog: You Are Not So Smart) –

“The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.”

Can we fight off the influence of the confirmation bias over our thinking process? We can do so partially, but never completely, and it will never be easy. Warren Buffett (third on Forbes’ list of the richest people in the world, and one of the most successful investors alive) uses two means to tackle the confirmation bias: he specifically looks for dissenters and invites them to speak up, and (presumably) he promptly writes down any piece of evidence that contradicts his current ideas. In the words of Buffett himself (quoted in TheDataPoint) –

“Charles Darwin used to say that whenever he ran into something that contradicted a conclusion he cherished, he was obliged to write the new finding down within 30 minutes. Otherwise his mind would work to reject the discordant information, much as the body rejects transplants.”

In short, in order to minimize the impact of the confirmation bias, you need to remain constantly vigilant against the tendency to be certain of yourself. You must actively seek out those who disagree with you and ask for their opinions, and perhaps most importantly: you should write it all down, in order to distance yourself from your original perspective and allow yourself to judge your thinking as though it were someone else’s.

The Conservation of Reputation

One of the best-known laws of the physical world is the Conservation of Mass. Only slightly less well known is the law of Conservation of Reputation, which states that the average expert always takes the greatest care not to lose face or reputation in his or her dealings with the media. Upton Sinclair summed up this law nicely when he wrote –

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

Sadly enough, most experts believe that revisions of past forecasts, or indeed any change of opinion at all, will diminish and tarnish their reputation. And so, we can meet experts who will deny reality even when they meet it face-to-face. Some of them are probably blinded by their own big ideas and egos. Others probably choose to conserve what’s left of their reputation and dignity at any cost, even as they see their forecasts shrivel and wither in the light of the present.

The story of Larry Kudlow is particularly prominent in this regard. Kudlow forecast that President George W. Bush’s substantial tax cuts would result in an economic boom. The forecast fell flat, and the economy did not perform as well as it had during President Clinton’s tenure. Kudlow did not seem to notice, and declared that the “Bush Boom” was already here. In fact, in 2008 he proclaimed that the current progress of the American economy “may be the greatest story never told”. Five months later, Lehman Brothers filed for bankruptcy, and the entire global financial system collapsed along with that of the U.S.

I am going to assume that Kudlow was truly sincere in his proclamations, but obviously many other experts do not feel the need to be as honest, and will adhere to their past proclamations and declarations come hell or high water. And if we’re being totally honest, it must be said that the public encourages such behavior: in January 2009, The Kudlow Report (starring none other than Kudlow himself) began airing on CNBC. Indeed, sticking to your guns even in the face of reality seems to be one of the most important lessons for experts who wish to come out on top in the present – experts who assume, correctly, that few if any will force them to come to terms with their forecasts from the past.

Conclusion

In this text, the first of several, I’ve covered the Failure of Nerve in foresight and forecasting. The Failure of Nerve was originally identified by Arthur C. Clarke, but I’ve tried to make use of our current understanding of behavioral psychology to add more depth and to identify ways for people to overcome this all-too-common failure. Another book which has been very helpful in this endeavor was the recently published Superforecasting by Philip E. Tetlock and Dan Gardner, which you should definitely read if you’re interested in the art and science of forecasting.

There are obviously several other failures in foresight, which I will cover in future articles on the subject.