Forecasting

Article written by Mark Easdown

Decision Making & Planning, Ways of Working with Uncertainty

The only function of economic forecasting is to make astrology look respectable.
— John Kenneth Galbraith
Forecasts usually tell us more of the forecaster than of the future.
— Warren Buffett
There is great value in bringing together people who attempt to address a common problem of forecasting from different perspectives and based on very different kinds of data.
— Chris Wood, Santa Fe Institute
Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won’t come in.
— Isaac Asimov
For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded.
— Philip Tetlock
I prefer true but imperfect knowledge, even if it leaves much indetermined and unpredictable, to a pretence of exact knowledge that is likely to be false.
— Friedrich von Hayek
The Lucretius underestimation, after the Latin poetic philosopher who wrote that the fool believes the tallest mountain in the world will be equal to the tallest one he has observed.
— Nassim Taleb

A forecast is a statement about the future. (Clements & Hendry, 1998)

As the authors of “Forecasting” (J. Castle, M. Clements, D. Hendry) note, a forecast can take many forms:

  • Some are vague and some are precise; some are concerned with the near term and some with the distant future

  • “Fore” denotes in advance, whilst “cast” might sound a bit chancy (cast a fishing net, cast a spell) or more solid (bronze statues are also cast)

  • Chance is central to forecasting, and forecasts can and often do differ from outcomes

  • Forecasts should be accompanied by some level of certainty/uncertainty, a time horizon and upper/lower bounds (see the sketch after this list)

  • The domain in which the forecast occurs matters, especially if no-one knows the complete set of possibilities
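One way to make that last point concrete is to record a forecast as a structured statement rather than a bare number. The Python sketch below is illustrative only – the field names and figures are hypothetical, not drawn from the book.

from dataclasses import dataclass
from datetime import date

@dataclass
class Forecast:
    # A forecast as a statement about the future, with its uncertainty made explicit
    question: str          # what is being forecast
    point_estimate: float  # central estimate
    lower_bound: float     # plausible lower bound
    upper_bound: float     # plausible upper bound
    confidence: float      # stated probability that the outcome falls within the bounds
    horizon: date          # the date the forecast refers to

# Hypothetical example: quarterly revenue in $m, with an 80% interval and an explicit horizon
q_revenue = Forecast(
    question="Revenue for the September quarter ($m)",
    point_estimate=12.5,
    lower_bound=10.0,
    upper_bound=15.0,
    confidence=0.80,
    horizon=date(2025, 9, 30),
)
print(q_revenue)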

The authors consider the history of forecasting:

  • Forecasting likely pre-dates recorded writing, with hunter-gatherers seeking to anticipate where game or predators might be, along with edible plants and water supplies. Babylonians tracked the night sky, presumably to time the planting and harvesting of crops

  • Sir William Petty perhaps introduced early statistical forecasting in the 17th century and thought he observed a seven-year “business cycle”

  • Weather forecasting evolved with Robert FitzRoy in 1859, who sought to devise a storm warning system to enable the safe passage of ships and avoid both the loss of vessels and ships staying in port unnecessarily. Forecasting of nature extended to hurricanes, tropical cyclones, tornadoes, tsunamis and volcanic eruptions.

  • Yet history is littered with failures in forecasting, large and small:

    • Ambiguous forecasts from the Oracle of Delphi and Nostradamus

    • The UK storm of 1987, with lives lost and approximately 15 million trees blown down

    • Failure to predict the 1929 Great Depression or the severity of the Global Financial Crisis of mid-2007 to early 2009

    • “When the Paris exhibition closes, the electric light will close with it and no more be heard of it” – Sir Erasmus Wilson, and “a rocket will never be able to leave the earth’s atmosphere” – New York Times, 1936

What do we want from forecasts?

  • Do we want just accuracy? To what degree is that even possible across complicated and complex domains?

  • Do we want cognitively diverse teams to make us more aware of extreme events, thus minimising downside risks?

  • Do we just want comfort, ideological support and evidence for our existing beliefs? Do we want entertainment?

  • Do we want to influence a target audience, shift consensus or established beliefs?

 

These answers may differ if you are a CEO, CFO, Head of Sales, Head of Innovation, an Insurance Actuary, Epidemiologist, Politician, Economist, Intelligence Agency, Shock Jock or Sports Commentator.

For example, it was a mainstream view among epidemiologists across the last 20 years that a pandemic was a prominent risk:

“The presence of a large reservoir of SARS-CoV-like viruses in horseshoe bats, together with the culture of eating exotic mammals in southern China, is a time bomb. The possibility of the re-emergence of SARS and other novel viruses from animals or laboratories and therefore the need for preparedness should not be ignored.” – Cheng et al., Clinical Microbiology Reviews, 2007 (quoted by David Epstein)

https://davidepstein.com/lets-get-ready-to-rumble-humanity-vs-infectious-disease/

https://cmr.asm.org/content/cmr/20/4/660.full.pdf 

So, is COVID-19 perhaps SARS 2? Clearly, forecasting a pandemic is desirable. How do we give prominence to diverse voices and data, and what are the better practices to observe and implement?

SUPER-FORECASTING

In October 2002, the US National Intelligence Estimate (a consensus view of the CIA, NSA, DIA and thirteen other agencies employing more than 20,000 intelligence analysts) concluded that the key claims of the Bush Administration about Weapons of Mass Destruction in Iraq were correct. After invading Iraq in 2003, the US found no evidence of WMDs. “It was one of the worst – arguably the worst – intelligence failure in modern history,” note Philip Tetlock and Dan Gardner in their book “Superforecasting: The Art and Science of Prediction”.

In 2006, IARPA was formed to fund cutting-edge research with the aim of enhancing the work of the intelligence community. IARPA’s plan was to create a tournament-style incentive for top researchers (intelligence analysts, universities and a team of volunteers in the Good Judgment Project (GJP)) to generate accurate probability estimates for questions that were:

  • Neither so easy that an attentive reader of the NY Times could get them right, nor

  • So hard that no one on the planet could get them right

Approximately 500 questions spanned economic, security, terrorism, energy, environmental, social and political realms.

Forecast performance was monitored individually and in teams, and Tetlock’s GJP team proved 60% more accurate in year one and 78% more accurate in year two.

What did these forecasting tournaments reveal about the attributes of super-forecasters that may be of relevance in commercial or government organisations? Here are a few:

  • Superforecasters spoke in probabilities of how likely an event was to occur (not in absolutes: yes/no). This better enabled them to accept a level of uncertainty, and it made them more thoughtful and accurate

  • Superforecasters were often educated yet ordinary people with an open mind, an ability to change their minds, humility, and an ability to review assumptions and update forecasts frequently, albeit at times by small increments

  • Actions which were helpful included:

    • Breaking the question down into smaller components and identifying the known and the unknown; focusing on work that is likely to have a better payoff; actively seeking to distinguish degrees of uncertainty; avoiding binding rules; considering the “outside” view and framing the problem not as unique but as part of a wider phenomenon

    • Examining what is unique about the problem and how your opinions differ from other people’s viewpoints; taking in all the information with “dragonfly eyes” and constructing a unified vision, balancing arguments and counterarguments, prudence and decisiveness, to generate a description that is as clear, concise and granular as possible

    • Not over-reacting to new information – a Bayesian approach of updating beliefs incrementally as evidence arrives was useful (see the sketch after this list)

  • The GJP found that while many forecasters were accurate within a horizon of 150 days, not even the super-forecasters were confident beyond 400 days, and forecasts out to five years were about equal to chance.

  • What about forecasting teams versus forecasting individuals?

    • With good group dynamics, flat and non-hierarchical structures and a culture of sharing, teams were better than individuals – aggregation was important. In fact, teams of super-forecasters could beat established prediction markets.

    • The note of caution around low-performing teams came when people were lazy, let others do the work or were susceptible to group-think.
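A minimal sketch of the Bayesian habit mentioned above – nudging a probability up or down as each piece of evidence arrives, rather than over-reacting to it. The question, prior and likelihoods below are invented purely for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    # Posterior probability of a hypothesis after seeing one piece of evidence (Bayes' rule)
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical forecast question: "Will the incumbent win the election?"
belief = 0.60  # initial forecast: 60%

# A supportive poll, somewhat more likely to appear if the incumbent is on track to win
belief = bayes_update(belief, p_evidence_if_true=0.70, p_evidence_if_false=0.50)
print(f"After the poll: {belief:.2f}")        # ~0.68 – a modest, not dramatic, shift

# A negative story, only slightly more likely to appear if the incumbent is going to lose
belief = bayes_update(belief, p_evidence_if_true=0.40, p_evidence_if_false=0.55)
print(f"After the news story: {belief:.2f}")  # ~0.60 – updated, but not over-reacted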

Uncharted: How to Map the Future Together
— Margaret Heffernan

Heffernan describes Daryl Plummer of Gartner, a technology advisory firm that produces forecasts for customers who wish to discern hype from reality.

The prediction process starts with propositions, which are then verified, quantified and made actionable. A robust peer review occurs, and 95% of predictions are modified along the way. Plummer routinely compares predictions with actual events, and the results are highlighted at conferences – championing the successes and sharing insights from those that were wrong. “Nobody here is hired because they’re psychic; they’re hired to generate insights that are useful – even if they turn out wrong. It’s useful to get you thinking”.

The author notes that “what matters most isn’t the predictions themselves but how we respond to them, and whether we respond to them at all. The forecast that stupefies isn’t helpful, but the one that provokes fresh thinking can be. The point of predictions should not be to surrender to them but to use them to broaden and map your conceptual, imaginative horizons. Don’t fall for them – challenge them.”


“How to Decide: Simple Tools for Making Better Decisions” – Annie Duke

The author presents some useful tips that teams can use to elicit uninfected feedback and leverage the true wisdom of the crowd in decision making. This is especially useful where key forecast and value-chain insights and institutional knowledge are held across multiple SMEs and stakeholders:

The Problem:

  • “When you tell someone what you think before hearing what they think, you can cause their opinion to bend towards yours, often times without them knowing it.” “The only way somebody can know that they’re disagreeing with you is if they know what you think first. Keeping that to yourself when you elicit feedback makes it more likely that what they say is actually what they believe.” “To get high quality feedback it’s important to put the other person as closely as possible into the same state of knowledge that you were in at the time you made the decision.” “Belief contagion is particularly problematic in groups.”

Tips to elicit those insightful cross-functional perspectives:

  • Elicit initial opinions individually and independently before the group meets. Specify the type of feedback or insights required and request that an email or written thoughts be provided before the meeting. Collate these initial opinions and share them with the group prior to the meeting. Then focus on areas of “divergence” or “dispersion”, and avoid any language around “disagreement”

  • Anonymise feedback to the group – this removes any influence from the insights or opinions of higher-status individuals

  • Anonymising feedback also gives equal weight to insights and opinions and allows outside-the-box perspectives to be heard

  • Anonymised feedback will also allow misunderstandings to be discussed and the team to grow in knowledge together

  • If the team needs to make a decision within a meeting, try:

    • Writing down insights and passing them to one person to write on a whiteboard – maintaining anonymity

    • Writing down your insights and passing them to another person to read aloud to the group

    • If you must read your own thoughts to the group, start with the most junior member and work towards the most senior

“Radical Uncertainty: Decision-Making for an Unknowable Future”

Authors: John Kay & Mervyn King

“The belief that mathematical reasoning is more rigorous and precise than verbal reasoning, which is thought to be susceptible to vagueness and ambiguity, is pervasive in economics.” And Jean-Claude Trichet, on the 2007-2008 GFC: “As a policy-maker during the crisis, I found the available models of limited help. In fact, I would go further: in the face of the crisis, we felt abandoned by conventional tools.”

The authors draw a number of helpful lessons on the use of economic and financial models in business and in government:

  • Use simple models and identify the key factors that influence an assessment. Adding more and more elements to a model is to follow the mistaken belief that a model can describe the complexity of the real world. The better purpose of a model is to frame “small world” problems which illuminate part of the large world of radical uncertainty

  • Having identified model parameters that are likely to make a significant difference to your assessment, go and do some research in the real world to obtain evidence on the value of these parameters to customers or stakeholders. Simple models provide flexibility to explore the effects of modifications or scenarios.

A model is useful only if the person using it recognises that it does not represent the world as it really is; rather, it is a tool for exploring ways in which decisions might or might not go wrong.
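In that spirit, a “small world” model can be a few lines of code that let you vary the handful of parameters you believe matter most and watch how the assessment moves. The Python sketch below is illustrative only – the parameters and figures are hypothetical, not taken from the book.

def annual_profit(customers, conversion, avg_spend, fixed_costs, margin):
    # A deliberately simple "small world" model of annual profit
    revenue = customers * conversion * avg_spend
    return revenue * margin - fixed_costs

# Hypothetical base case and scenarios, varying only the parameters believed to matter most
scenarios = {
    "base":       dict(customers=100_000, conversion=0.040, avg_spend=120.0, fixed_costs=150_000, margin=0.55),
    "downturn":   dict(customers=100_000, conversion=0.030, avg_spend=100.0, fixed_costs=150_000, margin=0.55),
    "price_rise": dict(customers=100_000, conversion=0.035, avg_spend=140.0, fixed_costs=150_000, margin=0.60),
}

for name, params in scenarios.items():
    print(f"{name:>10}: profit = ${annual_profit(**params):,.0f}")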

“Uncertainty” – Howard Marks: https://www.oaktreecapital.com/insights/howard-marks-memos/

In his May 2020 memo to Oaktree clients, Howard Marks notes that the field of economics is muddled and imprecise: there are no rules one can count on to consistently show causation, and while patterns tend to repeat and may be historical, logical and often observed, they remain only tendencies. Excessive trust in forecasts is dangerous.

When considering current forecasts, he notes that the world is more uncertain today than at any other time in our lifetimes, that the ability to deal intelligently with uncertainty is one of the most important skills, that the bigger the topic (world, economy, markets, currencies, interest rates) the less possible it is to achieve superior knowledge, and that we should seek to understand the limitations of our foresight.

A forecast is a statement about the future, a future we cannot know everything about, yet it remains a useful tool for decision making, scenario modelling, stress testing and planning. The map is not the territory, so with forecasting we should learn from better practices: collate diverse views and data, build cognitively diverse teams, constantly challenge assumptions, leverage the wisdom and insights of subject matter experts, maintain intellectual humility and resilience in the face of uncertainty, use models wisely and adopt a Bayesian approach.

No amount of sophistication is going to allay the fact that all your knowledge is about the past and all your decisions are about the future.
— Ian Wilson, formerly of General Electric

LOOKING TO CURATE YOUR BUSINESS STRATEGY? REACH OUT.

Whiteark is not your average consulting firm, we have first-hand experience in delivering transformation programs for private equity and other organisations with a focus on people just as much as financial outcomes.

We understand that execution is the hardest part, and so we roll our sleeves up and work with you to ensure we can deliver the required outcomes for the business. Our co-founders have a combined experience of over 50 years working as executives in organisations delivering outcomes for shareholders. Reach out for a no-obligation conversation on how we can help you. Contact us on whiteark@whiteark.com.au

Article written by Mark Easdown
