Simple forecasts are best: easy to prove, hard to act on


Gary Johnson

Inpharmation

All good managers know that focus, clarity and simplicity are the hallmarks of good business. And yet, those same managers are apt to indulge in staggering complexity when it comes to forecasting. Why?

The most obvious reason would be that complex forecasting models produce more accurate forecasts. However, a vast body of studies has shown, over and over again, that more complex forecasting models do not produce greater accuracy.1 (And, often, they produce worse accuracy.)

Not only is this true, it is true for every type of forecasting methodology that has been studied. For example:

• Highly qualified experts are no better at forecasting than novices.1

• Complex trend (“extrapolation”) models (like the Box-Jenkins model, which uses complex mathematics originally developed to model the astrophysics of sunspot activity) are no more accurate than mind-numbingly simple extrapolation models like the naïve forecast1 (the trend tomorrow will be the same as the trend today).

• Complex conjoint models (models of how customers choose between products) are no better than simple conjoint models that are a fraction of the cost and far easier to use and understand.2
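To make the naïve forecast concrete, here is a minimal sketch (in Python, with made-up sales figures purely for illustration) of the two simplest extrapolation models: repeating today's value, and repeating today's trend:

```python
# A minimal sketch of two "mind-numbingly simple" extrapolation models.
# The sales history below is hypothetical, for illustration only.

def naive_forecast(history):
    """Naive forecast: tomorrow's value equals today's value."""
    return history[-1]

def naive_trend_forecast(history):
    """Naive trend forecast: tomorrow's change equals today's change."""
    return history[-1] + (history[-1] - history[-2])

sales = [100, 104, 110, 113]          # hypothetical monthly sales
print(naive_forecast(sales))          # 113
print(naive_trend_forecast(sales))    # 113 + (113 - 110) = 116
```

Despite taking two lines each, models of this kind are the benchmark that far more elaborate extrapolation methods have repeatedly failed to beat.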

 

"…it is a well established empirical fact that making things complicated does not help forecast accuracy..."

 

So, when it is a well established empirical fact that making things complicated does not help forecast accuracy, why do people keep over-complicating things? There seem to be three key reasons.

The first reason is that people often consult “experts” to help them with their forecasts. Most experts are a solution waiting for a problem.

In a classic 1958 study, Dearborn and Simon presented a group of executives with a long and complex case study.3 They then asked the executives what the root of the problem was. Five out of six sales executives saw it as a sales problem. All four production executives saw it as a production problem.

It’s the same with forecasting experts. Typically, experts have invested a lot of time learning complex forecasting techniques and it is psychologically (and financially) difficult for them to accept that all that complexity is useless.

Expert forecasters’ livelihoods depend upon their models being so complex that only they can understand and operate them. A survey of such experts found that 72% agreed that complexity enhanced accuracy and only 9% disagreed. Despite the fact that the “experts’” confidence in complex forecasting techniques is misplaced, we have a terrible propensity to look to experts rather than to facts.

One of the (many) studies that have demonstrated our unthinking confidence in “experts” is the famous “Dr Fox lecture”.4 Dr Fox was an actor: a distinguished-looking and authoritative-sounding actor. Like most actors, he knew absolutely nothing about 'Mathematical Game Theory as Applied to Physician Education'. And yet, armed with a fictitious, impressive-looking résumé, this is the very topic he lectured on to 55 psychologists, educators, psychiatrists, social workers and administrators. The talk lasted a full hour. What’s more, the entirely ignorant Dr Fox took half an hour of questions on this subject about which he knew absolutely nothing. His entire performance consisted of double talk, meaningless words, false logic, contradictory statements, irrelevant humour and meaningless references to unrelated topics.

 

"…experts have invested a lot of time learning complex forecasting techniques and it is psychologically (and financially) difficult for them to accept that all that complexity is useless."

 

And yet, according to a questionnaire administered at the end of each session, the audience found the lecture to be clear and stimulating. They held Dr Fox in high regard. No one realised that the whole thing was nonsense from beginning to end.

So, we are prone to pay too much attention to “experts” who blind us with science. But the experts have the wind at their backs. The second key reason why we over-complicate forecasts is that our intuition agrees with what the “experts” are telling us. We know that the world out there is complicated. Intuitively, we feel that a forecasting model has to be complicated to stand any chance of mimicking the real world. Surprisingly, it doesn’t work like that. Even the most complex systems can be modelled just as effectively with simple models as with complex models.

There is a branch of science concerned with the study of complex systems (known as complexity theory). The pre-eminent figure is Per Bak, Professor of Theoretical Physics at the Niels Bohr Institute in Copenhagen and discoverer of ‘self-organised criticality’. This is what Bak says about modelling complex systems in general:5 “Insight seldom arises from complex messy modelling (sic), but more often from gross simplifications.”

The third and final reason why we tend to opt for complexity is that it often helps us deal with the reality of corporate politics. We have to defend our forecasting models to our peers and our bosses. They are wired with the same prejudices as us: listen to experts and use complex models to forecast a complex world. Who needs the hassle of fighting these institutional forces?

Actually, there are four major benefits to keeping our forecasting models simple:

Firstly, simple models are user friendly: they are cheap, fast to build and easy to use. Glen Urban, Professor of Management at M.I.T. and one of the high priests of management science, says that: 'Although there are sophisticated management science models, very few complex ones have achieved continuing use.'6

According to John Little, Professor of Management at M.I.T., models have to be: 'Quick, quick, quick. If you can answer people's questions right away, you will affect their thinking. If you cannot, they will make their decisions without you and go on to something else.'7

Secondly, if you keep things simple you are less likely to make mistakes. “Armstrong’s laws” for using forecasting models are: (1) keep it simple and (2) don't make mistakes. The idea is that if you obey rule 1, rule 2 becomes easier to follow.

 

"(1) keep it simple and (2) don't make mistakes. The idea is that if you obey rule 1, rule 2 becomes easier to follow."

 

And, you are less likely to make the mistake of believing charlatans who shelter behind the sophistry of complex models. Forecasting authority Stephen Schnaars advises us to 'Be especially suspicious of forecasts made by forecasters enamoured by statistical jargon. A good rule of thumb is to discount estimates in direct proportion to the number of times they mention "parameters", "estimation procedures", and "optimization techniques".'8

Thirdly, if you keep your models simple, you can concentrate on the key determinant of forecasts’ accuracy: the assumptions you make.

William Ascher studied many forecasts and looked at where the inaccuracies came from.9 He concluded that mistaken assumptions were a much bigger problem than faulty models.

Fourthly and lastly, if you keep your forecasting models simple, you will have the time and money to forecast in a number of different ways. A huge number of studies have proven that the key to forecast accuracy is to forecast in several different ways and combine the results.1 This is easy to do if you keep things simple, but practically impossible if you let things get complex.
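Combining forecasts needs nothing more sophisticated than an equal-weighted average of the individual estimates. A minimal sketch (the individual forecasts below are hypothetical, for illustration only):

```python
# A minimal sketch of combining forecasts: an equal-weighted average
# of several independently produced forecasts. The figures are made up.

def combine_forecasts(forecasts):
    """Equal-weighted average of several independent forecasts."""
    return sum(forecasts) / len(forecasts)

# e.g. a trend extrapolation, an expert judgement and a market model:
print(combine_forecasts([120.0, 150.0, 135.0]))  # 135.0
```

The point is that the combination step itself is trivial; the expensive part is producing several independent forecasts, which is only affordable if each one is simple.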

Keeping things simple requires knowledge and courage. The aim of this article has been to give you some of the necessary knowledge.

References:

1. Johnson G. Sales Forecasting for Pharmaceuticals: An Evidence Based Approach. Henley on Thames: London Scientific Publishing, 2005.

2. Johnson G. Let's Take the Con Out of Conjoint Analysis: The Evidence for Keeping It Simple. Henley on Thames: London Scientific Publishing, 2007.

3. Dearborn DC, Simon HA. Selective Perception: A Note on the Departmental Identification of Executives. Sociometry 1958,21:140-4.

4. Naftulin DH, Ware JE, Jr., Donnelly FA. The Doctor Fox Lecture: A paradigm of educational seduction. Journal of Medical Education 1973,48:630-35.

5. Bak P. How Nature Works: The science of self-organized criticality. Oxford: Oxford University Press, 1997.

6. Urban GL, Hauser JR. Design and Marketing of New Products. Englewood Cliffs, New Jersey: Prentice Hall, 1993.

7. Little JDC. Models and Managers: The Concept of a Decision Calculus. Management Science 1970,16(April):B466-B485.

8. Schnaars S. Situational factors affecting forecast accuracy. Journal of Marketing Research 1984,21:290-7.

9. Ascher W. Forecasting: An Appraisal for Policy Makers and Planners. Baltimore: Johns Hopkins Press, 1978.

About the author:

Gary Johnson is author of Sales Forecasting for Pharmaceuticals: An Evidence Based Approach. His company provides modelling software and consultancy to all the world’s leading pharma companies. Gary has been shortlisted for the MCA Business Book of the Year Awards and has won numerous best paper and speaking awards.

Email: gary@inpharmation.co.uk  Web: www.inpharmation.co.uk

What aspects of your forecasts could be simplified?

Rebecca

23 March, 2011