Think BR: Odds on for disaster

Forecasting will never be an exact science but with the right approach it can deliver accurate results, writes Richard Boyko, director, Ipsos Marketing.

Hurricane Sandy


Forecasting is tough, as by definition you are going to be wrong. To quote Nate Silver: "In November 2007, just one month before it began, economists foresaw less than a 1-in-500 chance of an economic meltdown as severe as the one that began a month later.

"Attempts to predict earthquakes have continued to envisage disasters that never happened and failed to prepare us for those, like the 2011 disaster in Japan, that did."

Earlier this year, six scientists in Italy were convicted for failing in their public duty to warn of a recent earthquake.

You can imagine that forecasters were extremely busy tracking Hurricane Sandy. The decision to evacuate New York City cannot have been taken lightly, but with lessons learnt from previous events - think Katrina - many lives were undoubtedly saved.

Hurricane forecasting has shown dramatic improvements, with the average miss in predicted trajectory today being only 100 miles, versus an average of 350 miles just 25 years ago.

Some of the challenges faced in forecasting are nicely illustrated by what is known as the forecaster's dilemma. Usually framed in the context of dining out, the same forecasting challenges apply equally in many other situations, from deciding which mode of transport to take to work to forecasting new product success.

Suppose there is a popular restaurant in town you want to visit for dinner. You know that on a busy night you could be waiting an hour or more to be seated. However, if it's quiet, you will be seated quickly.

The forecasting challenge, therefore, is how busy you think the restaurant will be. If you think it will be quiet, you are more likely to go than if you think it will be busy.

A level of complexity is added by the fact that there are many people thinking through the same scenarios. If they all think it will be quiet and they all turn up, the restaurant will be extremely busy.

On the other hand, if everyone thinks it will be busy, they will stay away and the place will be extremely quiet. So it becomes a complex question not only of whether you think the restaurant will be busy, but also of guessing what others will do and planning accordingly.
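The feedback loop described above can be sketched in a few lines of code. This is a minimal toy simulation, not anything from the article: all the numbers (100 diners, a comfort threshold of 60, everyone forecasting tonight's crowd from last week's attendance) are illustrative assumptions, chosen only to show how identical forecasts make behaviour swing between the two extremes.

```python
# Toy simulation of the restaurant dilemma: if every diner uses the
# same naive forecast (last week's attendance), the crowd oscillates
# between packed and empty. All parameters are illustrative.

DINERS = 100     # people deciding whether to go out tonight
THRESHOLD = 60   # a diner goes only if they expect fewer than this

def simulate(weeks, first_guess=30):
    """Return the attendance each week under a shared naive forecast."""
    attendance = []
    forecast = first_guess
    for _ in range(weeks):
        # Everyone reasons identically, so they all go or all stay home.
        goers = DINERS if forecast < THRESHOLD else 0
        attendance.append(goers)
        forecast = goers  # next week's forecast is this week's crowd
    return attendance

print(simulate(6))  # [100, 0, 100, 0, 100, 0]
```

A forecast that everyone shares and acts on defeats itself: predicting "quiet" fills the room, predicting "busy" empties it. Stable outcomes only emerge when people forecast differently from one another.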

To help form your forecast you could collect other data. Previous experience may help: how busy was it the last time we went on a Thursday? Do people go out to dinner less on Wednesdays? You could call the restaurant and ask how many bookings they have or how busy they are, though there is no guarantee things will not have changed by the time you arrive.

How strongly you want to eat at that particular restaurant, or to avoid waiting in line, may ultimately drive you to stay in or go out.

These complexities carry over into forecasting new product success. The models we have developed to estimate how many boxes a new item will sell are fed by what we deem key information.

As new information becomes available, we update our forecast; once the new item is launched, we review how well we did so we can do better the next time.

For example, it is important to try and forecast likely competitive reactions to any new product launch. Testing the new initiative in a competitive context helps to assess the relative strength of the offering.

When creating and developing a marketing support strategy, this information can be supplemented by considering what the likely competitive response to the launch might be.

This point is important. I worked with one business where the team, facing an impending launch, recognised that it would elicit a strong competitive response and dedicated a lot of time to working out what that response might be.

They finally decided on the most likely scenario - and then proceeded to do nothing with the information. They launched behind their original marketing plan, and competitors reacted just as the team had envisaged.

Unfortunately, having developed no counter-strategy, the team could only watch as the competitive reaction savagely halted the progress of their new offering.

I was struck, watching the Abu Dhabi Grand Prix, when Christian Horner, the principal of Red Bull Racing, was interviewed about his driver's demotion, for a technical fault, to start the race from the pit lane after having qualified in pole position.

When asked how you plan for such an event, Horner said that Red Bull look at all the possibilities and then develop a strategy for each one - an approach that paid off, as Red Bull Racing guided their driver to a podium place. Businesses can learn a lot from such thinking.

Forecasting will never be an exact science; however, with a disciplined approach, careful scenario planning, the right expertise and maybe a little luck, a forecasting exercise can deliver accurate results.

Such results can help ensure the successful launch of a new product, better placing it to contribute, eventually, to building a strong business.

In the extreme, a successful forecast may save many lives; at the very least, it might secure a quick trip to dinner.

Richard Boyko, director, Ipsos Marketing
