Recently we've been looking back at some old analytical techniques to see if they apply in a big data era.  Some do and some don't.  Those that do share some common traits ... they are insightful in ways that stand the test of time, and they are just plain useful.

One such technique looks at the kind of things that throw off every forecast - things that have not yet happened and so were never built into a model or equation.  There are lots of things like this.  In the old days you would seek out subject matter experts and gather a range of opinions on the things that could bugger up the future.  Now we can use plug-and-play big data techniques to survey and account for things that have not yet happened.

Gather enough of these and you then simulate the outcomes to see how the factors combine into multiple possible futures.  Hedging our bets, you say?  No, no.  It's how you use the range of outcomes that makes this powerful.

Planning is always about managing uncertainty, and that is exactly what a range of outcomes gives us.  Ask "how likely is it that market demand for our goods will reach level X?" and you get an answer like 73%.  That is certainly a better answer than a point forecast you know is wrong before you have even heard it.
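To make that concrete, here is a minimal Monte Carlo sketch in Python.  Everything in it - the factor names, the probabilities, the demand threshold - is a hypothetical placeholder rather than anything from a real model; the point is simply to show how a few uncertain, not-yet-happened factors combine into a distribution of futures you can query for a probability instead of a point forecast.

```python
import random

def simulate_demand(n_runs=100_000, threshold=120.0, seed=42):
    """Combine a few uncertain factors into many simulated futures,
    then report how often demand clears a given level.

    All factors and numbers below are illustrative placeholders.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_runs):
        # Baseline demand with ordinary noise around a historical mean.
        base = rng.gauss(100.0, 10.0)

        # A factor not in any historical data: a possible new competitor.
        competitor_entry = rng.random() < 0.30          # assume a 30% chance
        competitor_effect = -15.0 if competitor_entry else 0.0

        # Another "has not happened yet" factor: a favorable regulation change.
        regulation_change = rng.random() < 0.20         # assume a 20% chance
        regulation_effect = rng.uniform(5.0, 25.0) if regulation_change else 0.0

        demand = base + competitor_effect + regulation_effect
        if demand >= threshold:
            hits += 1
    return hits / n_runs

if __name__ == "__main__":
    p = simulate_demand()
    print(f"Estimated probability demand reaches the target level: {p:.0%}")
```

The single number matters less than the fact that you can re-ask the question at any threshold, and update the factor assumptions as new data comes in.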

Smart companies use this old technique in the big data era to be more thorough in their planning, risk assessments and resource allocations.  Governments use this technique to measure the likelihood a foe will acquire a capability or advance a technology.

In the end this old technique lives again.  In fact it gets better with more data: big data collection and analysis now inform the range of possible futures where we used to rely on only a small number of subject matter experts.  Welcome to Big Data.
