Big Data is breathing new life into another analytical technique we dusted off recently.  It's about understanding how technology develops over time into finished products.

Every company that dreams up a new product faces choices about its inputs along the way: buy or build, buy this one or that one, buy that one but modify it, and so on.  Each input choice influences the timing and the features of the finished product.  Input A may be better for the finished product but take longer to complete, holding everything up, so perhaps the lesser input B is sufficient.

Big Data comes in because it helps inform the choices, capabilities, and timing of the inputs into the finished product.  The analytical technique, then, is to specify each input in terms of its probability of completion, and its capabilities, by a certain time.  Now you have a network you can simulate.  In simplified form, it looks something like this:

[Figure: simplified network of inputs combining into sub-systems and a finished product]

Inputs are specified by probabilities of time to completion and capabilities.  Inputs combine into sub-systems, which are in turn specified by probabilities of time to completion and capabilities.  Note also that the network captures choices of one input over another, and in some cases the condition that two inputs must both be complete rather than one being chosen over the other.
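To make the mechanics concrete, here is a minimal sketch in Python (all input names, numbers, and distributions are hypothetical, not drawn from any real project): each input carries a triangular distribution over its completion time plus a capability score, a choice between two inputs takes whichever alternative finishes first, and a sub-system that needs two inputs waits for the slower of them.  Running the network many times yields a distribution of finish times and capabilities for the finished product.

    import random

    # Hypothetical inputs: (optimistic, likely, pessimistic) completion times
    # in months, plus a capability score for the finished product.
    INPUTS = {
        "input_A": {"time": (4, 6, 12), "capability": 9},   # better, but riskier schedule
        "input_B": {"time": (3, 4, 6),  "capability": 6},   # lesser, but faster
        "input_C": {"time": (5, 7, 10), "capability": 7},
    }

    def sample_time(name):
        low, mode, high = INPUTS[name]["time"]
        return random.triangular(low, high, mode)

    def simulate_once():
        # Sub-system 1: a CHOICE of A or B -- take whichever alternative finishes first.
        t_a, t_b = sample_time("input_A"), sample_time("input_B")
        if t_a <= t_b:
            sub1_time, sub1_cap = t_a, INPUTS["input_A"]["capability"]
        else:
            sub1_time, sub1_cap = t_b, INPUTS["input_B"]["capability"]

        # Finished product: needs BOTH sub-system 1 and input C to be complete.
        t_c = sample_time("input_C")
        finish_time = max(sub1_time, t_c)
        capability = sub1_cap + INPUTS["input_C"]["capability"]
        return finish_time, capability

    def simulate(runs=10_000):
        results = [simulate_once() for _ in range(runs)]
        times = sorted(t for t, _ in results)
        caps = [c for _, c in results]
        return {
            "median_finish": times[len(times) // 2],
            "p90_finish": times[int(0.9 * len(times))],
            "mean_capability": sum(caps) / len(caps),
        }

    if __name__ == "__main__":
        print(simulate())

The interesting outputs are the ones described above: which combination of choices gives the shortest likely timeframe, which gives the most robust capabilities, and how sensitive each is to a single slow input.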

These networks can get very big very fast.  In the past, this technique might be used to model a future enemy weapons system.  A submarine or an aircraft is fiendishly complex, so simulation techniques are used to estimate likely outcomes: outcomes with the shortest timeframe, outcomes with the most robust capabilities, and so on.  But the networks were so complex and time-consuming to estimate that they were rarely kept up to date.  Big Data comes into play now by keeping such networks alive and useful, both through changes to the inputs and through faster machine learning re-estimations that keep the simulations current.
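As an illustration of that re-estimation step (again a hypothetical sketch, not anyone's actual pipeline): when fresh data about an input arrives, such as recent supplier lead times, its completion-time spec can be refit and the whole network re-run, which is what keeps the model alive rather than stale.

    import statistics

    def reestimate(observed_times):
        """Refit a (low, mode, high) triangular spec from recent observations."""
        return (min(observed_times),
                statistics.median(observed_times),
                max(observed_times))

    # Made-up recent lead times for a single input; in practice these would
    # stream in from the data described above, and the simulation would be re-run.
    recent_lead_times = [5.5, 7.0, 8.2, 6.4, 9.1]
    print(reestimate(recent_lead_times))  # updated (low, mode, high) for that input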
