When we consider how to make better Natural Language Processing (NLP) technology, we are always thinking in terms of human comprehension as the benchmark. When I read something, how do I make sense of it? What does my brain do, and how fast does it do it? What kind of processing power does it take to do that?

We came across some interesting figures that speak to that last question. One widely cited estimate, often traced to roboticist Hans Moravec, puts the processing-power equivalent of the human brain at 100 million million-instructions-per-second (MIPS) of speed and 100 million megabytes (MB) of memory. No wonder you need 8 hours of sleep per night to clear out all the leftover junk from processing that much information all day long.

And everyone loves to talk about IBM's Watson winning Jeopardy. But as this article reminds us, "Watson needed 16 terabytes of memory, 90 powerful servers, a total of 2880 processor cores, and mind-boggling quantities of electrical power just to wrap its big computery head around the concept of wordplay." So that was no chip on a board that beat Ken Jennings.
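To put those two sets of numbers side by side, here is a quick back-of-envelope sketch in Python. The unit conventions (1 MIPS = one million instructions per second, decimal megabytes and terabytes) are our assumptions, and the figures are just the estimates quoted above, not measurements:

```python
# Back-of-envelope comparison: the brain estimate quoted above vs. Watson's
# published specs. Decimal units assumed: 1 MB = 1e6 bytes, 1 TB = 1e12 bytes.

BRAIN_MIPS = 100e6            # 100 million MIPS
BRAIN_IPS = BRAIN_MIPS * 1e6  # 1 MIPS = 1e6 instructions/sec -> 1e14 IPS

BRAIN_MB = 100e6                   # 100 million MB
BRAIN_TB = BRAIN_MB * 1e6 / 1e12   # -> 100 TB

WATSON_TB = 16                # Watson's reported memory

print(f"Brain estimate: {BRAIN_IPS:.0e} instructions/sec, {BRAIN_TB:.0f} TB")
print(f"Watson memory: {WATSON_TB} TB, "
      f"about {WATSON_TB / BRAIN_TB:.0%} of the brain estimate")
```

Even on these rough assumptions, Watson's 16 terabytes comes to only about a sixth of the memory side of that brain estimate, and that's before you compare instruction rates.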

But we are headed in the right direction with the workaday solutions we use here at Big Data Lens: smarter algorithms, more efficient code, faster processing on cloud infrastructure, and more. We may not get to the magic "100 and 100" benchmark even in our lifetime, but with each new leap there are excellent new vistas to explore and lots of new insight to be gained.
