Big data in context is how real understanding begins
Big data, big picture, big salad. It really doesn’t matter, according to IBM’s Jeff Jonas: The more the merrier. Jonas, the big brain behind IBM’s business analytics efforts, spoke about the latest technological developments in crunching big numbers at the recent IBM PartnerWorld conference in New Orleans. And what’s going on is a big deal — very big.
Jonas, chief scientist of the IBM Entity Analytics group and an IBM Distinguished Engineer, doesn’t fear large data sets. He’s working on systems that work better and more efficiently the more data there is to crunch. “Big data, new physics,” he calls it.
“It’s really about big data in context,” he said. “On this journey, with context, you end up with having higher-quality predictions because both your false positives and your false negatives are declining.”
Whereas data managers for years have worried about data cleansing, data hygiene and data sterilization, Jonas says that more data — and more information about that data — helps reveal patterns that would otherwise go undetected. “Your bad data becomes your friend. It turns out you don’t want to overly clean your data.”
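To make that counterintuitive claim concrete, here is a minimal, illustrative sketch — not IBM’s actual system, just a toy entity-resolution example with made-up records. Two clean records that look unrelated become linkable only through a third, “dirty” record (a misspelled name) that happens to carry identifying features of both. Dropping the messy record during cleansing would destroy the connection.

```python
# Toy example: why "bad" data can be your friend in entity resolution.
# A dirty record bridges two otherwise-unlinkable clean records.

records = [
    {"id": 1, "name": "Robert Smith", "phone": "555-0100"},
    {"id": 2, "name": "Bob Smyth",    "email": "bs@example.com"},
    # The "dirty" record: misspelled name, but it carries BOTH the phone
    # number and the email address, bridging records 1 and 2.
    {"id": 3, "name": "Robrt Smith",  "phone": "555-0100", "email": "bs@example.com"},
]

def shared_features(a, b):
    """Count identifying features (other than name) the two records share."""
    keys = (set(a) & set(b)) - {"id", "name"}
    return sum(1 for k in keys if a[k] == b[k])

def resolve(records):
    """Naive union-find: link any pair of records sharing an identifying feature."""
    parent = {r["id"]: r["id"] for r in records}
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for a in records:
        for b in records:
            if a["id"] < b["id"] and shared_features(a, b) > 0:
                parent[find(b["id"])] = find(a["id"])
    return {r["id"]: find(r["id"]) for r in records}

clusters = resolve(records)
# With the dirty record present, all three records resolve to one entity;
# with only the two clean records, they stay separate.
print(clusters)
```

With all three records, every id maps to the same cluster root; run `resolve(records[:2])` and the two clean records remain distinct entities — the context the dirty record supplied is gone.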
The next generation (dubbed G2) of business intelligence systems Jonas is working on will be able to evaluate new observations against previous ones in real time, he reported. It will also be able to handle “abstract entities” and “exotic features,” and become tolerant of “uncertainty or disagreement.” In other words, G2 will be able to “learn.” And all in a response time of less than 200 milliseconds.
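The idea of evaluating each new observation against everything seen so far can be sketched in a few lines — again purely illustrative, not the actual G2 design, with a deliberately naive matching rule (any shared feature counts) standing in for real analytics:

```python
# Illustrative sketch of incremental, real-time evaluation: each incoming
# observation (a set of feature strings) is compared against all prior
# context before being added to it, so later evidence can connect or
# revise what came before.

import time

class IncrementalResolver:
    def __init__(self):
        self.observations = []  # accumulated context

    def observe(self, obs):
        """Evaluate a new observation against prior context as it arrives."""
        start = time.perf_counter()
        # Naive rule: a prior observation "matches" if it shares any feature.
        matches = [o for o in self.observations if obs & o]
        self.observations.append(obs)
        elapsed_ms = (time.perf_counter() - start) * 1000
        return matches, elapsed_ms

r = IncrementalResolver()
first, _ = r.observe({"phone:555-0100"})                      # nothing to match yet
second, ms = r.observe({"phone:555-0100", "email:bs@example.com"})
print(first, second, ms)
```

The first observation arrives into empty context and matches nothing; the second matches the first via the shared phone feature. A production system would replace the linear scan with indexed lookups to stay inside a tight latency budget like the sub-200-millisecond figure Jonas cites.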
Is there any limit to our ability to understand the world around us through data? That’s the big question.