Much of the conversation surrounding Big Data is about, well, how big it is. The sheer volume of information available is a frequent topic of discussion, and for good reason: a startling amount of data has been collected, in quantities that were once unthinkable. Reasonable as that focus is, though, it misses one of the primary benefits of the data revolution.
Namely, that it's not just about how much information is out there, but about how much we can do with it. Through the use of algorithms, the step-by-step rules that govern a computation, researchers can now solve complex problems in a fraction of the time, and with a fraction of the computing power, once required.
Consider, for example, Gary King, the Weatherhead University Professor at Harvard. One of his colleagues faced a daunting mass of information and estimated that getting through it all would require a highly specialized computer, one that could cost up to $2 million.
Within two hours, King and a group of graduate students came up with a solution: using nothing more than an algorithm and a laptop, they performed the same task in just 20 minutes.
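The article doesn't say what King's algorithm actually was, so the sketch below is only an illustration of the general principle, with an invented task and made-up data: finding the records that appear in two large lists. The brute-force version rescans one list for every record; the hash-based version builds a set first, so each lookup takes constant time. The choice of method, not the hardware, is what separates the two.

```python
import random
import time

# Hypothetical example: find the values common to two large lists.
# The task and data are invented for illustration; this is not
# King's actual method, which the article does not describe.

def overlap_naive(a, b):
    # O(n*m): 'x in b' rescans the whole list b for every element of a.
    return [x for x in a if x in b]

def overlap_hashed(a, b):
    # O(n+m): build a set once, then each membership test is constant time.
    b_set = set(b)
    return [x for x in a if x in b_set]

if __name__ == "__main__":
    random.seed(0)
    a = random.sample(range(10_000_000), 10_000)
    b = random.sample(range(10_000_000), 10_000)

    t0 = time.perf_counter()
    slow = overlap_naive(a, b)
    t1 = time.perf_counter()
    fast = overlap_hashed(a, b)
    t2 = time.perf_counter()

    assert slow == fast  # both approaches return the same answer
    print(f"brute force: {t1 - t0:7.3f} s")
    print(f"hash-based:  {t2 - t1:7.3f} s")
```

On an ordinary laptop the hash-based version finishes in milliseconds while the brute-force version takes seconds, and the gap widens rapidly as the lists grow. That widening gap is exactly what can turn a job priced at a $2 million machine into a 20-minute task on commodity hardware.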
"There is a movement of quantification rumbling across fields in academia and science, industry and government and nonprofits," explains King.
While it is undoubtedly exciting how much information can fit into custom database software, the real value of working with a FileMaker developer lies in parsing and analyzing it. Without the right structure in place, a pile of data can be more confusing than enlightening. With the right support system, however, Big Data makes truly stunning feats of insight possible.