While many have tried, the term “big data” lacks a true consensus definition. At the moment, the most popular definitions coalesce around the idea that big data is one or more data sets so large and complex that they are challenging to process with traditional databases and tools. Often associated with this concept are the characteristic “three V’s” of big data: volume (amount of data), velocity (speed of data in and out), and variety (range of data types and sources). Some enterprising companies and consultants throw in a fourth “V” for veracity, or some other “V” word.
Regardless, these definitions miss a key aspect of the term. To put it into hyperbolic language, “Big Data” isn’t about the size of data at all. Instead, it is the simple yet seemingly revolutionary belief that data is valuable.
While “big data” does often happen to be large in size (although this is always relative to the available tool set), I believe that “big” actually means important (think big deal). Scientists have long known that data can create new knowledge, but now the rest of the world, government and management in particular, has realized that data can create value: principally financial, but also environmental and social value. And if data is valuable, more data is more valuable, and who doesn’t want “big” (i.e., large) value?
Thoughts? Tomatoes to throw? Take aim below in the comments section.