While many have tried, the term “big data” lacks a true consensus definition. At the moment, the most popular definitions coalesce around the idea that big data is one or more data sets so large and complex that they are challenging to process with traditional databases and tools. Often associated with this concept are the characteristic “three V’s” of big data: volume (the amount of data), velocity (the speed of data in and out), and variety (the range of data types and sources). Some enterprising companies and consultants throw in a fourth “V” for veracity, or some other “V” word.
Regardless, these definitions miss a key aspect of the term. To put it into hyperbolic language, “Big Data” isn’t about the size of data at all. Instead, it is the simple yet seemingly revolutionary belief that data is valuable.
While “big data” does often happen to be large in size (although this is always relative to the available tool set), I believe that “big” actually means important (think “big deal”). Scientists have long known that data could create new knowledge, but now the rest of the world, government and management in particular, has realized that data can create value: principally financial, but also environmental and social value. And if data is valuable, then more data is more valuable, and who doesn’t want “big” (i.e., large) value?
Thoughts? Tomatoes to throw? Take aim below in the comments section.