Big Data and the Zettabyte

The industry buzzwords and focus around Big Data Management are starting to wane in the media, but should it?

Consider this: in 2015, the internet is expected to carry over a Zettabyte of data for the first time. Even more interesting, and more alarming, is that by 2020, forecasts call for upwards of 35 Zettabytes per year (!) to be carried. If Big Data were a round of golf, and we as an industry thought we were somewhere around the 2nd or 3rd hole, in reality we haven’t even purchased our clubs yet.
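As a rough back-of-the-envelope check on what that forecast implies, the sketch below takes the two figures above (about 1 Zettabyte per year in 2015, about 35 Zettabytes per year by 2020) and works out the implied compound annual growth factor; the figures are from the article, the rest is simple arithmetic.

```python
# Implied compound annual growth from ~1 ZB/yr (2015) to ~35 ZB/yr (2020).
# Figures are the article's forecast; the calculation is illustrative only.

start_zb, end_zb = 1.0, 35.0   # zettabytes carried per year
years = 2020 - 2015

# Compound growth: end = start * factor ** years, so solve for factor.
factor = (end_zb / start_zb) ** (1 / years)
print(f"implied annual growth factor: {factor:.2f}x")  # ~2.04x per year
```

In other words, the forecast implies internet traffic roughly doubling every year over that five-year span.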

For those of you not familiar with a Zettabyte, here is the breakdown:
1,000 Terabytes = 1 Petabyte
1,000 Petabytes = 1 Exabyte
1,000 Exabytes = 1 Zettabyte
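The ladder above can be sketched in a few lines of Python, using the decimal (SI) convention the article uses, where each step up is a factor of 1,000; the unit names and conversions are from the breakdown above.

```python
# Decimal (SI) byte-unit ladder from the terabyte up, as in the article.
# Each unit is 1,000x the previous one.
TERABYTE = 1000 ** 4  # bytes in one terabyte (decimal convention)

units = ["Terabyte", "Petabyte", "Exabyte", "Zettabyte"]
sizes = {name: TERABYTE * 1000 ** i for i, name in enumerate(units)}

print(sizes["Zettabyte"])                       # 10**21 bytes
print(sizes["Zettabyte"] // sizes["Terabyte"])  # a billion terabytes
```

A Zettabyte, then, is 10^21 bytes: a billion Terabytes.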

Mobile video data alone has grown by over 3,500% since 2010; statistics on data growth abound, and they all point to the explosive growth continuing. With the advent of 4G, which raised data speeds and volumes by an order of magnitude, we now see 5G appearing by 2017 in some markets, raising volumes and speeds by yet another order of magnitude. Everything in our lives is being connected: cars, appliances, our bodies (and the embedded equipment that runs them, e.g., pacemakers), every shop and street corner (video), and even the air, the water currents, and the space surrounding our planet are all feeding data in increasing volumes and depths into our collective networks. What does this mean?

It means, quite simply, that we haven’t even begun to understand what Big Data is and how it will impact us. It also means that the communications industry, and the backbone it supports, has a long way to go to be prepared as the default host for much of this data.
