Pokémon Go has taken the world by storm. Nintendo and its partners have done it again. Just like the Nintendo Wii changed video game players from couch potatoes into physically active game participants...
In the recent market survey we performed with Heavy Reading called “The Future of Network Appliances,” we were surprised to see the rapid growth in 100G adoption, not only in core and metro networks, but also in access networks. See our infographic The Future of Network Appliances for a visual representation of the findings.
At first glance, one would expect this to be welcome news for all of us who sell hardware solutions, including network management and security appliance vendors. But it is not without its challenges, chief among them analyzing hundreds of gigabits of data in real time.
For some network management and security applications, it is possible to analyze this volume of data in real time by intelligently load-balancing traffic across multiple CPU cores, combined with filtering, slicing, and de-duplication. But for applications that need to perform deep analysis of every packet and flow, even these techniques might not be enough.
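To make the load-balancing and de-duplication ideas concrete, here is a minimal sketch in Python. The `Packet` record, the core count, and the hashing scheme are all illustrative assumptions, not any vendor's API: flow-aware distribution hashes the 5-tuple so every packet of a flow lands on the same core, and de-duplication fingerprints whole packets to drop copies seen on multiple taps.

```python
import hashlib
from collections import namedtuple

# Hypothetical packet record: 5-tuple plus raw payload bytes (illustrative only).
Packet = namedtuple("Packet", "src_ip dst_ip src_port dst_port proto payload")

NUM_CORES = 4  # assumed number of worker CPU cores

def core_for(pkt):
    """Flow-aware load balancing: hash the 5-tuple so that every
    packet belonging to the same flow is steered to the same core."""
    key = f"{pkt.src_ip}:{pkt.src_port}-{pkt.dst_ip}:{pkt.dst_port}/{pkt.proto}"
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_CORES

def deduplicate(packets):
    """Drop exact duplicates (e.g. the same packet captured on two
    taps) by fingerprinting the full packet contents."""
    seen = set()
    for pkt in packets:
        fingerprint = hashlib.sha256(repr(pkt).encode()).hexdigest()
        if fingerprint not in seen:
            seen.add(fingerprint)
            yield pkt
```

A real appliance would do this in hardware or on an SmartNIC at line rate; the point of the sketch is only the partitioning logic: per-flow hashing keeps flow state local to one core, and de-duplication shrinks the workload before analysis begins.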
As we enter the era of 100G, we need to step back, rethink how we are analyzing data, and ask some fundamental questions. Do we need to analyze all of the data? Do we need to analyze it all the time? Do we need to analyze it in real time?
What we are learning from discussions with our customers is that real-time analysis is not always needed. For some applications, near-real-time analysis, or even post-capture analysis, is sufficient.
In addition, not all data needs to be analyzed. Most applications only need to see parts of the data or specific flows. So we are beginning to see a range of analysis needs, spanning from real-time, always-on, full-packet analysis to on-demand post-analysis of specific data.
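The idea of delivering only specific flows can be sketched as a simple subscription filter. Everything here is a hypothetical illustration (the rule format, class name, and packet shape are invented for the example): an analyst subscribes to a subnet and port, and only matching traffic is forwarded to the analysis tool.

```python
import ipaddress

class FlowSelector:
    """A toy model of on-demand data selection: forward only the
    flows an analyst has subscribed to; drop (or record) the rest."""

    def __init__(self):
        self.rules = []  # list of (network, destination port) subscriptions

    def subscribe(self, cidr, port):
        """Register interest in traffic from a subnet to a given port."""
        self.rules.append((ipaddress.ip_network(cidr), port))

    def matches(self, src_ip, dst_port):
        addr = ipaddress.ip_address(src_ip)
        return any(addr in net and dst_port == port for net, port in self.rules)

    def select(self, packets):
        """Yield only the (src_ip, dst_port, payload) tuples that
        match a subscription."""
        for src_ip, dst_port, payload in packets:
            if self.matches(src_ip, dst_port):
                yield (src_ip, dst_port, payload)
```

In practice this selection would be expressed as filters pushed down into capture hardware rather than evaluated per packet in software, but the principle is the same: the analysis tool receives only the data it asked for.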
While real-time network appliances capable of handling 100G of data will be needed in the future, there will also be a need for solutions that can deliver only the data that needs to be analyzed, when and where it is needed. We call these Smarter Data Delivery solutions.
A Smarter Data Delivery solution portfolio should offer the ability to:
- Build real-time network analysis appliances with the capacity to analyze data at full line rate, under all conditions, without losing any data
- Record network data so that it can be analyzed in near real time or on demand
- Select the data to be analyzed, whether it feeds a real-time appliance or a network recording solution
So, as we move into the 100G era, we need to broaden our understanding of what it means to analyze data and embrace Smarter Data Delivery solutions to stay ahead of the data growth curve.