When you stop to think about it, the entire IT industry is driven by the premise that Moore's law will continue to double the number of transistors per square inch every 18 months, helping us keep up with the relentless growth in data to be processed. What happens when this is no longer true?
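To put the premise in concrete terms, here is a minimal sketch (function name and parameters are my own, for illustration) of the growth factor implied by doubling every 18 months:

```python
def density_growth(years: float, doubling_period_years: float = 1.5) -> float:
    """Multiplicative growth in transistors per square inch after `years`,
    assuming density doubles every `doubling_period_years` (18 months)."""
    return 2 ** (years / doubling_period_years)

print(density_growth(1.5))   # one doubling period -> 2.0
print(density_growth(10))    # a decade -> roughly 100x density
```

A decade of this trend means roughly a hundredfold increase in density, which gives a sense of how much headroom the industry loses if the doubling stops.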
Continuing his previous blog on the history, vision and challenges of NFV, Daniel Proch now moves on to discuss the solutions required to overcome those challenges.
This article is the first of a two-part series in which Danny Proch, VP of Product Management at Napatech, outlines the history of the NFV concept and the challenges in realizing its vision, specifically in the context of 5G networks.
In the run-up to SDN World Congress in The Hague, it is interesting to note that it was at this very show, held in Dusseldorf five years ago, that the original NFV whitepaper was first presented. How time flies!
In the world of fintech, data capture has never exactly been the sexiest area of development.
But as market participants ready themselves for a new wave of trading rules, and as competition among cutting-edge trading firms shows no sign of abating, data capture has taken on much greater significance.
At Mobile World Congress recently, I was privileged to be invited to speak on a panel debating NFV. The panel, titled "NFV: a re-examination", included participants from Telefonica, InterDigital Europe, Cisco, Spirent and Red Hat.