The Future of Network Appliances

How to make Network Functions Virtualization work effectively so it will deliver on its promise

We live in a hyper-connected, mobility-enabled world, one in which carriers must make drastic changes to how they do business if they are to survive and thrive in the future. Accordingly, great strides have been made over the last three years to prove the viability of Network Functions Virtualization (NFV). Many Proof of Concept (PoC) trials have proven that workloads can be migrated to virtual environments running on standard hardware, and there are even examples of carrier deployments using NFV.

The next step is to determine how to make NFV work effectively so it will deliver on its promise. The issue is no longer whether a service can be deployed using NFV, but whether we can manage and secure that service in an NFV environment. In other words, the challenge now is to operationalize NFV. How can we ensure that NFV is ready for this challenge?

Network appliances are needed to manage and secure services - specifically, appliances that can monitor and analyze network behavior. A recent survey by Heavy Reading on behalf of Napatech called "The Future of Network Appliances" provides insight into how network appliances are being used today, the progress made in migrating them to virtual environments, and the challenges that must be addressed to ensure the success of this migration.

The survey found that 47 percent of respondents considered network appliances for network management and security to be essential, while a further 39 percent considered them valuable, reflecting a broad appreciation for the operational value of appliances. Survey responses also show that network management and security appliances are broadly deployed, especially for applications like network and application performance monitoring, test and measurement as well as firewalls, intrusion detection and prevention, and data loss prevention.

Even with physical appliances so entrenched, the survey revealed that progress is being made in migrating network appliances to virtual environments, especially for the most widely deployed applications. Seventy-three percent of carriers indicated that they intend to deploy virtualized appliances over the next two years. Network equipment vendors are responding, with 71 percent indicating that they intend to deliver virtualized appliances in the same time frame.

Challenges remain with regard to delivering and deploying virtualized appliances. The top three challenges respondents cited were interworking with other vendors' solutions (81 percent concerned or extremely concerned), throughput (80 percent) and security (79 percent).

A surprising finding likely presents the greatest challenge: the extensive deployment of 100G network data rates not just in the core but also in the metro and, most surprisingly of all, the access network. Survey respondents were asked to indicate the most common planned data rate for the core, metro and access networks in 2018. The responses showed that 75 percent of respondents planned for 100G as their most common data rate in the core and 71 percent planned to use 100G in the metro, while 58 percent planned to use 100G in the access network.

These plans are ambitious and necessary, but they pose the greatest challenge when it comes to virtualizing network appliances over the next three years. The first 100G physical network appliances are just now being introduced to the market. They are based on standard servers, as the majority of physical network appliances are today. However, they rely on high-performance network interface cards capable of providing the throughput required at these data rates.
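A back-of-the-envelope calculation makes clear what 100G demands of an appliance. The sketch below uses the standard 20 bytes of per-frame Ethernet wire overhead (12-byte inter-frame gap plus 8-byte preamble); the frame sizes are illustrative, not from the survey:

```python
def max_frames_per_second(link_bps: float, frame_bytes: int) -> float:
    """Theoretical line-rate frame throughput on Ethernet.

    Each frame occupies the wire for its own bytes plus 20 bytes of
    overhead: a 12-byte inter-frame gap and an 8-byte preamble.
    """
    wire_bits = (frame_bytes + 20) * 8
    return link_bps / wire_bits

for rate_gbps in (10, 100):
    for size in (64, 1518):  # minimum and maximum standard frame sizes
        fps = max_frames_per_second(rate_gbps * 1e9, size)
        print(f"{rate_gbps}G, {size}-byte frames: {fps / 1e6:.2f} Mpps")
```

With minimum-size frames, the jump from 10G to 100G means moving from roughly 14.88 million packets per second to almost 149 million - a tenfold increase that every component in the capture and analysis path must absorb.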

Even at data rates of "only" 10G, standard Network Interface Cards (NICs) cannot provide the performance required for these kinds of applications. Recent benchmark testing of NFV solutions based on standard NICs has shown serious performance challenges in using these kinds of products for high-speed applications, even when using Data Plane Development Kit (DPDK) acceleration. Solutions based on bypassing hypervisors, such as Single-Root Input/Output Virtualization (SR-IOV), provide some relief, but come at the expense of virtual function mobility and flexibility.
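The performance gap can also be framed as a per-packet CPU budget. The sketch below assumes a 3 GHz core as an illustrative figure (not a benchmark result) and uses the theoretical minimum-frame packet rates for 10G and 100G Ethernet:

```python
def cycles_per_packet(clock_hz: float, packets_per_second: float) -> float:
    """CPU cycles available to process each packet on a single core."""
    return clock_hz / packets_per_second

CLOCK_HZ = 3e9  # assumed 3 GHz core, for illustration only

# Line-rate packet rates with 64-byte frames: ~14.88 Mpps at 10G,
# ~148.8 Mpps at 100G.
for label, pps in (("10G", 14.88e6), ("100G", 148.8e6)):
    budget = cycles_per_packet(CLOCK_HZ, pps)
    print(f"{label}: ~{budget:.0f} cycles per packet per core")
```

A budget of roughly 20 cycles per packet at 100G leaves no room for a conventional kernel network-stack traversal, which is why kernel-bypass poll-mode drivers, batching and hardware offload become unavoidable at these rates.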

In order for virtualized networks to operate successfully, effective network management and security solutions must be in place. These solutions will need to work at rates of 100G by 2018. This will require ingenuity, hard work and alternative NIC solutions in NFV deployments, because virtualized network management and security applications face the same performance issues as other virtual appliances at high data rates.

Network administrators have overcome this barrier in the physical data center; it remains now to apply these solutions to the virtualized environment to create operational NFV at high data rates.

More Stories By Daniel Joseph Barry

Daniel Joseph Barry is VP Positioning and Chief Evangelist at Napatech and has over 20 years of experience in the IT and Telecom industry. Prior to joining Napatech in 2009, he was Marketing Director at TPACK, a leading supplier of transport chip solutions to the Telecom sector.

From 2001 to 2005, he was Director of Sales and Business Development at optical component vendor NKT Integration (now Ignis Photonyx) following various positions in product development, business development and product management at Ericsson. He joined Ericsson in 1995 from a position in the R&D department of Jutland Telecom (now TDC). He has an MBA and a BSc degree in Electronic Engineering from Trinity College Dublin.
