Businesses everywhere are working to transform their existing legacy IT environments into a virtualized application delivery infrastructure through software-defined data center and cloud architectures. But while the virtualization of applications and services is an important initial step in this transformation, it adds a new level of complexity to IT operations.
To realize the cost and efficiency benefits of these virtualized architectures, businesses must properly orchestrate multiple vendors and technologies while also automating traditionally complex and manual tasks. When the individual components are orchestrated in a simple and unified manner, virtualized environments can provide application elasticity and agility, improving the operational efficiency, time to delivery, and reliability of the IT environment.
One step back, two steps forward
Virtualization at its core actually increases the cost and complexity of IT infrastructure. Whether it is open source or proprietary vendor technology, virtualization infrastructure adds cost, complexity, and a need for new expertise, on top of supporting the original application that now resides on this virtual architecture. Instead of supporting an application or network service on proprietary hardware or bare metal, IT organizations need to understand, design, and manage the virtualized infrastructure that the application actually runs upon.
For example, the deployment of a service like a firewall might initially only require the connection and configuration of a physical device in the traditional data center, but the virtualization of this process can be a complex ordeal. The IT team will need to determine the virtual resources that the firewall will consume, provision the virtual machine for this task, and then load the firewall software onto the virtual instance. In this case, operational support of the firewall would then also include the support of the virtual machine and underlying virtualized infrastructure.
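The firewall example above can be sketched as a short script. This is a minimal, illustrative sketch: the function names, resource sizes, and image name are hypothetical stand-ins for whatever hypervisor or cloud API a real deployment would call.

```python
# Hypothetical sketch of the virtual firewall deployment steps described
# above. In practice these functions would wrap hypervisor or cloud APIs.

def provision_vm(cpus: int, memory_gb: int, disk_gb: int) -> dict:
    """Stand-in for a hypervisor call that allocates a virtual machine."""
    return {"cpus": cpus, "memory_gb": memory_gb, "disk_gb": disk_gb,
            "state": "running", "software": []}

def install_software(vm: dict, package: str) -> dict:
    """Stand-in for loading a software image onto the instance."""
    vm["software"].append(package)
    return vm

def deploy_virtual_firewall() -> dict:
    # Step 1: determine and provision the virtual resources the firewall
    # will consume (sizes here are placeholder values).
    vm = provision_vm(cpus=4, memory_gb=8, disk_gb=40)
    # Step 2: load the firewall software onto the virtual instance.
    vm = install_software(vm, "firewall-image")
    return vm

vm = deploy_virtual_firewall()
print(vm["state"], vm["software"])  # running ['firewall-image']
```

Note that operational support now spans both layers: the firewall software and the virtual machine it runs on.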
Turning your network into a symphony
Effective orchestration requires the understanding of the interactions between the different elements in the virtualized IT infrastructure. When a new firewall needs to be installed, the IT team needs to coordinate the provisioning and deployment of the processing, memory, and storage within the network infrastructure. This process is coordinated with the installation of the firewall service, the configuration and deployment of appropriate security policies, and integration into the various network and security management tools.
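One way to make these interactions explicit is to model the tasks as a dependency graph and let the orchestrator compute a valid execution order. The task names below are hypothetical; real orchestration tools express the same idea as directed acyclic graphs (DAGs) of tasks.

```python
# Sketch: ordering interdependent orchestration tasks with a topological
# sort. Each task maps to the set of tasks that must complete first.
from graphlib import TopologicalSorter

tasks = {
    "provision_compute": set(),
    "provision_storage": set(),
    "install_firewall": {"provision_compute", "provision_storage"},
    "apply_security_policies": {"install_firewall"},
    "register_with_management_tools": {"install_firewall"},
}

# static_order() yields the tasks in an order that respects every
# dependency, which is exactly what an orchestration engine must do.
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

With the dependencies declared once, the orchestrator can also detect cycles or run independent tasks (such as compute and storage provisioning) in parallel.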
In a traditional symphony, a conductor manages and integrates the disparate instruments in the orchestra to produce exquisite music for the audience. She understands the role each instrument plays in the piece being performed and knows how to draw out and blend the contribution of each into the whole.
In a similar manner, the orchestration engine for the virtualized IT architecture needs to know about each component in the environment and how these components integrate with each other to deliver the applications and data based on the business goals. The orchestrator knows the value each piece brings to the application delivery infrastructure and knows when to incorporate it and how to tune it to bring maximum value to the IT environment.
Why so manual?
IT operations remain largely manual. When changes need to be made, either because new resources must be added to meet demand or a new service is being deployed, the process is very hands-on. There are runbooks, procedures, and guidelines that team members must follow when making changes to the network. Sometimes these changes are time sensitive, such as the need to increase application resources during a surge of traffic. These operational processes stay manual because IT organizations believe it is difficult, if not impossible, to design automated processes that can integrate multiple vendors, technologies, and deployment models. There are too many variables and proprietary components, the thinking goes, to develop a process that can be automated.
But, in a way, the runbooks that IT teams create are already automated processes. While humans are currently required to execute the steps within the procedures, these steps can be programmatically defined in an orchestration tool. The conductor who creates the music from the orchestra can be replaced by automation if her actions are consistent and predictable between performances.
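To see how a runbook becomes code, the steps can be written as an ordered list of (description, action) pairs that a tool executes the same way every time. This is a hedged sketch: the step names, the capacity check, and the context fields are all invented for illustration.

```python
# Hypothetical runbook encoded as an executable sequence of steps,
# mirroring the procedure a human operator would follow by hand.

def check_capacity(ctx):
    # Step 1: compare current load against the allowed maximum.
    ctx["capacity_ok"] = ctx["current_load"] < ctx["max_load"]

def add_instance(ctx):
    # Step 2: add an application instance only if capacity is exhausted.
    if not ctx["capacity_ok"]:
        ctx["instances"] += 1

def record_change(ctx):
    # Step 3: log the result, as a runbook would require.
    ctx["log"].append(f"instances={ctx['instances']}")

RUNBOOK = [
    ("Check remaining capacity", check_capacity),
    ("Scale out if capacity is exhausted", add_instance),
    ("Record the change", record_change),
]

def execute_runbook(ctx):
    for description, action in RUNBOOK:
        action(ctx)  # each step runs consistently, with no human variance
    return ctx

ctx = execute_runbook({"current_load": 95, "max_load": 80,
                       "instances": 2, "log": []})
print(ctx["instances"])  # 3
```

The same steps a human follows from the printed runbook now execute identically on every run, which is the consistency the conductor analogy demands.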
Realizing the benefits of virtualization, automation, and orchestration
Automation technology is coming to virtualized environments as vendors and technology communities determine how to connect to the orchestration systems. When automation is added to the orchestration of these virtualized environments, we gain the reliability, efficiency, and accuracy that we expect. Virtualization is only a benefit when we also incorporate the orchestration and automation of these new IT architectures.
Frank Yue is the Director of Application Delivery Solutions for Radware. In this role, Yue is responsible for evangelizing technologies and trends around Radware's ADC solutions and products. He writes blogs, produces solution architectures, and speaks at conferences and events around the world about application networking technologies. Prior to joining Radware, Yue was at F5 Networks, delivering their global messaging for service providers. Yue has also covered deep packet inspection, high performance networking, and security technologies. Yue is a scuba diving instructor and background actor when he is not discussing technology.
Edited by Alicia Young