Science fiction shows us a future where artificial intelligence and automation become integrated into our human processes. Computers provide advice for our next course of action when we’re faced with decisions. The automatons offer suggestions and concerns with very specific odds of success based on the current situation.
Meanwhile, in today’s reality, we are still struggling with the automated collection of information and how to process it to deliver meaningful and insightful nuggets. Big data is a hot buzzword, but like many emerging technologies, the term means different things to different people. A general definition of big data is “the collection of large volumes of information.” The big question, though, is how we can program the analytics and heuristics so that big data yields useful information that could not be obtained through manual human effort in the same amount of time.
New technologies require new management techniques.
The IT community is adopting new technologies and architectures such as software-defined infrastructure, cloud, and virtualization. These technology models are hard to manage manually due to their increased complexity, and ownership and control in cloud environments are distributed between the business and the cloud provider.
Operationally, IT organizations need to support technologies like hypervisors and containers. Businesses expect an agile, elastic network ecosystem that can adjust to changing network and business requirements in real time.
Human processes cannot collect, analyze, and respond to these changing requirements quickly and effectively. When businesses attempt to rely on legacy manual IT operational processes, they quickly find that they lack a true understanding of the current state of the IT environment, which leads to an increase in human error. That error stems either from a lack of the analytics needed to identify the right remedy or from a failure to execute the required changes correctly.
Analytics are today’s artificial intelligence.
Today, big data solutions offer ways for businesses to analyze voluminous amounts of data in a programmatic manner based on policies and logic that are coded into the analytical engine. Organizations use the big data engines to gain insights to improve their business and its processes. They are looking for nuggets of information that they may miss by using traditional operational oversight processes. They are also looking for trends and results that can be identified earlier than through these traditional analytical models.
When big data analytical systems deliver information, there is still a manual process to implement and integrate changes to impact the business. Sometimes, the manual intervention is deliberate because businesses don’t trust the analytical systems to recommend a proper course of action. There are many science fiction stories where the supposedly neutral or benevolent analytical system decides that humans are less than ideal business partners.
In other cases, there is no connectivity between the collection of data and the operational aspects of the business. Making changes to a product distribution operation requires multiple physical and operational adjustments that are not practical to auto-implement with today’s infrastructure.
Artificial intelligence is already here.
The IT environment has an advantage: There is already a framework in place to automate the implementation of the recommended changes coming from the analytical engine. Open and robust multi-vendor and multi-technology APIs enable these analytical engines to interact with the IT environment. Changes can be made to the virtualized architecture, and the configurations of the various components can be altered based on the analytical output.
In the IT industry, there has been talk about self-healing networks and dynamic architectures. The analytical engine and its ability to alter the network ecosystem is critical to the success of these concepts. The analytical engine collects the sensory information from all of the components in the network, analyzes the data based on the business requirements, and then changes the multi-vendor, multi-technology environment to maintain and enhance the business functionality.
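The collect–analyze–change loop described above can be sketched in a few lines of code. This is a minimal illustration, not a real product API: names such as `Policy`, `evaluate`, and `apply_change` are hypothetical stand-ins for the policy definitions and the vendor API calls (e.g., REST calls to a controller) that a real analytical engine would use.

```python
# Hypothetical sketch of one pass of a self-healing control loop.
# All names here are illustrative assumptions, not a vendor API.
from dataclasses import dataclass

@dataclass
class Policy:
    """A business rule: act when a collected metric crosses a threshold."""
    metric: str
    threshold: float
    action: str  # the change to push to the environment

def evaluate(policies, metrics):
    """Analyze collected metrics and return the remediations required."""
    return [p.action for p in policies if metrics.get(p.metric, 0) > p.threshold]

def control_loop_once(policies, metrics, apply_change):
    """One pass: analyze the sensory data, then push each remediation."""
    actions = evaluate(policies, metrics)
    for action in actions:
        apply_change(action)  # in practice, a multi-vendor API call
    return actions

# Example: high observed latency triggers a hypothetical rerouting change.
policies = [Policy("latency_ms", 200.0, "reroute_traffic")]
metrics = {"latency_ms": 350.0}   # data collected from network components
applied = []
control_loop_once(policies, metrics, applied.append)
```

In a real deployment the metrics would stream in from the network components and `apply_change` would drive configuration changes through the open, multi-vendor APIs the article describes; the loop itself stays this simple, which is what makes the approach automatable.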
Using automated analytics and orchestration to create the self-healing network ecosystem reduces the potential for human error and human actions that are contrary to the business and IT goals. Ultimately, the positive or negative social perception of these self-healing and self-aware environments is dependent on how well we define and implement the policies to manage and constrain the artificially intelligent analytical engines.
Frank Yue is the Director of Application Delivery Solutions for Radware. In this role, Yue is responsible for evangelizing technologies and trends around Radware's ADC solutions and products. He writes blogs, produces solution architectures, and speaks at conferences and events around the world about application networking technologies. Prior to joining Radware, Yue was at F5 Networks, delivering their global messaging for service providers. Yue has also covered deep packet inspection, high-performance networking, and security technologies. Yue is a scuba diving instructor and background actor when he is not discussing technology.
Edited by Erik Linask