Artificial intelligence (AI) seems to be the new kid on the block. Everybody is talking about it, and there is a flood of IT products that feature some variety of AI, from machine learning (ML) algorithms to neural networks and deep learning. In this situation, it is tempting to think of AI as just more hype, a fad that will go away in a few months. For various reasons, though, AI is here to stay.
First of all, it’s important to remember that AI isn’t actually new: the technology dates back to the 1950s, when Arthur Samuel’s checkers-playing program learned to develop its own board game strategy. The 1980s saw a wave of expert systems designed to support human professionals in various fields, while in the 1990s and 2000s, AI-based systems were used for business-related data mining and medical research, among other things.
The current peak of interest in AI stems from two recent developments. First, compute power and storage have become incredibly inexpensive. In the 1950s, for example, a storage unit that held 3.75 MB of data – enough space for just one present-day low-resolution photograph – was so large it had to be moved by forklift. Today, a USB stick that fits in your pocket can store thousands of high-resolution pictures – no forklift required. At the same time, cloud service providers, especially the hyperscalers, have made seemingly limitless compute power and storage capacity available for always-on private and commercial use.
The second important trend is IoT. A huge variety of devices – from mobile gadgets to factories and facilities – are now equipped with sensor technology, often even with multiple sensors. These sensors generate a steadily increasing flood of data that needs to be processed, analyzed, and acted upon.
The interplay of these data feeds has become so complex that the consequences tend to escape the human eye. For example, analysing sporadic variations in a machine’s behaviour might reveal that maintenance will soon be required – a monitoring approach known as ‘predictive maintenance’. Today, these kinds of ‘needle in a haystack’ discoveries can be made much faster and more accurately by modern AI-based technology than by humans.
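To make the predictive maintenance idea concrete, here is a minimal sketch of one common approach: flagging sensor readings that deviate strongly from a machine’s normal range using a simple z-score test. The vibration values and the threshold are illustrative, not taken from any real system – production tools use far more sophisticated models.

```python
# Minimal sketch: flag sporadic deviations in a machine's sensor readings
# using a z-score test. Readings and threshold are purely illustrative.
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings that deviate strongly from the mean."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []  # perfectly flat signal: nothing to flag
    return [i for i, x in enumerate(readings)
            if abs(x - mu) / sigma > threshold]

# Hypothetical vibration readings with one suspicious spike
vibration = [0.50, 0.52, 0.49, 0.51, 0.50, 2.80, 0.50, 0.52]
print(flag_anomalies(vibration))  # → [5]
```

The spike at index 5 is exactly the kind of sporadic variation that a human scanning thousands of readings per day would likely miss, but that an automated analysis catches routinely.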
Compute power and storage will continue to become cheaper and more powerful. At the same time, the need for analysis of complex data relations will escalate. This is why AI will become increasingly entrenched in IT and IoT management.
In this context, AI isn’t just for academic research or business trend analysis anymore. For example, it can be used for managing and securing digital workspaces, as it can detect suspicious deviations from users’ normal behaviour. Specialised, AI-driven software will alert security teams as soon as, for example, users suddenly start to download files from a server they have never accessed before and for which they hold no access rights. This kind of behaviour – potentially a sign of a compromised end-user account – can be incredibly hard to detect by manually screening log files, yet it is a routine task for AI-based security analytics software.
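The core of that detection rule can be sketched in a few lines: compare each access event against a per-user baseline of servers seen before, and flag anything outside it. The usernames, server names, and log entries below are hypothetical; real security analytics products build these baselines statistically from months of telemetry rather than from a hand-written dictionary.

```python
# Minimal sketch: flag file-server accesses outside a user's historical
# baseline. All names and log entries are hypothetical examples.

baseline = {
    "alice": {"fileserv01", "fileserv02"},
    "bob": {"fileserv02"},
}

access_log = [
    ("alice", "fileserv01"),
    ("bob", "fileserv02"),
    ("bob", "hr-finance01"),  # never accessed before -> suspicious
]

def flag_unusual_access(log, baseline):
    """Return (user, server) pairs not present in the user's baseline."""
    return [(user, server) for user, server in log
            if server not in baseline.get(user, set())]

print(flag_unusual_access(access_log, baseline))  # → [('bob', 'hr-finance01')]
```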
In a similar way, AI will soon assist in improving the digital workspace user experience by correlating performance indicators across the chain of apps, services, and network connections that end users need for their daily work. Just like the predictive maintenance scenario in a factory, AI will soon correlate and analyze all components of a digital workspace, informing IT staff about looming quality-of-service degradation.
New software solutions and cloud services will soon make AI a commodity – something that is integrated into all kinds of products and services, in the consumer market as well as in the enterprise. Yet for the foreseeable future, AI will not be able to replace humans in IT management and security, as it lacks human intuition. When a mother scolds a child saying, “If all your friends jumped off a cliff, would you jump, too?”, the child knows the answer is supposed to be “No”. An artificial intelligence would answer “Yes!” – for if everybody does it, it must be alright. However, AI is very good at finding the proverbial needle in the haystack – at repetitive, extensive data analysis, a task that is very hard for humans. So, despite these limitations, AI-based IT management tools provide a big leap forward in making digital workspaces and cloud environments more secure, efficient, and reliable for end users – while saving the IT team time and money.
By Brendan McAravey, Country Manager at Citrix South Africa