Advances in artificial intelligence and machine learning are transforming traditional data centers as never before. In 2017, Gartner predicted that by 2020, 30 percent of data centers that failed to apply artificial intelligence and machine learning effectively would cease to be operationally and economically viable.
What led to this prediction in the first place?
First, let’s address the all-too-common confusion and misuse of the terms AI and ML. AI is a machine-generated form of intelligence that works and reacts the way humans do. ML is a subset of AI, based on the idea that machines can learn from data, identify patterns, and improve outcomes. Simply put, AI is decision making; ML is learning new things and solving problems.
Three years ago, Gartner’s prediction seemed sensible considering the uptick in successful use cases for AI and ML spanning areas such as cooling, utilization, capacity planning, and security, to name a few.
But despite being full of ‘high-tech’ computers, software and advanced networking, data centers themselves are still designed, built, and operated very much as they have been since the early days of the first mainframe machine rooms. While data centers are generally running more “smartly”, human knowledge and labor are still needed to deliver the bulk of data center operations and keep uptime as close to 100 percent as possible.
What will help realize this prediction?
ML will continue to be adopted across data centers in the coming years. Why? For one, there is a significant amount of investment going into the development of ML systems and tools. The global ML market was valued at around $1.58 billion in 2017 and is expected to reach approximately $20.83 billion in 2024, growing at a CAGR of 44.06% between 2017 and 2024, according to Zion Market Research. More investment means more developed and sophisticated tools that can be deployed at scale.
Secondly, given the ever-increasing market pressure to deliver space, power and cooling at the lowest possible cost, data centers will need to stay competitive, and ML is a tool for improving every aspect of their operations. For example, data centers today are increasingly using machine learning to improve energy efficiency, primarily by monitoring temperatures and adjusting cooling systems.
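As a rough illustration of the temperature-driven cooling adjustment described above, the sketch below fits a least-squares line predicting rack inlet temperature from IT load, then uses the model to flag when a planned load would push temperatures past a cooling threshold. All numbers, names, and thresholds are hypothetical; production systems use far richer models and telemetry.

```python
# Minimal sketch (hypothetical data): learn a linear relationship between
# IT load and rack inlet temperature, then predict ahead of a load change.
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

loads_kw = [40, 55, 70, 85]          # observed IT load per rack row (kW)
inlet_c  = [22.0, 23.5, 25.0, 26.5]  # matching inlet temperatures (Celsius)
m, b = fit_line(loads_kw, inlet_c)

planned_load_kw = 100
predicted = m * planned_load_kw + b
print(round(predicted, 1))  # 28.0 with this toy data
if predicted > 27.0:        # hypothetical ASHRAE-style ceiling
    print("increase cooling capacity before migrating workloads")
```

The same idea scales up: replace the hand-rolled regression with a trained model over thousands of sensor streams, and the prediction step becomes a continuous control loop for the cooling plant.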
Using condition-based sensors, ML can identify anomalies in equipment function and perform incident analysis. This capability is a major asset for improving uptime, eliminating much of the guesswork and predicting failures before they happen. Seeing more case studies on uptime that impact business performance will absolutely expedite the use of ML.
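A minimal stand-in for this kind of anomaly detection is a statistical baseline learned from normal sensor readings: any new reading far outside the learned distribution is flagged before it becomes a failure. The sensor values and threshold below are invented for illustration.

```python
# Minimal sketch (hypothetical vibration sensor): flag readings more than
# three standard deviations from a baseline learned on normal operation.
from statistics import mean, stdev

def is_anomalous(reading, baseline, threshold=3.0):
    """True if `reading` deviates from the baseline mean by more than
    `threshold` standard deviations."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > threshold * sigma

# Readings collected while the equipment was known to be healthy
baseline = [0.42, 0.45, 0.44, 0.43, 0.46, 0.44, 0.45, 0.43]

print(is_anomalous(0.44, baseline))  # False: within normal range
print(is_anomalous(1.90, baseline))  # True: possible bearing wear
```

Real condition-monitoring systems layer multivariate models and incident history on top of this idea, but the core pattern is the same: learn normal behavior, then alert on deviations early enough to act.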
The factor that will ultimately drive the adoption of ML is, ironically, the human factor. Another Gartner report shows that the top barriers include the human skills gap and fear of the unknown. Without skilled professionals to implement ML strategies and execute them effectively (with the proper algorithms and supporting technologies), adopting ML is just a pipe dream.
While the 30 percent prediction has not been fully realized, the trajectory has been set and we’re on our way there. 2025 might have been a better target date!
Read Zahl’s original response to Gartner’s prediction in InsideNetworks magazine.