Though much of the coverage of artificial intelligence (AI) has been hype, the technology itself is real enough – and gaining traction in the commercial sphere. In fact, AI is increasingly viewed as an integral requirement for business IT, rather than a luxury or a fad. The research firm Gartner, Inc. predicts that more than 30 percent of data centers that fail to prepare sufficiently for AI will no longer be operationally or economically viable by 2020.
For data center operators, then, accommodating AI will become part of the job description. In this guide, we’ll look at the major issues to consider, and the technologies available to help prepare your data center for artificial intelligence workloads.
Artificial Intelligence Makes Real Physical Demands
For a data center running a typical set of enterprise applications, average power consumption is usually around 7 kilowatts (kW) per rack. But AI requires much higher processor utilization – and those processors consume a lot of energy. It’s not unusual for AI applications to draw more than 30 kW per rack.
Servers configured for artificial intelligence require greater processor density, which means more chips crammed into the server chassis. All of them run hot under AI’s sustained high utilization, creating a heavy demand for cooling. And more cooling means more power.
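The cooling impact of those rack power figures can be sketched with a back-of-the-envelope calculation: at steady state, essentially all the electrical power a rack draws must be removed as heat, and the airflow needed follows from Q = ṁ · c_p · ΔT. The 7 kW and 30 kW figures come from the text above; the 12 K intake-to-exhaust temperature rise and the air properties are illustrative assumptions.

```python
CP_AIR = 1005.0    # specific heat of air, J/(kg*K) (approx., room temperature)
RHO_AIR = 1.2      # density of air, kg/m^3 (approx., at sea level)
DELTA_T = 12.0     # assumed intake-to-exhaust temperature rise, K

def required_airflow_m3s(power_watts: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry away power_watts of heat."""
    mass_flow = power_watts / (CP_AIR * DELTA_T)   # kg/s, from Q = m*cp*dT
    return mass_flow / RHO_AIR                     # convert to m^3/s

for rack_kw in (7, 30):
    flow = required_airflow_m3s(rack_kw * 1000)
    cfm = flow * 2118.88  # 1 m^3/s is roughly 2118.88 CFM
    print(f"{rack_kw} kW rack: {flow:.2f} m^3/s (~{cfm:.0f} CFM)")
```

The airflow requirement scales linearly with rack power, so a 30 kW AI rack needs more than four times the airflow of a 7 kW enterprise rack under the same temperature rise – which is why fan-based cooling runs out of headroom.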
Adapting Your Data Center Infrastructure To The Demands Of AI
Greater hardware density and higher utilization are characteristic of artificial intelligence operations. These requirements dictate a higher density approach for your data center infrastructure, which in turn will translate into greater power demands, and an increased need for cooling and ventilation.
There are several strategies you can adopt to meet these requirements, which we shall now discuss.
Consider A Liquid Alternative To Air Cooling
Once the power usage of a rack exceeds about 15 kW, fan-based air cooling becomes impractical. But water has roughly 3,000 times the heat capacity of air by volume, which is why server cabinet makers have been plumbing liquid lines into their cabinets and piping coolant to the processors’ heat sinks, as an alternative to air fans.
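The "3,000 times" figure can be checked from textbook material properties: what matters for a coolant flowing through a fixed volume is the volumetric heat capacity, density times specific heat. The property values below are standard approximations at around 20 °C, not figures from the article.

```python
RHO_WATER, CP_WATER = 998.0, 4182.0  # kg/m^3 and J/(kg*K) for water at ~20 C
RHO_AIR, CP_AIR = 1.2, 1005.0        # kg/m^3 and J/(kg*K) for air at ~20 C

# Heat stored per cubic metre per kelvin of temperature rise
vol_heat_water = RHO_WATER * CP_WATER  # J/(m^3*K)
vol_heat_air = RHO_AIR * CP_AIR        # J/(m^3*K)

ratio = vol_heat_water / vol_heat_air
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume per kelvin")
```

The ratio works out to roughly 3,500, so a given volume of water sweeps away several thousand times the heat that the same volume of air does – consistent with the claim above.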
A lot of high performance computing (HPC) is done with liquid cooling, and water-cooled systems are a good choice for the higher density loads typically associated with artificial intelligence.
Note that if your data center is only set up for air cooling, some capital investment will be required to enable liquid cooling.
Run Less Critical AI Workloads At Lower Precision
For some AI computing operations, half precision is sufficiently accurate, allowing workloads to run at precisions low enough for existing data center infrastructure to handle comfortably. In fact, many AI workloads can run at half or quarter precision, rather than 64-bit double precision.
Double-precision floating point calculations are primarily the domain of scientific research, such as simulations at the molecular level. For training or inference on deep learning models, single- and half-precision calculations generally suffice, even in deep neural networks.
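Concretely, half precision means a 16-bit IEEE 754 float, versus 32 bits for single and 64 bits for double, so a half-precision value uses a quarter of the memory and bandwidth of a double at the cost of some accuracy. A quick sketch using only the Python standard library (struct’s 'e', 'f', and 'd' formats pack binary16, binary32, and binary64 respectively) shows the trade-off:

```python
import struct

value = 0.1  # not exactly representable in binary at any precision

for name, fmt in (("half", "e"), ("single", "f"), ("double", "d")):
    packed = struct.pack(fmt, value)              # encode at this precision
    roundtrip = struct.unpack(fmt, packed)[0]     # decode back to a Python float
    error = abs(roundtrip - value)
    print(f"{name:6s}: {len(packed)} bytes, round-trip error {error:.2e}")
```

Half precision stores 0.1 with an error of a few parts in a hundred thousand – far too coarse for molecular simulation, but often acceptable noise for neural network weights and activations.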
Segment Your Infrastructure To Cater For AI
Designers laying out new data centers now typically allocate dedicated space for workloads with higher power usage. These facilities require engineers to take into account the ability of the utility provider to increase the power supply to the data center, and to establish which portion of the facility is best equipped for higher-density operation.
Consider Distributing Workloads Around The Site
An alternative to concentrating all AI-related workloads in a particular segment of the data center is to distribute those systems around the site. Most standard artificial intelligence applications are not high density, and will typically run at eight to ten kilowatts per rack, perhaps ranging up to 15 kW. Loads like these can be handled with standard air cooling.
Use Hot Aisle Containment For Air Flows
In a typical data center hot aisle / cold aisle arrangement, the cabinets are laid out in alternating rows, so that cold air intakes face each other across one aisle, while hot air exhausts face each other across the next. This can make access difficult for an IT engineer who needs to get behind a cabinet to work on a server.
For the higher-density environments of AI, it’s better to use a hot aisle containment pod to control the air flow. For example, doors at the ends of a hot aisle, with plastic panels over the top, can direct heat into a ceiling intake duct, while the barriers keep the hot and cold air from mixing.
Consider Using A Chimney Cabinet
Instead of venting hot air from the back of the cabinet into the aisle, a chimney cabinet uses convection to send hot air up into a chimney connected to an air conditioning return vent. This effectively narrows the air pathway, enabling greater density in a cabinet than a hot aisle pod allows.
Take An Edge Approach To Data Processing
The core principle of edge computing is to locate data processing power and facilities as close as possible to the systems generating or using that information. The same philosophy can be applied when positioning data processing resources for artificial intelligence within the data center.
It can take up to 100 times more energy to move data than to process it, and that power drain grows with the volume of data moved. For data-hungry AI applications, this can have enormous implications for your budget. So putting data closer to where it’s processed makes operational and economic sense.
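The order of magnitude behind that claim can be illustrated with published per-operation energy estimates. The figures below are rough values for a circa-45 nm chip (from Mark Horowitz’s widely cited ISSCC 2014 keynote) and are assumptions for the sake of the arithmetic, not numbers from the article; the exact ratio varies by process node and memory technology.

```python
PJ = 1e-12  # one picojoule, in joules

ENERGY_FLOP_PJ = 20.0         # rough cost of one 64-bit floating-point op
ENERGY_DRAM_READ_PJ = 1300.0  # rough cost of reading one 64-bit word from DRAM

# Moving an operand in from off-chip memory dwarfs computing on it
ratio = ENERGY_DRAM_READ_PJ / ENERGY_FLOP_PJ
print(f"Fetching a 64-bit operand from DRAM costs ~{ratio:.0f}x one FLOP on it")

# Energy to stream a full terabyte out of DRAM, in joules
words = 1e12 / 8  # 64-bit words in a terabyte
joules = words * ENERGY_DRAM_READ_PJ * PJ
print(f"Moving 1 TB from DRAM: ~{joules:.1f} J")
```

Even with these conservative assumptions the data movement costs tens of times more than the computation, and the gap widens further once network hops between racks or sites are involved – which is the economic case for keeping data near its processing.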
Consider Leasing Your AI Data Center Space
As artificial intelligence applications become more widespread, a market is developing among data center operators who offer a portion of their facilities as high-density environments for AI. A flexible lease with a third-party data center provider can give enterprises greater options for growth.
Look For AI-Friendly Trends In Data Center Technology
Advances in server technology may in future make heterogeneous chip layouts the norm, mixing x86 CPUs with GPUs, FPGAs, and dedicated AI accelerators. For high-speed storage, NVMe over Fabrics and storage-class memory (SCM) technologies will likely become more affordable. Many of these advances will benefit enterprise AI application environments.
As artificial intelligence becomes more integrated into data center operations, there will also be scope for using AI to streamline and optimize the operations of the very data center that houses it. For example, with a management tool powered by predictive analytics, IT administrators can delegate most of their workload distribution responsibilities to a computer.
AI may also play a role in data center cooling. Google and DeepMind have been experimenting with using artificial intelligence to optimize cooling, through a recommendation system that suggests improvements across their network of data center operations.