Paper Title

Implementing Reinforcement Learning Datacenter Congestion Control in NVIDIA NICs

Authors

Benjamin Fuhrer, Yuval Shpigelman, Chen Tessler, Shie Mannor, Gal Chechik, Eitan Zahavi, Gal Dalal

Abstract

As communication protocols evolve, datacenter network utilization increases. As a result, congestion is more frequent, causing higher latency and packet loss. Combined with the increasing complexity of workloads, manual design of congestion control (CC) algorithms becomes extremely difficult. This calls for the development of AI approaches to replace the human effort. Unfortunately, it is currently not possible to deploy AI models on network devices due to their limited computational capabilities. Here, we address this problem with a computationally light solution based on a recent reinforcement learning CC algorithm, RL-CC [arXiv:2207.02295]. We reduce the inference time of RL-CC by a factor of 500 by distilling its complex neural network into decision trees. This transformation enables real-time inference within the µ-sec decision-time requirement, with a negligible effect on quality. We deploy the transformed policy on NVIDIA NICs in a live cluster. Compared to popular CC algorithms used in production, RL-CC is the only method that performs well on all benchmarks tested over a wide range of flow counts. It balances multiple metrics simultaneously: bandwidth, latency, and packet drops. These results suggest that data-driven methods for CC are feasible, challenging the prior belief that handcrafted heuristics are necessary to achieve optimal performance.
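
The core engineering step described above is distilling the trained RL policy network into decision trees so that each congestion-control decision fits the NIC's µ-sec inference budget. The sketch below illustrates that kind of distillation via behavioral cloning with scikit-learn; the function names, state features, stand-in teacher, and tree depth are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of distilling an RL policy into a decision tree
# (hypothetical names; not the paper's code). A trained "teacher" policy
# is queried on a set of network states, and a small "student" tree is
# fit to imitate its outputs, so inference becomes a handful of
# threshold comparisons instead of a neural-network forward pass.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def distill_policy(teacher_policy, states, max_depth=10):
    """Fit a decision-tree student to the teacher's rate-adjustment outputs.

    teacher_policy: callable mapping a batch of state vectors (e.g. RTT,
                    current rate, congestion signals) to scalar actions.
    states:         array of shape (n_samples, n_features) collected from
                    simulation or live traffic.
    """
    actions = teacher_policy(states)                  # teacher labels
    student = DecisionTreeRegressor(max_depth=max_depth)
    student.fit(states, actions)                      # behavioral cloning
    return student

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in teacher: a toy function of the state, purely for illustration.
    weights = rng.normal(size=(4,))
    toy_teacher = lambda s: np.tanh(s @ weights)
    X = rng.normal(size=(10_000, 4))                  # synthetic "states"
    tree = distill_policy(toy_teacher, X)
    print("tree depth:", tree.get_depth())
```

A fitted tree of modest depth reduces each inference to a short sequence of comparisons, which is what makes a µ-sec decision budget plausible on hardware with limited compute.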
