Paper Title


uBFT: Microsecond-scale BFT using Disaggregated Memory [Extended Version]

Authors

Aguilera, Marcos K., Ben-David, Naama, Guerraoui, Rachid, Murat, Antoine, Xygkis, Athanasios, Zablotchi, Igor

Abstract


We propose uBFT, the first State Machine Replication (SMR) system to achieve microsecond-scale latency in data centers, while using only $2f{+}1$ replicas to tolerate $f$ Byzantine failures. The Byzantine Fault Tolerance (BFT) provided by uBFT is essential, as pure crashes appear to be a mere illusion, with real-life systems reportedly failing in many unexpected ways. uBFT relies on a small non-tailored trusted computing base -- disaggregated memory -- and consumes a practically bounded amount of memory. uBFT is based on a novel abstraction called Consistent Tail Broadcast, which we use to prevent equivocation while bounding memory. We implement uBFT using RDMA-based disaggregated memory and obtain an end-to-end latency of as little as 10 µs. This is at least 50$\times$ faster than MinBFT, a state-of-the-art $2f{+}1$ BFT SMR based on Intel's SGX. We use uBFT to replicate two KV-stores (Memcached and Redis), as well as a financial order matching engine (Liquibook). These applications have low latency (up to 20 µs) and become Byzantine tolerant with as little as 10 µs more. The price for uBFT is a small amount of reliable disaggregated memory (less than 1 MiB), which in our prototype consists of a small number of memory servers connected through RDMA and replicated for fault tolerance.
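The $2f{+}1$ replica bound in the abstract can be illustrated with simple quorum arithmetic: with $n = 2f{+}1$ replicas, any two quorums of size $f{+}1$ intersect in at least one replica, and once equivocation is ruled out (in uBFT's case, via the trusted disaggregated memory), that single intersecting replica is enough to link the two quorums. The sketch below is illustrative arithmetic only, not code from the paper:

```python
def quorum_size(f: int) -> int:
    """Smallest quorum q among n = 2f + 1 replicas such that
    any two quorums of size q share at least one replica."""
    n = 2 * f + 1
    # Two quorums of size q overlap in at least 2q - n replicas;
    # requiring 2q - n >= 1 yields q >= f + 1.
    return f + 1


def min_overlap(f: int) -> int:
    """Minimum guaranteed intersection of two quorums of size f + 1."""
    n = 2 * f + 1
    q = quorum_size(f)
    return 2 * q - n


for f in (1, 2, 3):
    print(f"f={f}: n={2 * f + 1}, quorum={quorum_size(f)}, "
          f"overlap>={min_overlap(f)}")
```

Note that the guaranteed overlap is exactly one replica, which may itself be Byzantine; this is why classical BFT protocols need $3f{+}1$ replicas, and why uBFT's non-equivocation mechanism is what lets it get by with $2f{+}1$.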
