Paper Title
Attention-aware Resource Allocation and QoE Analysis for Metaverse xURLLC Services
Paper Authors
Paper Abstract
The Metaverse encapsulates our expectations of the next-generation Internet, while bringing new key performance indicators (KPIs). Although conventional ultra-reliable and low-latency communications (URLLC) can satisfy objective KPIs, it struggles to provide the personalized immersive experience that is a distinctive feature of the Metaverse. Since the quality of experience (QoE) can be regarded as a comprehensive KPI, URLLC is evolving toward next-generation URLLC (xURLLC), which uses a personalized resource allocation scheme to achieve higher QoE. To deploy Metaverse xURLLC services, we study the interaction between the Metaverse service provider (MSP) and the network infrastructure provider (InP), and provide an optimal contract design framework. Specifically, the utility of the MSP, defined as a function of Metaverse users' QoE, is maximized while ensuring the incentives of the InP. To model the QoE mathematically, we propose a novel metric named Meta-Immersion that incorporates both the objective KPIs and the subjective feelings of Metaverse users. Furthermore, we develop an attention-aware rendering capacity allocation scheme to improve QoE in xURLLC. Using a user-object-attention level dataset, we validate that xURLLC can achieve an average QoE improvement of 20.1% over conventional URLLC with a uniform resource allocation scheme. The code for this paper is available at https://github.com/HongyangDu/AttentionQoE
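To illustrate the contrast the abstract draws between xURLLC and conventional URLLC, below is a minimal, hypothetical sketch of an attention-proportional allocation versus a uniform baseline. The function names, the proportional rule, and the example attention scores are assumptions for illustration only; the paper's actual scheme and utility model are not specified in the abstract.

```python
# Hypothetical sketch of attention-aware rendering capacity allocation.
# Assumption: total rendering capacity is split among virtual objects in
# proportion to the user's attention level per object, whereas conventional
# URLLC splits capacity uniformly.

def attention_aware_allocation(attention_levels, total_capacity):
    """Split capacity proportionally to per-object attention levels."""
    total_attention = sum(attention_levels)
    if total_attention == 0:
        # No attention data: fall back to a uniform split.
        return [total_capacity / len(attention_levels)] * len(attention_levels)
    return [total_capacity * a / total_attention for a in attention_levels]

def uniform_allocation(n_objects, total_capacity):
    """Baseline: uniform split, as in conventional URLLC."""
    return [total_capacity / n_objects] * n_objects

# Example: three objects, the first dominating the user's attention
# (attention scores are illustrative, e.g. derived from eye tracking).
attention = [5, 1, 2]
print(attention_aware_allocation(attention, 80.0))  # → [50.0, 10.0, 20.0]
print(uniform_allocation(3, 80.0))
```

Under this rule, objects that draw more attention receive proportionally more rendering capacity, which is the intuition behind the reported QoE gain over the uniform baseline.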