Title

A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy

Authors

Kaan Ozkara, Antonious M. Girgis, Deepesh Data, Suhas Diggavi

Abstract

A distinguishing characteristic of federated learning is that the (local) client data can have statistical heterogeneity. This heterogeneity has motivated the design of personalized learning, where individual (personalized) models are trained through collaboration. Various personalization methods have been proposed in the literature, with seemingly very different forms and methods, ranging from the use of a single global model for local regularization and model interpolation to the use of multiple global models for personalized clustering. In this work, we begin with a generative framework that can potentially unify several different algorithms as well as suggest new ones. We apply our generative framework to personalized estimation and connect it to the classical empirical Bayes methodology. We develop private personalized estimation under this framework. We then use our generative framework for learning, which unifies several known personalized FL algorithms and also suggests new ones; we propose and study a new algorithm, AdaPeD, based on knowledge distillation, which numerically outperforms several known algorithms. We also develop privacy for personalized learning methods, with guarantees for user-level privacy and composition. We numerically evaluate the performance as well as the privacy for both the estimation and learning problems, demonstrating the advantages of our proposed methods.
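
To make the estimation viewpoint concrete, the following is a minimal sketch of the kind of Gaussian hierarchical (generative) model an empirical Bayes treatment uses; the notation is illustrative and not necessarily the paper's.

```latex
% Hypothetical hierarchical model: each client i has a latent parameter
% \theta_i drawn from a shared population prior, and m local samples.
\theta_i \sim \mathcal{N}(\mu, \sigma_\theta^2), \qquad
x_{i,j} \mid \theta_i \sim \mathcal{N}(\theta_i, \sigma_x^2), \quad j = 1, \dots, m.
% Writing \bar{x}_i for the local sample mean, the posterior-mean
% (personalized) estimator shrinks \bar{x}_i toward the population mean \mu:
\hat{\theta}_i
  = \frac{\sigma_\theta^2}{\sigma_\theta^2 + \sigma_x^2/m}\,\bar{x}_i
  + \frac{\sigma_x^2/m}{\sigma_\theta^2 + \sigma_x^2/m}\,\mu.
% Empirical Bayes estimates the unknown population parameters
% (\mu, \sigma_\theta^2) by pooling data across clients, i.e., collaboration.
```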
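
AdaPeD is described here only as being based on knowledge distillation. As a rough illustration (not the authors' exact objective; in particular, AdaPeD adaptively tunes the coupling, whereas the weight `lam` below is a fixed, hypothetical hyperparameter), a distillation-regularized local update could look like this PyTorch-style sketch:

```python
import torch
import torch.nn.functional as F

def local_distillation_step(personal_model, global_model, batch, optimizer,
                            lam=0.5, temperature=2.0):
    """One local step coupling a personalized model to a shared global model
    via a knowledge-distillation penalty (illustrative sketch only)."""
    x, y = batch
    personal_logits = personal_model(x)
    with torch.no_grad():                  # the global model acts as teacher
        global_logits = global_model(x)

    # Standard supervised loss on the client's own data.
    task_loss = F.cross_entropy(personal_logits, y)

    # Distillation term: match the teacher's softened predictions.
    distill_loss = F.kl_div(
        F.log_softmax(personal_logits / temperature, dim=-1),
        F.softmax(global_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2

    loss = task_loss + lam * distill_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```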
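
For user-level privacy guarantees, a standard building block is to clip each client's entire model update and add Gaussian noise before it is shared, with privacy composition then accounted across rounds. The sketch below shows that generic building block under these assumptions; the paper's actual mechanism and accounting may differ.

```python
import torch

def privatize_update(update: torch.Tensor, clip_norm: float = 1.0,
                     noise_multiplier: float = 1.0) -> torch.Tensor:
    """Clip the L2 norm of a client's update and add Gaussian noise --
    the usual recipe for user-level differential privacy (illustrative)."""
    norm = torch.linalg.vector_norm(update)
    clipped = update * torch.clamp(clip_norm / (norm + 1e-12), max=1.0)
    noise = torch.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise
```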
