Paper Title

FEAMOE: Fair, Explainable and Adaptive Mixture of Experts

Authors

Shubham Sharma, Jette Henderson, Joydeep Ghosh

Abstract

Three key properties that are desired of trustworthy machine learning models deployed in high-stakes environments are fairness, explainability, and an ability to account for various kinds of "drift". While drifts in model accuracy, for example due to covariate shift, have been widely investigated, drifts in fairness metrics over time remain largely unexplored. In this paper, we propose FEAMOE, a novel "mixture-of-experts" inspired framework aimed at learning fairer, more explainable/interpretable models that can also rapidly adjust to drifts in both the accuracy and the fairness of a classifier. We illustrate our framework for three popular fairness measures and demonstrate how drift can be handled with respect to these fairness constraints. Experiments on multiple datasets show that our framework as applied to a mixture of linear experts is able to perform comparably to neural networks in terms of accuracy while producing fairer models. We then use the large-scale HMDA dataset and show that while various models trained on HMDA demonstrate drift with respect to both accuracy and fairness, FEAMOE can ably handle these drifts with respect to all the considered fairness measures and maintain model accuracy as well. We also prove that the proposed framework allows for producing fast Shapley value explanations, which makes computationally efficient feature-attribution-based explanations of model decisions readily available via FEAMOE.
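To give a rough sense of why a mixture of linear experts admits fast Shapley value explanations, here is a minimal sketch, not the authors' implementation: a softmax-gated mixture of linear experts whose per-feature attributions use the closed-form Shapley values of a linear model (φ_j = w_j(x_j − μ_j) under feature independence), gate-weighted across experts. The class and method names (MixtureOfLinearExperts, explain) and the gate-weighted combination rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class MixtureOfLinearExperts:
    """Illustrative sketch: softmax-gated mixture of linear experts.

    Each expert k scores x with w_k . x + b_k; a softmax gate mixes the
    expert scores. For a single linear model, exact Shapley values under
    feature independence are phi_j = w_j * (x_j - mu_j); explain() simply
    gate-weights those per-expert attributions (an assumed combination
    rule for illustration).
    """

    def __init__(self, n_experts, n_features, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.normal(scale=0.1, size=(n_experts, n_features))  # expert weights
        self.b = np.zeros(n_experts)                                   # expert biases
        self.V = rng.normal(scale=0.1, size=(n_experts, n_features))  # gate weights
        self.c = np.zeros(n_experts)                                   # gate biases

    def gate(self, x):
        # Softmax responsibilities over experts for input x.
        logits = self.V @ x + self.c
        e = np.exp(logits - logits.max())
        return e / e.sum()

    def predict_proba(self, x):
        # Gate-weighted combination of linear expert scores, squashed to (0, 1).
        g = self.gate(x)
        mixed = g @ (self.W @ x + self.b)
        return 1.0 / (1.0 + np.exp(-mixed))

    def explain(self, x, feature_means):
        """Fast per-feature attributions: gate-weighted linear Shapley values."""
        g = self.gate(x)
        per_expert = self.W * (x - feature_means)  # phi_kj = w_kj * (x_j - mu_j)
        return g @ per_expert                      # shape: (n_features,)


# Toy usage: attributions cost one matrix product, no sampling needed.
model = MixtureOfLinearExperts(n_experts=3, n_features=5, rng=0)
x = np.array([1.0, 0.5, -0.2, 0.0, 2.0])
mu = np.zeros(5)  # hypothetical per-feature training means
print(model.predict_proba(x), model.explain(x, mu))
```

Because each expert is linear, the attribution is closed-form and computed in a single pass, which is the sort of computational efficiency the abstract attributes to FEAMOE's explanations.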
