Paper Title

Class Is Invariant to Context and Vice Versa: On Learning Invariance for Out-Of-Distribution Generalization

Paper Authors

Jiaxin Qi, Kaihua Tang, Qianru Sun, Xian-Sheng Hua, Hanwang Zhang

Abstract

Out-Of-Distribution generalization (OOD) is all about learning invariance against environmental changes. If the context in every class is evenly distributed, OOD would be trivial because the context can be easily removed due to an underlying principle: class is invariant to context. However, collecting such a balanced dataset is impractical. Learning on imbalanced data biases the model toward context and thus hurts OOD. Therefore, the key to OOD is context balance. We argue that the assumption widely adopted in prior work, that the context bias can be directly annotated or estimated from biased class predictions, renders the context incomplete or even incorrect. In contrast, we point out the ever-overlooked other side of the above principle: context is also invariant to class, which motivates us to consider the classes (which are already labeled) as the varying environments to resolve context bias (without context labels). We implement this idea by minimizing the contrastive loss of intra-class sample similarity while ensuring that this similarity is invariant across all classes. On benchmarks with various context biases and domain gaps, we show that a simple re-weighting-based classifier equipped with our context estimation achieves state-of-the-art performance. We provide the theoretical justifications in the Appendix and code at https://github.com/simpleshinobu/IRMCon.
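To make the core idea of the abstract concrete, below is a minimal PyTorch sketch, not the authors' IRMCon implementation (see the linked repository for that). It treats each class as one "environment", computes an intra-class contrastive loss per class, and adds an IRMv1-style gradient penalty (Arjovsky et al.) so that the contrastive similarity behaves the same across classes. The function names, the positive-pair choice, and all hyperparameters here are assumptions for illustration only.

```python
# Minimal sketch (assumption-laden, NOT the authors' IRMCon code): treat each
# class as an "environment", compute an intra-class contrastive loss per class,
# and add an IRMv1-style gradient penalty so the context representation stays
# invariant across classes.
import torch
import torch.nn.functional as F

def intra_class_contrastive_loss(features, scale, temperature=0.1):
    # features: (n, d) L2-normalized context features from ONE class.
    # The positive for sample i is assumed to be sample (i + 1) % n.
    n = features.size(0)
    logits = (features @ features.t()) * scale / temperature            # (n, n) similarities
    logits = logits.masked_fill(torch.eye(n, dtype=torch.bool),
                                float('-inf'))                          # exclude self-pairs
    targets = (torch.arange(n) + 1) % n
    return F.cross_entropy(logits, targets)

def context_invariance_loss(features_per_class, lam=1.0):
    # IRMv1-style penalty: the gradient of each per-class loss w.r.t. a dummy
    # scale of 1.0 should be small, i.e. the same contrastive solution is
    # (near-)optimal in every class-environment.
    scale = torch.ones(1, requires_grad=True)
    losses = [intra_class_contrastive_loss(f, scale) for f in features_per_class]
    erm = torch.stack(losses).mean()
    grads = [torch.autograd.grad(l, scale, create_graph=True)[0] for l in losses]
    penalty = torch.stack([g.pow(2).sum() for g in grads]).mean()
    return erm + lam * penalty

# Usage sketch: one feature tensor per class, e.g. from a context encoder.
feats = [F.normalize(torch.randn(8, 16), dim=1) for _ in range(3)]
loss = context_invariance_loss(feats)
loss.backward()  # toy features have no encoder; in practice, gradients flow into the context encoder
```

The context features estimated this way would then be used to re-weight a standard classifier, as the abstract describes; that re-weighting step is omitted here.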
