Paper Title
Continual Test-Time Domain Adaptation
Paper Authors
Paper Abstract
Test-time domain adaptation aims to adapt a source pre-trained model to a target domain without using any source data. Existing works mainly consider the case where the target domain is static. However, real-world machine perception systems run in non-stationary and continually changing environments where the target domain distribution can change over time. Existing methods, which are mostly based on self-training and entropy regularization, can suffer in these non-stationary environments. Due to the distribution shift over time in the target domain, pseudo-labels become unreliable. The noisy pseudo-labels can further lead to error accumulation and catastrophic forgetting. To tackle these issues, we propose a continual test-time adaptation approach~(CoTTA) which comprises two parts. First, we propose to reduce error accumulation by using weight-averaged and augmentation-averaged predictions, which are often more accurate. Second, to avoid catastrophic forgetting, we propose to stochastically restore a small part of the neurons to the source pre-trained weights during each iteration to help preserve source knowledge in the long term. The proposed method enables long-term adaptation of all parameters in the network. CoTTA is easy to implement and can be readily incorporated into off-the-shelf pre-trained models. We demonstrate the effectiveness of our approach on four classification tasks and a segmentation task for continual test-time adaptation, on which we outperform existing methods. Our code is available at \url{https://qin.ee/cotta}.
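To make the two components concrete, below is a minimal PyTorch sketch of the ideas described in the abstract: a weight-averaged (EMA) teacher that produces augmentation-averaged pseudo-labels, and stochastic restoration of a small fraction of weights back to the source pre-trained values. This is not the authors' official implementation (see the linked code for that); the function names (`init_cotta`, `update_ema`, `stochastic_restore`, `adapt_step`), the `augment` callable, and the hyper-parameters `alpha`, `restore_prob`, and `n_aug` are illustrative assumptions.

```python
# Hypothetical sketch of the two CoTTA components from the abstract,
# assuming a standard PyTorch classifier that outputs logits of shape (B, C).
import copy
import torch


def init_cotta(model):
    """Keep a frozen copy of the source weights and build the weight-averaged teacher."""
    source_state = {k: v.detach().clone() for k, v in model.state_dict().items()}
    teacher = copy.deepcopy(model)
    for p in teacher.parameters():
        p.requires_grad_(False)
    return source_state, teacher


@torch.no_grad()
def update_ema(teacher, student, alpha=0.999):
    """Weight-averaged teacher: teacher <- alpha * teacher + (1 - alpha) * student."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(alpha).add_(s_param, alpha=1.0 - alpha)


@torch.no_grad()
def stochastic_restore(student, source_state, restore_prob=0.01):
    """Randomly restore a small fraction of weights to the source pre-trained values."""
    for name, param in student.named_parameters():
        mask = (torch.rand_like(param) < restore_prob).float()
        param.copy_(mask * source_state[name].to(param.device) + (1.0 - mask) * param)


def adapt_step(student, teacher, source_state, x, optimizer, augment, n_aug=4):
    """One continual test-time adaptation step on a test batch x."""
    # Augmentation-averaged pseudo-label from the weight-averaged teacher.
    with torch.no_grad():
        probs = torch.stack(
            [teacher(augment(x)).softmax(dim=1) for _ in range(n_aug)]
        )
        pseudo_label = probs.mean(dim=0)

    # Update the student with a cross-entropy consistency loss against the pseudo-label.
    loss = -(pseudo_label * student(x).log_softmax(dim=1)).sum(dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Refresh the teacher and stochastically restore source knowledge.
    update_ema(teacher, student, alpha=0.999)
    stochastic_restore(student, source_state, restore_prob=0.01)

    return pseudo_label  # predictions used for the current batch
```

In this sketch, `init_cotta` is called once before the test stream begins, and `adapt_step` is called on every incoming test batch; the EMA teacher stabilizes pseudo-labels against the shifting target distribution, while the stochastic restore keeps a fraction of the source weights intact over long adaptation horizons.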