Paper Title
The Fixed Sub-Center: A Better Way to Capture Data Complexity
Paper Authors
Paper Abstract
Treating a class with a single center may hardly capture the complexity of the data distribution. Using multiple sub-centers is an alternative way to address this problem. However, existing multi-subclass methods must contend with three typical issues: highly correlated sub-classes, classifier parameters that grow linearly with the number of classes, and a lack of intra-class compactness. To this end, we propose the Fixed Sub-Center (F-SC), which allows the model to create more discrepant sub-centers while saving memory and considerably cutting computational costs. Specifically, F-SC first samples a class center Ui for each class from a uniform distribution, and then generates a normal distribution for each class whose mean is equal to Ui. Finally, the sub-centers are sampled from the normal distribution corresponding to each class and kept fixed during training, avoiding the overhead of gradient computation. Moreover, F-SC penalizes the Euclidean distance between each sample and its corresponding sub-center, which helps maintain intra-class compactness. Experimental results show that F-SC significantly improves accuracy on both image classification and fine-grained recognition tasks.
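The sampling scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the U(-1, 1) range, the standard deviation `std`, and the use of the nearest sub-center in the penalty are all assumptions for the sketch.

```python
import numpy as np

def make_fixed_sub_centers(num_classes, num_sub, feat_dim, std=0.1, seed=0):
    """Sample fixed sub-centers per the F-SC scheme (hypothetical sketch)."""
    rng = np.random.default_rng(seed)
    # Step 1: sample a class center U_i for each class from a uniform distribution
    # (the range U(-1, 1) is an assumption, not specified in the abstract).
    U = rng.uniform(-1.0, 1.0, size=(num_classes, feat_dim))
    # Step 2: for each class, draw sub-centers from a normal distribution
    # with mean U_i; `std` is an assumed hyperparameter.
    sub = U[:, None, :] + std * rng.standard_normal((num_classes, num_sub, feat_dim))
    # Step 3: the sub-centers stay fixed during training (no gradient updates),
    # so they are generated once and never modified.
    return sub

def compactness_penalty(features, labels, sub_centers):
    """Euclidean-distance penalty between samples and their class sub-centers.

    Using the distance to the nearest sub-center of the sample's own class is
    an assumption of this sketch; it encourages intra-class compactness.
    """
    centers = sub_centers[labels]                              # (B, num_sub, D)
    dists = np.linalg.norm(features[:, None, :] - centers, axis=-1)  # (B, num_sub)
    return dists.min(axis=1).mean()
```

In a full pipeline this penalty would be added, with a weighting coefficient, to the classification loss; because the sub-centers carry no gradients, the extra cost is limited to the distance computation.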