Paper Title

Adaptive Batch Normalization for Training Data with Heterogeneous Features

Authors

Wael Alsobhi, Tarik Alafif, Alaa Abdel-Hakim, Weiwei Zong

Abstract

Batch Normalization (BN) is an important preprocessing step in many deep learning applications. Since it is a data-dependent process, it can be redundant, or even performance-degrading, for some homogeneous datasets. In this paper, we propose an early-stage feasibility assessment method for estimating the benefit of applying BN to given data batches. The proposed method uses a novel threshold-based approach to classify training data batches into two sets according to their need for normalization. The need for normalization is determined by the feature heterogeneity of the batch under consideration. The proposed approach is a pre-training process, so it adds no training overhead. The evaluation results show that the proposed approach achieves better performance than traditional BN, mostly at small batch sizes, on the MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 datasets. Additionally, network stability is increased by reducing the occurrence of internal variable transformation.
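
The abstract does not specify the exact heterogeneity measure or threshold rule, so the following is only a minimal sketch of the general idea, assuming for illustration that a batch's heterogeneity is scored from the spread of its per-feature statistics and that a single scalar threshold `tau` decides whether BN-style standardization is applied. The function names, the variance-based score, and the threshold value are hypothetical placeholders, not the authors' formulation.

```python
import numpy as np

def heterogeneity_score(batch: np.ndarray) -> float:
    # Hypothetical measure: how widely the per-feature means and variances
    # spread across the features of this batch (shape: [batch_size, features]).
    feature_means = batch.mean(axis=0)
    feature_vars = batch.var(axis=0)
    return float(feature_means.std() + feature_vars.std())

def maybe_normalize(batch: np.ndarray, tau: float = 1.0, eps: float = 1e-5) -> np.ndarray:
    # Pre-training decision: standardize the batch only when its
    # heterogeneity score exceeds the threshold tau; otherwise skip BN.
    if heterogeneity_score(batch) < tau:
        return batch
    return (batch - batch.mean(axis=0)) / np.sqrt(batch.var(axis=0) + eps)

# Usage: a batch with widely differing feature scales gets normalized,
# while a homogeneous batch is passed through unchanged.
rng = np.random.default_rng(0)
hetero = rng.normal(loc=[0.0, 50.0, -10.0], scale=[1.0, 20.0, 5.0], size=(32, 3))
homo = rng.normal(loc=0.0, scale=1.0, size=(32, 3))
print(maybe_normalize(hetero).std(axis=0))       # roughly 1.0 per feature
print(np.allclose(maybe_normalize(homo), homo))  # True: left untouched
```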
