Paper Title

LR-Net: A Block-based Convolutional Neural Network for Low-Resolution Image Classification

Paper Authors

Ashkan Ganj, Mohsen Ebadpour, Mahdi Darvish, Hamid Bahador

Paper Abstract

The success of CNN-based architectures at learning and extracting features for image classification has made them very popular, but the task becomes more challenging when state-of-the-art models are applied to noisy, low-quality images. Models still struggle to extract meaningful features from such images because of their low resolution and the lack of meaningful global features. Moreover, high-resolution images need more layers to train, which means they require more time and computational power. Our method also addresses the vanishing-gradient problem that arises as deep neural networks grow deeper. To address all of these issues, we developed a novel image classification architecture composed of blocks designed to learn both low-level and global features from blurred and noisy low-resolution images. The design of the blocks was heavily influenced by Residual Connections and Inception modules in order to increase performance and reduce the number of parameters. We also evaluate our work on the MNIST-family datasets, with a particular emphasis on the Oracle-MNIST dataset, which is the most difficult to classify due to its low-quality and noisy images. We have performed in-depth tests demonstrating that the presented architecture is faster and more accurate than existing cutting-edge convolutional neural networks. Furthermore, due to the unique properties of our model, it produces better results with fewer parameters.
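To make the block idea concrete, below is a minimal, hypothetical sketch in PyTorch of a block that combines the two ingredients the abstract names: parallel Inception-style convolutions at several receptive-field sizes and a residual shortcut. The `InceptionResidualBlock` name, the branch widths, and the layer ordering are assumptions for illustration only; this is not the authors' actual LR-Net block.

```python
import torch
import torch.nn as nn


class InceptionResidualBlock(nn.Module):
    """Parallel multi-scale convolutions with a residual shortcut (illustrative only)."""

    def __init__(self, in_channels: int, branch_channels: int = 16):
        super().__init__()
        # Parallel branches with different receptive fields (the Inception idea).
        self.branch1 = nn.Conv2d(in_channels, branch_channels, kernel_size=1)
        self.branch3 = nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_channels, branch_channels, kernel_size=5, padding=2)
        out_channels = 3 * branch_channels
        # 1x1 projection so the skip connection matches the concatenated width.
        self.project = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Concatenate the multi-scale features along the channel dimension.
        out = torch.cat([self.branch1(x), self.branch3(x), self.branch5(x)], dim=1)
        out = self.bn(out)
        # Residual connection: add the projected input back before the activation.
        return self.act(out + self.project(x))


if __name__ == "__main__":
    # Example: a 28x28 single-channel input, as in the MNIST-family datasets.
    block = InceptionResidualBlock(in_channels=1)
    x = torch.randn(8, 1, 28, 28)
    print(block(x).shape)  # torch.Size([8, 48, 28, 28])
```

In such a design, the parallel branches capture features at multiple scales from small, low-resolution inputs, while the shortcut path helps gradients flow through deeper stacks of blocks, which is the combination of benefits the abstract attributes to residual connections and Inception modules.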
