Paper Title
MGML: Multi-Granularity Multi-Level Feature Ensemble Network for Remote Sensing Scene Classification
Paper Authors
Paper Abstract
Remote sensing (RS) scene classification is a challenging task that aims to predict the scene categories of RS images. RS images have two main characteristics: large intra-class variance caused by large resolution variance, and confusing information arising from the large geographic area each image covers. To ease the negative influence of these two characteristics, we propose a Multi-Granularity Multi-Level Feature Ensemble Network (MGML-FENet) to efficiently tackle the RS scene classification task in this paper. Specifically, we propose a Multi-Granularity Multi-Level Feature Fusion Branch (MGML-FFB) to extract multi-granularity features at different levels of the network via a channel-separate feature generator (CS-FG). To avoid interference from confusing information, we propose a Multi-Granularity Multi-Level Feature Ensemble Module (MGML-FEM), which provides diverse predictions via a full-channel feature generator (FC-FG). Compared to previous methods, our proposed networks are able to exploit structural information and abundant fine-grained features. Furthermore, through an ensemble learning method, our proposed MGML-FENets obtain more convincing final predictions. Extensive classification experiments on multiple RS datasets (AID, NWPU-RESISC45, UC-Merced and VGoogle) demonstrate that our proposed networks achieve better performance than previous state-of-the-art (SOTA) networks. Visualization analysis also shows the good interpretability of MGML-FENet.
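To make the multi-level ensemble idea from the abstract concrete, below is a minimal PyTorch sketch, not the authors' released implementation: it attaches a classification head to several intermediate feature levels of a backbone and averages the per-level predictions. The class name `MultiLevelEnsembleNet`, the choice of ResNet-34, and the plain average-pooling heads are all illustrative assumptions; the paper's actual CS-FG and FC-FG modules are more elaborate.

```python
# Hypothetical sketch of multi-level prediction ensembling (not the paper's code).
import torch
import torch.nn as nn
import torchvision.models as models

class MultiLevelEnsembleNet(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        resnet = models.resnet34(weights=None)  # illustrative backbone choice
        self.stem = nn.Sequential(resnet.conv1, resnet.bn1, resnet.relu, resnet.maxpool)
        self.layer1, self.layer2 = resnet.layer1, resnet.layer2
        self.layer3, self.layer4 = resnet.layer3, resnet.layer4
        self.pool = nn.AdaptiveAvgPool2d(1)
        # One classification head per feature level -> diverse predictions
        # (stand-in for the paper's FC-FG-based MGML-FEM heads).
        self.heads = nn.ModuleList(
            nn.Linear(c, num_classes) for c in (128, 256, 512)  # ResNet-34 channel widths
        )

    def forward(self, x):
        x = self.stem(x)
        f1 = self.layer1(x)
        f2 = self.layer2(f1)   # mid-level features
        f3 = self.layer3(f2)   # higher-level features
        f4 = self.layer4(f3)   # top-level features
        logits = [
            head(self.pool(f).flatten(1))
            for head, f in zip(self.heads, (f2, f3, f4))
        ]
        # Ensemble: average the per-level predictions into the final output.
        return torch.stack(logits, dim=0).mean(dim=0)

model = MultiLevelEnsembleNet(num_classes=45)  # e.g. NWPU-RESISC45 has 45 classes
out = model(torch.randn(2, 3, 224, 224))
print(out.shape)  # torch.Size([2, 45])
```

In the paper each head additionally receives multi-granularity features (CS-FG splits channels and recombines region-level patches); the sketch only shows the level-wise ensembling, which is what makes the final prediction more robust than any single head.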