Paper Title

Pattern Recognition Scheme for Large-Scale Cloud Detection over Landmarks

Authors

Adrián Pérez-Suay, Julia Amorós-López, Luis Gómez-Chova, Jordi Muñoz-Marí, Dieter Just, Gustau Camps-Valls

Abstract

Landmark recognition and matching is a critical step in many Image Navigation and Registration (INR) models for geostationary satellite services, as well as to maintain the geometric quality assessment (GQA) in the instrument data processing chain of Earth observation satellites. Matching the landmark accurately is of paramount relevance, and the process can be strongly impacted by the cloud contamination of a given landmark. This paper introduces a complete pattern recognition methodology able to detect the presence of clouds over landmarks using Meteosat Second Generation (MSG) data. The methodology is based on the ensemble combination of dedicated support vector machines (SVMs) dependent on the particular landmark and illumination conditions. This divide-and-conquer strategy is motivated by the data complexity and follows a physically-based strategy that considers variability both in seasonality and in illumination conditions along the day to split observations. In addition, it allows training the classification scheme with millions of samples at an affordable computational cost. The image archive was composed of 200 landmark test sites with nearly 7 million multispectral images that correspond to MSG acquisitions during 2010. Results are analyzed in terms of cloud detection accuracy and computational cost. We provide illustrative source code and a portion of the large training data to the community.
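The divide-and-conquer ensemble described above can be illustrated with a short sketch: one dedicated SVM is trained per (landmark, illumination) partition, and each test sample is routed to the model matching its partition. This is not the authors' released code; the binning by solar zenith angle, the thresholds, and all variable names (`X`, `y`, `landmark_ids`, `sun_zenith_deg`) are assumptions made for illustration, using scikit-learn's `SVC`.

```python
# Minimal sketch of a per-landmark, per-illumination SVM ensemble for cloud detection.
# Assumptions (not from the paper): illumination is binned by solar zenith angle,
# and features are plain NumPy arrays of multispectral values per landmark chip.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def illumination_bin(sun_zenith_deg):
    """Map the solar zenith angle to a coarse illumination regime (assumed binning)."""
    if sun_zenith_deg < 70.0:
        return "day"
    if sun_zenith_deg < 90.0:
        return "twilight"
    return "night"


class LandmarkCloudEnsemble:
    """One SVM per (landmark, illumination bin); predicts cloudy (1) vs. clear (0)."""

    def __init__(self, **svm_kwargs):
        self.svm_kwargs = svm_kwargs
        self.models = {}

    def fit(self, X, y, landmark_ids, sun_zenith_deg):
        X, y = np.asarray(X), np.asarray(y)
        keys = [(lid, illumination_bin(sza))
                for lid, sza in zip(landmark_ids, sun_zenith_deg)]
        for key in set(keys):
            mask = np.array([k == key for k in keys])
            model = make_pipeline(StandardScaler(), SVC(**self.svm_kwargs))
            model.fit(X[mask], y[mask])  # each SVM only sees its own partition
            self.models[key] = model
        return self

    def predict(self, X, landmark_ids, sun_zenith_deg):
        # Assumes every (landmark, illumination) key was seen during training.
        X = np.asarray(X)
        y_pred = np.empty(len(X), dtype=int)
        for i, (lid, sza) in enumerate(zip(landmark_ids, sun_zenith_deg)):
            key = (lid, illumination_bin(sza))
            y_pred[i] = self.models[key].predict(X[i:i + 1])[0]
        return y_pred
```

The appeal of this split, as the abstract notes, is computational: each SVM is fit on only its own partition, so no single kernel matrix has to cover the full multi-million-sample archive, which is what keeps training affordable.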
