Paper Title

SpaBERT: A Pretrained Language Model from Geographic Data for Geo-Entity Representation

Paper Authors

Zekun Li, Jina Kim, Yao-Yi Chiang, Muhao Chen

Paper Abstract

Named geographic entities (geo-entities for short) are the building blocks of many geographic datasets. Characterizing geo-entities is integral to various application domains, such as geo-intelligence and map comprehension, while a key challenge is to capture the spatial-varying context of an entity. We hypothesize that we shall know the characteristics of a geo-entity by its surrounding entities, similar to knowing word meanings by their linguistic context. Accordingly, we propose a novel spatial language model, SpaBERT, which provides a general-purpose geo-entity representation based on neighboring entities in geospatial data. SpaBERT extends BERT to capture linearized spatial context, while incorporating a spatial coordinate embedding mechanism to preserve spatial relations of entities in the 2-dimensional space. SpaBERT is pretrained with masked language modeling and masked entity prediction tasks to learn spatial dependencies. We apply SpaBERT to two downstream tasks: geo-entity typing and geo-entity linking. Compared with the existing language models that do not use spatial context, SpaBERT shows significant performance improvement on both tasks. We also analyze the entity representation from SpaBERT in various settings and the effect of spatial coordinate embedding.
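The abstract describes linearizing a geo-entity's spatial context (its neighboring entities) into a token sequence and attaching spatial coordinate information so that 2-D spatial relations are preserved. The sketch below illustrates one plausible way to do that preprocessing step: neighbors are ordered by distance to the center entity, and each token receives a discretized distance id that a coordinate-embedding layer could later consume alongside standard position embeddings. This is a minimal illustration, not the authors' implementation; the function name, the distance-binning scheme, and the bin width are assumptions.

```python
# Minimal sketch (NOT the authors' implementation) of linearizing a geo-entity's
# spatial context: order neighbors by distance and attach a discretized distance
# id per token. The discretization scheme and names here are illustrative assumptions.

import math
from typing import List, Tuple


def linearize_spatial_context(
    center: Tuple[str, float, float],
    neighbors: List[Tuple[str, float, float]],
    dist_bin_size: float = 100.0,  # assumed bin width (e.g., meters) for distance ids
) -> Tuple[List[str], List[int]]:
    """Return a pseudo-sentence (tokens) plus per-token distance-bin ids,
    with the center entity first and neighbors sorted by increasing distance."""
    cname, cx, cy = center

    def dist(entity: Tuple[str, float, float]) -> float:
        _, x, y = entity
        return math.hypot(x - cx, y - cy)

    ordered = sorted(neighbors, key=dist)

    tokens: List[str] = []
    dist_ids: List[int] = []

    # The center entity itself gets distance bin 0.
    for word in cname.split():
        tokens.append(word)
        dist_ids.append(0)

    # Each neighbor's tokens share one distance bin derived from its distance.
    for entity in ordered:
        name, _, _ = entity
        bin_id = int(dist(entity) // dist_bin_size) + 1
        for word in name.split():
            tokens.append(word)
            dist_ids.append(bin_id)

    return tokens, dist_ids


if __name__ == "__main__":
    center = ("Union Station", 0.0, 0.0)
    neighbors = [
        ("City Hall", 350.0, 120.0),
        ("Main Street Bridge", 80.0, -40.0),
        ("Central Library", 900.0, 500.0),
    ]
    toks, dists = linearize_spatial_context(center, neighbors)
    for t, d in zip(toks, dists):
        print(f"{t:>10s}  dist_bin={d}")
```

In a SpaBERT-style setup, such per-token distance ids could be embedded and summed with the token embeddings, analogous to how BERT adds position embeddings, so the model retains each neighbor's spatial relation to the center entity even after linearization.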
