Paper Title
Teaching Neural Module Networks to Do Arithmetic
Paper Authors
Paper Abstract
Answering complex questions that require multi-step, multi-type reasoning over raw text is challenging, especially when numerical reasoning is involved. Neural Module Networks (NMNs) follow the programmer-interpreter framework and design trainable modules to learn different reasoning skills. However, NMNs have only limited reasoning abilities and lack numerical reasoning capability. We upgrade NMNs by (a) bridging the gap between their interpreter and complex questions, and (b) introducing addition and subtraction modules that perform numerical reasoning over numbers. On a subset of DROP, experimental results show that our proposed methods enhance NMNs' numerical reasoning skills with a 17.7% improvement in F1 score, significantly outperforming previous state-of-the-art models.
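To make the idea of an addition/subtraction module concrete, below is a minimal sketch, not the authors' code, of how such a module could operate in an NMN-style pipeline: upstream modules produce soft attentions over the numbers mentioned in the passage, and the arithmetic module combines their expected values so the computation stays differentiable. The function name, tensor shapes, and the expected-value combination scheme are illustrative assumptions, since the abstract does not specify the implementation.

```python
# Illustrative sketch only (assumed design, not the paper's implementation):
# an addition/subtraction module that softly combines numbers selected by
# upstream NMN modules via attention weights.
import torch

def add_sub_module(numbers: torch.Tensor,
                   attn_a: torch.Tensor,
                   attn_b: torch.Tensor,
                   subtract: bool = False) -> torch.Tensor:
    """
    numbers: (N,) numeric values extracted from the passage
    attn_a, attn_b: (N,) attention weights over those numbers,
                    produced by upstream find/filter modules
    Returns the soft (expected) result of a + b or a - b, keeping the
    whole pipeline end-to-end differentiable.
    """
    expected_a = (attn_a * numbers).sum()
    expected_b = (attn_b * numbers).sum()
    return expected_a - expected_b if subtract else expected_a + expected_b

# Usage: the passage mentions the numbers [3, 17, 24]; two upstream modules
# attend mostly to 17 and 3 respectively, so subtraction yields roughly 12.
nums = torch.tensor([3.0, 17.0, 24.0])
a = torch.tensor([0.05, 0.90, 0.05])
b = torch.tensor([0.90, 0.05, 0.05])
print(add_sub_module(nums, a, b, subtract=True))  # ~11.9
```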