
Learning number reasoning for numerical table-to-text generation

Research output: Contribution to journal › Article › peer-review

Abstract

Although existing numerical table-to-text generation models have achieved remarkable progress, generating an accurate analysis of the input table remains underexplored. Most existing table-to-text generation algorithms simply copy table records into the output and ignore reasoning or calculation over those records. A key step toward this ability is number reasoning, i.e., performing logical reasoning over the numbers in table records. In this paper, we attempt to improve the number reasoning capability of neural table-to-text generation by generating additional mathematical equations from numerical table records. We propose a neural architecture, the Neural Table Reasoning Generator (NTRG), which adds a switching gate and a specifically designed equation decoder to generate mathematical equations adaptively. We also present a pre-training strategy for NTRG similar to the masked language model. Empirical results show that NTRG achieves new state-of-the-art results on ROTOWIRE. Furthermore, to give a quantitative evaluation of number reasoning ability, we construct a sentence-level number reasoning dataset. Results demonstrate the superiority of our approaches over strong baselines.
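The switching-gate idea described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical dimensions and randomly initialized weights, not the paper's actual NTRG implementation: a sigmoid gate decides, at each decoding step, how much probability mass goes to the ordinary word decoder versus the equation decoder, and the final output distribution is their mixture over the joint vocabulary.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a logit vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def switch_gate_step(h, w_gate, W_word, W_eq):
    """One decoding step with a switching gate (illustrative sketch).

    h       : decoder hidden state (hypothetical size)
    w_gate  : gate weights producing p_eq = P(emit equation token)
    W_word  : projection to the word vocabulary
    W_eq    : projection to the equation-symbol vocabulary
    """
    p_eq = 1.0 / (1.0 + np.exp(-np.dot(w_gate, h)))  # sigmoid gate
    p_word_vocab = softmax(W_word @ h)  # distribution over word tokens
    p_eq_vocab = softmax(W_eq @ h)      # distribution over equation symbols
    # Mixture distribution over the concatenated vocabularies;
    # it sums to 1 because the two parts are weighted by (1 - p_eq) and p_eq.
    return np.concatenate([(1 - p_eq) * p_word_vocab, p_eq * p_eq_vocab])

rng = np.random.default_rng(0)
h = rng.standard_normal(8)
dist = switch_gate_step(h,
                        rng.standard_normal(8),        # gate weights
                        rng.standard_normal((10, 8)),  # 10 word tokens
                        rng.standard_normal((5, 8)))   # 5 equation symbols
assert abs(dist.sum() - 1.0) < 1e-9
```

At training time, such a gate is typically supervised (or marginalized over) so the model learns when to switch into the equation decoder; the sketch above shows only the forward mixing step.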

Original language: English
Pages (from-to): 2269-2280
Number of pages: 12
Journal: International Journal of Machine Learning and Cybernetics
Volume: 12
Issue number: 8
DOIs
State: Published - Aug 2021

Keywords

  • Data-to-text
  • Natural language generation
  • Number reasoning
  • Table modeling
