Transition-based neural RST parsing with implicit syntax features

  • Nan Yu
  • Meishan Zhang
  • Guohong Fu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Syntax has been a useful source of information for statistical RST discourse parsing. In the neural setting, a common approach integrates syntax via a recursive neural network (RNN), which requires discrete output trees produced by a supervised syntax parser. In this paper, we propose an implicit syntax feature extraction approach that uses hidden-layer vectors extracted from a neural syntax parser. In addition, we propose a simple transition-based model as the baseline and further enhance it with a dynamic oracle. Experiments on the standard dataset show that our baseline model with the dynamic oracle is highly competitive. When implicit syntax features are integrated, we obtain further improvements, outperforming the explicit Tree-RNN approach.
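The idea of implicit syntax features can be illustrated with a minimal sketch (not the authors' implementation): hidden-layer vectors from a neural syntax parser's encoder are taken as dense features and fed to the discourse parser, instead of discrete output trees. The `SyntaxEncoder` class, its BiLSTM stand-in for the parser encoder, and all dimensions below are hypothetical choices for illustration.

```python
# Minimal sketch, assuming a BiLSTM encoder stands in for a pretrained
# neural syntax parser; its hidden states serve as implicit syntax features.
import torch
import torch.nn as nn

class SyntaxEncoder(nn.Module):
    """Hypothetical stand-in for the encoder of a neural syntax parser."""
    def __init__(self, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              bidirectional=True, batch_first=True)

    def forward(self, word_embs):
        # word_embs: (batch, seq_len, emb_dim)
        hidden, _ = self.bilstm(word_embs)
        # (batch, seq_len, 2 * hidden_dim): per-token implicit syntax features
        return hidden

encoder = SyntaxEncoder()
sentence = torch.randn(1, 7, 100)             # dummy embeddings for a 7-word span
syntax_features = encoder(sentence).detach()  # extra input to the RST parser
print(syntax_features.shape)                  # torch.Size([1, 7, 400])
```

In this sketch the features are detached and simply concatenated to the discourse parser's own token representations; no discrete parse tree or Tree-RNN composition is needed.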

Original language: English
Title of host publication: COLING 2018 - 27th International Conference on Computational Linguistics, Proceedings
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Publisher: Association for Computational Linguistics (ACL)
Pages: 559-570
Number of pages: 12
ISBN (Electronic): 9781948087506
State: Published - 2018
Externally published: Yes
Event: 27th International Conference on Computational Linguistics, COLING 2018 - Santa Fe, United States
Duration: 20 Aug 2018 - 26 Aug 2018

Publication series

Name: COLING 2018 - 27th International Conference on Computational Linguistics, Proceedings

Conference

Conference: 27th International Conference on Computational Linguistics, COLING 2018
Country/Territory: United States
City: Santa Fe
Period: 20/08/18 - 26/08/18
