TY - GEN
T1 - Emotion Classification with Explicit and Implicit Syntactic Information
AU - Chen, Nan
AU - Xia, Qingrong
AU - Zhou, Xiabing
AU - Chen, Wenliang
AU - Zhang, Min
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Emotion classification has become a hot research topic in natural language processing due to its wide range of applications. Existing studies suffer from error propagation when using syntactic information for emotion classification, since parsers cannot produce perfect syntax trees. To address this problem, we propose a new approach that compares and combines different levels of syntactic information, making full use of it while alleviating error propagation. First, we use graph convolutional networks (GCN) to encode dependency trees, treating the probability matrix over all dependency arcs (an edge-weighted graph) as the GCN adjacency matrix. Next, we extract the hidden representations of the dependency parser's encoder as implicit syntactic representations, which directly avoid the error propagation problem. Finally, we fuse the two kinds of syntax-aware information and inject them into our baseline model as extra inputs. Experimental results further show that the explicit and implicit syntactic information improves the performance of a BERT-based system that is already much stronger than the baseline. In addition, we find that the syntactic knowledge BERT can express is limited, and the syntactic information from our model contributes more, making our model consistently outperform BERT across different sentence lengths.
AB - Emotion classification has become a hot research topic in natural language processing due to its wide range of applications. Existing studies suffer from error propagation when using syntactic information for emotion classification, since parsers cannot produce perfect syntax trees. To address this problem, we propose a new approach that compares and combines different levels of syntactic information, making full use of it while alleviating error propagation. First, we use graph convolutional networks (GCN) to encode dependency trees, treating the probability matrix over all dependency arcs (an edge-weighted graph) as the GCN adjacency matrix. Next, we extract the hidden representations of the dependency parser's encoder as implicit syntactic representations, which directly avoid the error propagation problem. Finally, we fuse the two kinds of syntax-aware information and inject them into our baseline model as extra inputs. Experimental results further show that the explicit and implicit syntactic information improves the performance of a BERT-based system that is already much stronger than the baseline. In addition, we find that the syntactic knowledge BERT can express is limited, and the syntactic information from our model contributes more, making our model consistently outperform BERT across different sentence lengths.
KW - BERT
KW - Emotion classification
KW - Syntactic information
UR - https://www.scopus.com/pages/publications/85118152901
U2 - 10.1007/978-3-030-88480-2_48
DO - 10.1007/978-3-030-88480-2_48
M3 - Conference contribution
AN - SCOPUS:85118152901
SN - 9783030884796
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 607
EP - 618
BT - Natural Language Processing and Chinese Computing - 10th CCF International Conference, NLPCC 2021, Proceedings
A2 - Wang, Lu
A2 - Feng, Yansong
A2 - Hong, Yu
A2 - He, Ruifang
PB - Springer Science and Business Media Deutschland GmbH
T2 - 10th CCF Conference on Natural Language Processing and Chinese Computing, NLPCC 2021
Y2 - 13 October 2021 through 17 October 2021
ER -