
Cross-Lingual Semantic Role Labeling with Model Transfer

  • Hao Fei*
  • Meishan Zhang
  • Fei Li
  • Donghong Ji

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Prior studies show that cross-lingual semantic role labeling (SRL) can be achieved by model transfer with the help of universal features. In this article, we fill a gap in cross-lingual SRL by proposing an end-to-end SRL model that incorporates a variety of universal features and transfer methods. We study both bilingual transfer and multi-source transfer, under gold or machine-generated syntactic inputs, pre-trained high-order abstract features, and contextualized multilingual word representations. Experimental results on the Universal Proposition Bank corpus indicate that the performance of cross-lingual SRL can vary with the choice of cross-lingual features. In addition, whether the features are gold-standard also has an impact on performance. Specifically, we find that gold syntax features are much more crucial for cross-lingual SRL than automatically generated ones. Moreover, universal dependency structure features provide the greatest benefit, and both pre-trained high-order features and contextualized word representations bring further significant improvements.

Original language: English
Article number: 9165903
Pages (from-to): 2427-2437
Number of pages: 11
Journal: IEEE/ACM Transactions on Audio Speech and Language Processing
Volume: 28
DOIs
State: Published - 2020
Externally published: Yes

Keywords

  • Natural language processing
  • cross-lingual transfer
  • model transfer
  • semantic role labeling (SRL)
