Abstract
Prior studies show that cross-lingual semantic role labeling (SRL) can be achieved by model transfer with the help of universal features. In this article, we fill a gap in cross-lingual SRL research by proposing an end-to-end SRL model that incorporates a variety of universal features and transfer methods. We study both bilingual transfer and multi-source transfer, under gold or machine-generated syntactic inputs, pre-trained high-order abstract features, and contextualized multilingual word representations. Experimental results on the Universal Proposition Bank corpus indicate that the performance of cross-lingual SRL varies with the cross-lingual features being leveraged, and that whether those features are gold-standard also has an impact. Specifically, we find that gold syntactic features are far more beneficial for cross-lingual SRL than automatically generated ones. Moreover, universal dependency structure features provide the greatest benefit, and both pre-trained high-order features and contextualized word representations bring further significant improvements.
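To make the model-transfer idea concrete, below is a minimal sketch, not the authors' implementation: an SRL tagging head is trained over a multilingual contextualized encoder on a source language, then applied unchanged to a target language. The encoder choice (`xlm-roberta-base`), the toy role label set, and the example sentence are all assumptions for illustration.

```python
# Hypothetical sketch of cross-lingual SRL via model transfer.
# Encoder name, label set, and input sentence are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")

NUM_ROLES = 5  # toy label set, e.g. O, A0, A1, A2, AM-TMP

class SRLTagger(nn.Module):
    """Token-level role classifier on top of a multilingual encoder."""
    def __init__(self, encoder, num_roles):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(encoder.config.hidden_size, num_roles)

    def forward(self, **inputs):
        # Contextualized multilingual representations let a head trained
        # on one language be reused zero-shot on another.
        hidden = self.encoder(**inputs).last_hidden_state
        return self.head(hidden)  # (batch, seq_len, num_roles)

tagger = SRLTagger(encoder, NUM_ROLES)

# After training the head on source-language SRL data, apply it
# directly to a target-language sentence (here German, zero-shot):
inputs = tokenizer("Der Hund jagt die Katze.", return_tensors="pt")
with torch.no_grad():
    role_logits = tagger(**inputs)
print(role_logits.argmax(-1))  # predicted role id per subword token
```

The paper's actual model additionally consumes universal dependency structures and pre-trained high-order features; this sketch shows only the shared-encoder transfer mechanism the abstract builds on.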
| Original language | English |
|---|---|
| Article number | 9165903 |
| Pages (from-to) | 2427-2437 |
| Number of pages | 11 |
| Journal | IEEE/ACM Transactions on Audio, Speech, and Language Processing |
| Volume | 28 |
| DOIs | |
| State | Published - 2020 |
| Externally published | Yes |
Keywords
- Natural language processing
- cross-lingual transfer
- model transfer
- semantic role labeling (SRL)