
MAG+: AN EXTENDED MULTIMODAL ADAPTATION GATE FOR MULTIMODAL SENTIMENT ANALYSIS

  • Xianbing Zhao*
  • Yixin Chen*
  • Wanting Li
  • Lei Gao
  • Buzhou Tang
  • *Corresponding author for this work
  • Harbin Institute of Technology Shenzhen
  • University College London
  • Peng Cheng Laboratory

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Human multimodal sentiment analysis is a challenging task that aims to extract and integrate information from multiple sources, such as language, acoustic, and visual signals. Recently, the multimodal adaptation gate (MAG), an attachment to transformer-based pre-trained language representation models such as BERT and XLNet, has shown state-of-the-art performance on multimodal sentiment analysis. However, MAG uses only a single-layer network to fuse multimodal information directly and does not attend to the relationships among different modalities. In this paper, we propose an extended MAG, called MAG+, to reinforce multimodal fusion. MAG+ contains two modules: multi-layer MAGs with modality reinforcement (M3R) and Adaptive Layer Aggregation (ALA). In each MAG-with-modality-reinforcement layer of M3R, every modality is first reinforced by all other modalities via crossmodal attention, and then all modalities are fused via MAG. The ALA module leverages the multimodal representations at both low and high layers to form the final multimodal representation. Like MAG, MAG+ is attached to BERT and XLNet. Experimental results on two widely used datasets demonstrate the efficacy of the proposed MAG+.
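To make the single-layer fusion step that MAG+ extends concrete, below is a minimal NumPy sketch of a MAG-style adaptation gate: the nonverbal (acoustic and visual) features are gated on the language representation, and the language vector is shifted by the gated nonverbal displacement before layer normalization. All dimensions, weight matrices, and variable names here are toy illustrations, not the paper's actual code or learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_norm(x, eps=1e-5):
    # Normalize the last dimension to zero mean / unit variance.
    mu = x.mean(-1, keepdims=True)
    sigma = x.std(-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def mag_fuse(h_l, h_a, h_v, W_ga, W_gv, W_a, W_v, beta=1.0):
    """One MAG-style fusion step (a sketch):
    gate each nonverbal modality on the language vector, sum the gated
    projections into a displacement H, then shift the language vector
    by a norm-bounded amount of H."""
    g_a = np.maximum(W_ga @ np.concatenate([h_l, h_a]), 0.0)  # acoustic gate
    g_v = np.maximum(W_gv @ np.concatenate([h_l, h_v]), 0.0)  # visual gate
    H = g_a * (W_a @ h_a) + g_v * (W_v @ h_v)                 # nonverbal shift
    # Scale the shift so it cannot dominate the language representation.
    alpha = min(np.linalg.norm(h_l) / (np.linalg.norm(H) + 1e-9) * beta, 1.0)
    return layer_norm(h_l + alpha * H)

d = 8                           # shared hidden size (toy)
h_l = rng.standard_normal(d)    # language token representation
h_a = rng.standard_normal(d)    # aligned acoustic features
h_v = rng.standard_normal(d)    # aligned visual features

# Random matrices stand in for learned projection weights.
W_ga = rng.standard_normal((d, 2 * d)) * 0.1
W_gv = rng.standard_normal((d, 2 * d)) * 0.1
W_a = rng.standard_normal((d, d)) * 0.1
W_v = rng.standard_normal((d, d)) * 0.1

fused = mag_fuse(h_l, h_a, h_v, W_ga, W_gv, W_a, W_v)
print(fused.shape)  # (8,)
```

M3R stacks layers of this kind, preceding each fusion with crossmodal attention so that every modality is reinforced by the others before gating; ALA then aggregates the per-layer fused representations rather than keeping only the top layer.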

Original language: English
Title of host publication: 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4753-4757
Number of pages: 5
ISBN (Electronic): 9781665405409
DOIs
State: Published - 2022
Externally published: Yes
Event: 2022 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2022 - Hybrid, Singapore
Duration: 22 May 2022 - 27 May 2022

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2022-May
ISSN (Print): 1520-6149

Conference

Conference: 2022 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2022
Country/Territory: Singapore
City: Hybrid
Period: 22/05/22 - 27/05/22

Keywords

  • BERT
  • Multimodal Fusion
  • Multimodal Sentiment Analysis

