Hypercomplex Graph Neural Network: Towards Deep Intersection of Multi-Modal Brain Networks

  • Yanwu Yang
  • Chenfei Ye
  • Guoqing Cai
  • Kunru Song
  • Jintao Zhang
  • Yang Xiang
  • Ting Ma*
  • *Corresponding author for this work
  • Harbin Institute of Technology Shenzhen
  • Peng Cheng Laboratory
  • Beijing Normal University

Research output: Contribution to journal › Article › peer-review

Abstract

Multi-modal neuroimaging studies have provided insights into the heteromodal relationships between brain network organization and behavioral phenotypes. Integrating data from multiple modalities facilitates the characterization of the interplay among anatomical, functional, and physiological brain alterations or developments. Graph Neural Networks (GNNs) have recently become popular for analyzing and fusing multi-modal, graph-structured brain networks. However, effectively learning complementary representations across modalities remains a significant challenge due to sophisticated and heterogeneous inter-modal dependencies. Furthermore, most existing studies focus on specific modalities (e.g., only fMRI and DTI), which limits their scalability to other types of brain networks. To overcome these limitations, we propose a HyperComplex Graph Neural Network (HC-GNN) that models multi-modal networks as hypercomplex tensor graphs. In our approach, HC-GNN is conceptualized as a dynamic spatial graph, where the attentively learned inter-modal associations are represented as the adjacency matrix. HC-GNN leverages hypercomplex operations for inter-modal intersection through cross-embedding and cross-aggregation, enriching the deep coupling of multi-modal representations. We further conduct a statistical analysis on the saliency maps to identify disease-associated biomarkers. Extensive experiments on three datasets demonstrate the superior classification performance of our method and its strong scalability to various types of modalities. Our work presents a powerful paradigm for the study of multi-modal brain networks.
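To give a concrete sense of the kind of hypercomplex operation the abstract refers to, the sketch below uses the quaternion Hamilton product (the simplest hypercomplex algebra) to mix four modality channels per brain-network node before neighborhood aggregation. This is an illustrative toy, not the authors' HC-GNN: the function names, the fixed weight quaternion, and the degree-normalized aggregation are assumptions, and the paper's attention-learned adjacency, cross-embedding, and cross-aggregation are not reproduced here.

```python
import numpy as np

def hamilton_product(q, p):
    """Quaternion (Hamilton) product of arrays with shape (..., 4).

    Each 4-vector holds one feature per modality; the product mixes
    all four components, which is the sense in which hypercomplex
    algebra couples modalities more tightly than channel-wise weights.
    """
    a1, b1, c1, d1 = np.moveaxis(q, -1, 0)
    a2, b2, c2, d2 = np.moveaxis(p, -1, 0)
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ], axis=-1)

def quaternion_gnn_layer(A, H, w):
    """One toy hypercomplex graph layer (illustrative only).

    A: (N, N) adjacency over brain regions.
    H: (N, 4) node features, one component per modality.
    w: (4,) quaternion weight shared across nodes (an assumption;
       a real model would learn per-layer quaternion weight matrices).
    """
    deg = A.sum(axis=1, keepdims=True) + 1e-8   # avoid division by zero
    A_norm = A / deg                            # row-normalized aggregation
    msg = hamilton_product(np.broadcast_to(w, H.shape), H)  # cross-modal mixing
    return np.tanh(A_norm @ msg)                # aggregate neighbors, nonlinearity
```

A useful sanity check is that the identity quaternion (1, 0, 0, 0) leaves features unchanged under the Hamilton product, so the layer then reduces to plain normalized neighborhood averaging with a tanh.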

Original language: English
Pages (from-to): 3304-3316
Number of pages: 13
Journal: IEEE Journal of Biomedical and Health Informatics
Volume: 29
Issue number: 5
DOIs
State: Published - 2025
Externally published: Yes

Keywords

  • Multi-modal graph neural network
  • brain
  • hypercomplex-GNN
  • neuroimage
  • tensor
