Abstract
Recent work on fMRI-based brain disorder diagnosis has established graph neural networks (GNNs) as state-of-the-art methods for brain network analysis. GNNs can be trained in either a transductive or an inductive setting. Transductive approaches, such as population graphs, treat each subject as a node and cast diagnosis as node classification; this line of work suffers from high computational cost and poor scalability to unseen data. Inductive methods, on the other hand, use only labeled data and may overfit and generalize poorly when trained on insufficient samples. To address these limitations, we propose a unified transductive-inductive network that leverages the properties of both learning frameworks. Our approach is implemented as a self-knowledge distillation architecture, in which predictions from a transductive population-graph network are distilled into an inductive network as a self-supervised regularization term. To preserve the topological properties of the transductive graph, i.e., inter-node similarity, we propose a topology-regularized self-knowledge distillation (Topo-KD) scheme to regularize the student model's learning. Evaluations on the ADNI dataset demonstrate the superiority of our approach in performance and scalability.
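As a rough illustration of the objective described above, the following PyTorch sketch combines a supervised cross-entropy term, a soft-label distillation term from the transductive (population-graph) teacher, and a topology term that matches pairwise inter-subject similarities between teacher and student embeddings. This is a minimal sketch under our own assumptions: the function name `topo_kd_loss`, the temperature `T`, and the weights `alpha`/`beta` are illustrative and do not reproduce the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def topo_kd_loss(student_logits, teacher_logits,
                 student_emb, teacher_emb,
                 labels, T=2.0, alpha=0.5, beta=0.1):
    """Hypothetical Topo-KD objective (illustrative, not the authors' code):
    cross-entropy on labeled subjects, soft-label distillation from the
    transductive teacher, and a topology term matching pairwise similarities."""
    # Supervised term on the labeled subjects.
    ce = F.cross_entropy(student_logits, labels)

    # Soft-label distillation; teacher predictions are treated as fixed targets.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * T * T

    # Topology regularization: preserve the inter-node (inter-subject)
    # similarity structure of the transductive graph in the inductive student.
    s_norm = F.normalize(student_emb, dim=1)
    t_norm = F.normalize(teacher_emb, dim=1)
    s_sim = s_norm @ s_norm.T
    t_sim = t_norm @ t_norm.T
    topo = F.mse_loss(s_sim, t_sim.detach())

    return ce + alpha * kd + beta * topo

# Toy usage with random tensors (8 subjects, 2 classes, 16-dim embeddings).
if __name__ == "__main__":
    torch.manual_seed(0)
    s_logits, t_logits = torch.randn(8, 2), torch.randn(8, 2)
    s_emb, t_emb = torch.randn(8, 16), torch.randn(8, 16)
    y = torch.randint(0, 2, (8,))
    print(topo_kd_loss(s_logits, t_logits, s_emb, t_emb, y))
```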
| Original language | English |
|---|---|
| Pages (from-to) | 4045-4049 |
| Number of pages | 5 |
| Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing |
| State | Published - 2024 |
| Externally published | Yes |
| Event | 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024, Seoul, Korea, Republic of; 14 Apr 2024 → 19 Apr 2024 |
Keywords
- Graph neural network
- Inductive learning
- Neuroimage
- Transductive learning