HpLapGCN: Hypergraph p-Laplacian graph convolutional networks

  • Sichao Fu
  • Weifeng Liu*
  • Yicong Zhou
  • Liqiang Nie

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Representation learning on graphs has proved to be a significant technique for extracting features from graph-structured data. In recent years, many graph representation learning (GRL) algorithms, such as Laplacian Eigenmaps (LE), Node2vec, and graph convolutional networks (GCN), have been reported and have achieved great success on node classification tasks. The most representative of these, GCN, fuses the feature information and structure information of data, aiming to generalize convolutional neural networks (CNN) to learn features from data with arbitrary structure. However, how to express the structure information of data exactly remains an enormous challenge. In this paper, we utilize the hypergraph p-Laplacian to preserve the local geometry of samples and propose an effective variant of GCN, i.e., hypergraph p-Laplacian graph convolutional networks (HpLapGCN). Since the hypergraph p-Laplacian is a generalization of the graph Laplacian, the HpLapGCN model shows great potential to learn more representative data features. In particular, we derive a first-order approximation of spectral hypergraph p-Laplacian convolutions, which yields a more efficient layer-wise aggregation rule. Extensive experimental results on the Citeseer and Cora datasets show that our proposed model achieves better performance compared with GCN and p-Laplacian GCN (pLapGCN).
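The abstract's layer-wise aggregation rule is not spelled out here, and the paper's exact hypergraph p-Laplacian propagation is a generalization beyond what follows. As an illustrative sketch only, assuming the standard normalized hypergraph Laplacian (the p = 2 special case, in the style of Zhou et al.) and a GCN-style rule H^(l+1) = ReLU(P H^(l) W^(l)), a single propagation layer might look like:

```python
import numpy as np

def hypergraph_laplacian(incidence, edge_weights):
    """Normalized hypergraph Laplacian for the p = 2 case:
    L = I - D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2},
    where H is the |V| x |E| incidence matrix."""
    h = incidence
    w = np.diag(edge_weights)                     # diagonal hyperedge weights
    d_e_inv = np.diag(1.0 / h.sum(axis=0))        # inverse hyperedge degrees
    d_v = h @ edge_weights                        # vertex degrees
    d_v_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    theta = d_v_inv_sqrt @ h @ w @ d_e_inv @ h.T @ d_v_inv_sqrt
    return np.eye(h.shape[0]) - theta

def layer(features, propagation, weight):
    """One layer of a GCN-style aggregation rule:
    H^(l+1) = ReLU(P H^(l) W^(l))."""
    return np.maximum(propagation @ features @ weight, 0.0)

# Toy hypergraph: 4 vertices, 2 hyperedges ({0,1,2} and {2,3}).
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
L = hypergraph_laplacian(H, np.ones(2))
P = np.eye(4) - L          # propagation operator (the smoothing part of L)
X = np.random.default_rng(0).normal(size=(4, 3))  # node features
W = np.eye(3)              # placeholder trainable weight matrix
out = layer(X, P, W)
print(out.shape)  # (4, 3)
```

The names `hypergraph_laplacian` and `layer` are hypothetical; a general p would replace this quadratic Laplacian with a p-Laplacian operator, and the paper's contribution is a first-order approximation that keeps the per-layer cost low.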

Original language: English
Pages (from-to): 166-174
Number of pages: 9
Journal: Neurocomputing
Volume: 362
DOIs
State: Published - 14 Oct 2019
Externally published: Yes

Keywords

  • Graph convolutional networks
  • Hypergraph
  • p-Laplacian
