Global disentangled graph convolutional neural network based on a graph topological metric

Publication Name

Knowledge-Based Systems

Abstract

Graph convolutional networks (GCNs) are powerful tools for analyzing structured data, relying on message passing between a node and its surrounding nodes, and they exhibit exceptional capabilities in diverse complex graph learning tasks. However, although GCNs can incorporate information from entities, they often neglect the structural connections between entities generated by latent factors. In this study, we propose a global disentangled graph convolutional neural network based on a graph topological metric to identify these latent factors and perform graph-level disentanglement learning. In the proposed framework, a simple graph is accepted as input and disentangled into several factorized graphs, each of which represents a latent factor and the disentangled relationships among the nodes. Specifically, our approach decouples the message passing process in GCNs into two distinct flows: a feature information flow and a structural information flow. Importantly, a topological metric, named mean average distance, is introduced to promote disentanglement among the factor graphs. Furthermore, we utilize the Jensen–Shannon mutual information (MI) estimator to promote disentanglement through the feature information flow. Experiments on synthetic and real-world datasets demonstrate the superiority of our framework over state-of-the-art graph neural networks. This work introduces a novel approach that preserves independence among latent factors while ensuring that each factor maintains a consistent and interpretable meaning. We anticipate that this research will provide theoretical and technical analysis to further advance the understanding of graph disentanglement learning.
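As a rough illustration of the ideas summarized in the abstract, the sketch below factorizes a simple graph into several per-factor (factorized) graphs, passes messages independently within each factor channel, and adds a Jensen–Shannon MI-style penalty between the channels. This is a minimal, hypothetical PyTorch sketch of the generic technique only, not the authors' implementation: the class and function names, the fixed dot-product critic, and the number of factors are assumptions, and the paper's mean-average-distance metric and full architecture are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DisentangledGraphConv(nn.Module):
    """Split node features into K factor channels, build one soft (factorized)
    adjacency per channel from the input graph, and pass messages independently
    in each channel. Hypothetical sketch, not the paper's exact layer."""

    def __init__(self, in_dim: int, out_dim: int, num_factors: int = 4):
        super().__init__()
        assert out_dim % num_factors == 0
        self.K = num_factors
        self.d = out_dim // num_factors
        self.proj = nn.ModuleList([nn.Linear(in_dim, self.d) for _ in range(self.K)])

    def forward(self, x, adj):
        # x: [N, in_dim] node features; adj: [N, N] adjacency of the input simple graph
        feats, factor_graphs = [], []
        for k in range(self.K):
            h = F.relu(self.proj[k](x))                    # per-factor node features [N, d]
            score = h @ h.t() / self.d ** 0.5              # feature-similarity edge scores
            a_k = torch.softmax(score.masked_fill(adj == 0, float("-inf")), dim=-1)
            a_k = torch.nan_to_num(a_k)                    # zero out rows of isolated nodes
            feats.append(a_k @ h)                          # message passing within factor k
            factor_graphs.append(a_k)                      # the k-th factorized graph
        return torch.cat(feats, dim=-1), factor_graphs


def jsd_independence_penalty(z, num_factors):
    """Deep-InfoMax-style Jensen-Shannon MI estimate between factor channels:
    same-node channel pairs act as joint samples, shuffled pairs as samples from
    the product of marginals. Minimizing the estimate w.r.t. the encoder
    discourages dependence between factors (simplified stand-in with a fixed
    dot-product critic, not the paper's estimator)."""
    n = z.size(0)
    chunks = z.chunk(num_factors, dim=-1)
    mi = z.new_zeros(())
    for i in range(num_factors):
        for j in range(i + 1, num_factors):
            pos = (chunks[i] * chunks[j]).sum(-1)                      # joint pairs
            neg = (chunks[i] * chunks[j][torch.randperm(n)]).sum(-1)   # shuffled pairs
            mi = mi + (-F.softplus(-pos)).mean() - F.softplus(neg).mean()
    return mi


# Toy usage: one layer on a random 6-node graph
x = torch.randn(6, 16)
adj = (torch.rand(6, 6) > 0.5).float()
layer = DisentangledGraphConv(16, 32, num_factors=4)
z, graphs = layer(x, adj)
loss = jsd_independence_penalty(z, num_factors=4)   # added to the task loss during training
```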

Open Access Status

This publication is not available as open access

Volume

284

Article Number

111283

Funding Number

2022LYKFYB01

Funding Sponsor

National Natural Science Foundation of China

Link to publisher version (DOI)

http://dx.doi.org/10.1016/j.knosys.2023.111283