The Vapnik-Chervonenkis dimension of graph and recursive neural networks

RIS ID

130299

Publication Details

Scarselli, F., Tsoi, A. C., & Hagenbuchner, M. (2018). The Vapnik-Chervonenkis dimension of graph and recursive neural networks. Neural Networks, 108, 248-259.

Abstract

The Vapnik-Chervonenkis dimension (VC-dim) characterizes the sample complexity of a classification model and is often used as an indicator of the generalization capability of a learning method. The VC-dim has been studied for common feed-forward neural networks, but not yet for Graph Neural Networks (GNNs) and Recursive Neural Networks (RecNNs). This paper provides upper bounds on the order of growth of the VC-dim of GNNs and RecNNs. GNNs and RecNNs belong to a class of neural network models capable of processing inputs given as graphs. A graph is a data structure that generalizes the representational power of vectors and sequences through its ability to represent dependencies or relationships between feature vectors. It was shown previously that the ability of recurrent neural networks to process sequences increases the VC-dim compared to that of feed-forward neural networks, which are limited to processing vectors. Since graphs are more general than sequences, the question arises of how graph inputs affect the VC-dimension of GNNs and RecNNs. A main finding of this paper is that the upper bounds on the VC-dim of GNNs and RecNNs are comparable to the upper bounds for recurrent neural networks. The results also suggest that the generalization capability of such models increases with the number of connected nodes.
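
For context, and not as a result from the paper itself, the abstract's statement that the VC-dim indicates generalization capability can be made concrete with the classical Vapnik-Chervonenkis generalization bound. The sketch below gives its standard textbook form for binary classifiers; the symbols (VC dimension d, sample size m, confidence level delta, true risk R(h), empirical risk \hat{R}(h)) are the usual ones and are not notation taken from this paper.

% Standard VC generalization bound (Vapnik); textbook notation, not the paper's.
% d = VC dimension of the hypothesis class, m = number of i.i.d. training samples,
% 1 - \delta = confidence level, R(h) = true risk, \hat{R}(h) = empirical risk.
\[
  R(h) \;\le\; \hat{R}(h)
    + \sqrt{\frac{d\left(\ln\frac{2m}{d} + 1\right) + \ln\frac{4}{\delta}}{m}}
    \qquad \text{with probability at least } 1 - \delta .
\]

In this form, a larger VC-dim d widens the gap between true and empirical risk for a fixed sample size m, which is why upper bounds on the VC-dim of GNNs and RecNNs translate into statements about how many training samples such models need.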

Link to publisher version (DOI)

http://dx.doi.org/10.1016/j.neunet.2018.08.010