Open Access
ABCDE: Approximating Betweenness-Centrality ranking with progressive-DropEdge
Author(s) - Martin Mirakyan
Publication year - 2021
Publication title - PeerJ Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.806
H-Index - 24
ISSN - 2376-5992
DOI - 10.7717/peerj-cs.699
Subject(s) - betweenness centrality, centrality, computer science, theoretical computer science, graph, inference, node (physics), computation, network analysis, algorithm, artificial intelligence, mathematics, combinatorics, physics, structural engineering, quantum mechanics, engineering
Betweenness-centrality is a popular measure in network analysis that describes the importance of a node in a graph as the fraction of shortest paths passing through it; it is a key measure in many applications, including community detection and network dismantling. Computing betweenness-centrality for every node in a graph requires an excessive amount of computing power, especially for large graphs. On the other hand, many applications are mainly interested in finding the top-k most important nodes, so several approximation algorithms have been proposed to solve the problem faster. Some recent approaches use shallow graph convolutional networks to approximate the top-k nodes with the highest betweenness-centrality scores. This work presents a deep graph convolutional neural network that outputs a rank score for each node in a given graph. With careful optimization and regularization tricks, including an extended version of DropEdge named Progressive-DropEdge, the system achieves better results than current approaches. Experiments on both real-world and synthetic datasets show that the presented algorithm is an order of magnitude faster at inference and requires several times fewer resources and less time to train.
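To make the quantity being approximated concrete: the abstract defines betweenness-centrality via the fraction of shortest paths passing through a node. The sketch below is a minimal, self-contained illustration (not the paper's method) that computes exact unweighted betweenness with Brandes' algorithm and picks the top-ranked node on a small path graph; the graph and all function names are illustrative. The exact computation is O(V·E) per graph, which is exactly the cost the paper's learned ranking model avoids at inference time.

```python
from collections import deque

def betweenness(adj):
    """Exact betweenness-centrality via Brandes' algorithm for an
    unweighted graph given as {node: [neighbors]}."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS from s, counting the number of shortest paths (sigma).
        stack = []
        pred = {v: [] for v in adj}          # predecessors on shortest paths
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:              # first visit
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:   # w is one step further than v
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Accumulate dependencies in order of decreasing distance from s.
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc  # for undirected graphs each pair is counted in both directions

# Path graph 0-1-2-3-4: the middle node sits on the most shortest paths.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
scores = betweenness(adj)
top = max(scores, key=scores.get)  # top-1 node by betweenness
```

Running this ranks node 2 first, matching the intuition that central nodes dominate the top-k list the paper's model is trained to predict.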
