DeepNC: Deep generative Network Completion

Bibliographic Details
Main Authors: Tran, Cong; Shin, Won-Yong; Spitz, Andreas; Gertz, Michael
Format: Article (Journal)
Language: English
Published: 2022
In: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2022, Volume: 44, Issue: 4, Pages: 1837-1852
ISSN: 1939-3539
DOI: 10.1109/TPAMI.2020.3032286
Online Access: Publisher, license required, full text: https://doi.org/10.1109/TPAMI.2020.3032286
Author Notes: Cong Tran, Student Member, IEEE; Won-Yong Shin, Senior Member, IEEE; Andreas Spitz; and Michael Gertz
Description
Summary: Most network data are collected from partially observable networks with both missing nodes and missing edges, for example, due to limited resources and privacy settings specified by users on social media. Thus, it stands to reason that inferring the missing parts of the networks by performing network completion should precede downstream applications. However, despite this need, the recovery of missing nodes and edges in such incomplete networks is an insufficiently explored problem due to the modeling difficulty; it is much more challenging than link prediction, which infers only missing edges. In this paper, we present DeepNC, a novel method for inferring the missing parts of a network based on a deep generative model of graphs. Specifically, our method first learns a likelihood over edges via an autoregressive generative model, and then identifies the graph that maximizes the learned likelihood conditioned on the observable graph topology. Moreover, we propose a computationally efficient DeepNC algorithm that consecutively finds individual nodes that maximize the probability in each node generation step, as well as an enhanced version using the expectation-maximization algorithm. The runtime complexities of both algorithms are shown to be almost linear in the number of nodes in the network. We empirically demonstrate the superiority of DeepNC over state-of-the-art network completion approaches.
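For intuition only, the following is a minimal sketch of the greedy, node-by-node completion idea described in the summary. It is not the authors' released implementation: the function edge_probs is a hypothetical stand-in for a trained autoregressive graph generator, and the per-edge thresholding is a simplification of choosing, at each generation step, the node placement that maximizes the learned edge likelihood while keeping the observed topology fixed.

import numpy as np

def edge_probs(adj_prefix):
    """Stand-in for a trained autoregressive graph model (hypothetical).
    Returns, for the next node, a Bernoulli edge probability toward each
    already-placed node; here a toy heuristic favoring high-degree nodes."""
    deg = adj_prefix.sum(axis=1) + 1.0
    return 0.6 * deg / deg.max()

def greedy_complete(observed, n_total):
    """Greedy node-by-node completion: the observed adjacency (the first
    observed.shape[0] nodes) is kept fixed; edges touching the remaining
    missing nodes are set to the choice with higher model likelihood."""
    n_obs = observed.shape[0]
    adj = np.zeros((n_total, n_total), dtype=int)
    adj[:n_obs, :n_obs] = observed            # observed topology stays fixed
    for i in range(1, n_total):
        p = edge_probs(adj[:i, :i])           # edge probabilities for node i
        for j in range(i):
            if i < n_obs and j < n_obs:       # edge already known from observation
                continue
            adj[i, j] = adj[j, i] = int(p[j] >= 0.5)   # unobserved: greedy choice
    return adj

observed = np.array([[0, 1, 0],
                     [1, 0, 1],
                     [0, 1, 0]])
print(greedy_complete(observed, n_total=5))

The EM-based variant mentioned in the summary would additionally re-estimate the alignment between generated and observed nodes and repeat the greedy step; that refinement is omitted from this sketch.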
Item Description: Date of publication 19 Oct. 2020
Viewed on 5 Apr. 2022
Physical Description: Online Resource