Flattened Graph Convolutional Networks For Recommendation

25 Sep 2022 · Yue Xu, Hao Chen, Zengde Deng, Yuanchen Bei, Feiran Huang

Graph Convolutional Networks (GCNs) and their variants have achieved significant performance on various recommendation tasks. However, many existing GCN models perform recursive aggregations over all related nodes, which incurs a severe computational burden and hinders their application to large-scale recommendation tasks. To this end, this paper proposes the flattened GCN (FlatGCN) model, which achieves superior performance with remarkably lower complexity than existing models. Our main contribution is three-fold. First, we propose a simplified yet powerful GCN architecture that aggregates neighborhood information with a single flattened GCN layer instead of recursive aggregation. The aggregation step in FlatGCN is parameter-free, so it can be pre-computed in parallel to save memory and computational cost. Second, we propose a neighbor-infomax sampling method that selects the most informative neighbors by measuring the correlation among neighboring nodes with a principled metric. Third, we propose a layer ensemble technique that improves the expressiveness of the learned representations by assembling the layer-wise neighborhood representations at the final layer. Extensive experiments on three datasets verify that the proposed model outperforms existing GCN models considerably and yields up to several orders of magnitude speedup in training efficiency.
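The flattened, parameter-free aggregation described in the abstract lends itself to a simple pre-computation: hop-wise neighborhood representations are computed once from a normalized adjacency matrix and then assembled at the final layer. The sketch below is an illustrative reconstruction of that idea, not the authors' implementation; the inputs (`adj`, `emb`), the function names, and the use of plain concatenation for the layer ensemble are assumptions, and the neighbor-infomax sampling step is omitted.

```python
# Illustrative sketch of a flattened, parameter-free aggregation with a layer
# ensemble, in the spirit of the abstract above. Not the authors' code; the
# inputs (`adj`, `emb`) and function names are hypothetical, and the
# neighbor-infomax sampling step is omitted.
import numpy as np
import scipy.sparse as sp


def normalize_adj(adj: sp.csr_matrix) -> sp.csr_matrix:
    """Symmetric normalization D^{-1/2} A D^{-1/2}, standard for GCN-style propagation."""
    deg = np.asarray(adj.sum(axis=1)).flatten()
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.power(deg, -0.5)
    d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.0
    return sp.diags(d_inv_sqrt) @ adj @ sp.diags(d_inv_sqrt)


def flat_aggregate(adj: sp.csr_matrix, emb: np.ndarray, num_hops: int = 2):
    """Parameter-free aggregation: hop-wise neighborhood representations are
    pre-computed once, with no trainable weights and no recursion at training time."""
    norm_adj = normalize_adj(adj)
    layers = [emb]
    x = emb
    for _ in range(num_hops):
        x = norm_adj @ x           # propagate embeddings one more hop
        layers.append(x)
    return layers                  # one (num_nodes, dim) array per hop


def layer_ensemble(layers):
    """Layer ensemble: assemble the layer-wise representations at the final layer
    (plain concatenation shown here as one simple choice)."""
    return np.concatenate(layers, axis=1)


# Usage: pre-compute once, then feed the result to a lightweight predictor.
# adj = sp.load_npz("user_item_graph.npz")   # hypothetical input files
# emb = np.load("initial_embeddings.npy")
# final_repr = layer_ensemble(flat_aggregate(adj, emb, num_hops=2))
```

Because the aggregation has no trainable parameters, this step runs once before training and can be parallelized over nodes, which is the source of the efficiency gain claimed in the abstract.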
