no code implementations • 29 Sep 2021 • Ziang Zhou, Jieming Shi, Shengzhong Zhang, Zengfeng Huang, Qing Li
Therefore, we propose an effective framework, Stabilized self-training with Negative sampling (SN), which can be applied to existing GNNs to stabilize the training process and enhance the training data, and consequently boost classification accuracy on graphs with few labeled nodes.
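A generic self-training round can be sketched as follows — this is an illustrative pseudo-labeling loop under my own assumptions (the function name, confidence threshold, and data layout are all hypothetical), not the paper's SN framework:

```python
# Minimal self-training sketch (illustrative only, NOT the SN method):
# a trained classifier assigns pseudo-labels to unlabeled nodes, and only
# the most confident predictions are added to the training set.

def self_training_round(probs, labeled, labels, threshold=0.9):
    """probs: per-node class-probability lists; labeled: set of node ids
    with ground-truth labels; labels: dict node id -> class.
    Returns the label set augmented with confident pseudo-labels."""
    augmented = dict(labels)
    for node, p in enumerate(probs):
        if node in labeled:
            continue                      # keep ground-truth labels as-is
        conf = max(p)
        if conf >= threshold:             # accept only confident predictions
            augmented[node] = p.index(conf)
    return augmented

probs = [[0.95, 0.05], [0.55, 0.45], [0.10, 0.90]]
aug = self_training_round(probs, labeled={0}, labels={0: 0})
# node 2 gets pseudo-label 1; node 1 is too uncertain and stays unlabeled
```

In this toy run only node 2 clears the threshold, so the training set grows from one labeled node to two.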
no code implementations • NeurIPS 2020 • Haishan Ye, Ziang Zhou, Luo Luo, Tong Zhang
In this paper, we propose a new method which establishes the optimal computational complexity and a near optimal communication complexity.
1 code implementation • 30 Jun 2020 • Shengzhong Zhang, Zengfeng Huang, Haicang Zhou, Ziang Zhou
A key to the success of such contrastive learning methods is how positive and negative samples are drawn.
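To make the sampling idea concrete, here is a minimal InfoNCE-style sketch — a generic contrastive setup, not this paper's specific sampling scheme; the embeddings, similarity function, and temperature are illustrative assumptions:

```python
# Illustrative contrastive loss for one anchor node: the positive (another
# "view" of the same node, e.g. from graph augmentation) is pulled close,
# while randomly sampled negatives (other nodes) are pushed apart.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def info_nce(anchor, positive, negatives, sim=dot, tau=0.5):
    """InfoNCE loss for a single anchor; tau is the temperature."""
    pos = math.exp(sim(anchor, positive) / tau)
    neg = sum(math.exp(sim(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))

anchor    = [1.0, 0.0]
positive  = [0.9, 0.1]                 # similar to the anchor
negatives = [[-1.0, 0.0], [0.0, 1.0]]  # dissimilar sampled nodes
loss = info_nce(anchor, positive, negatives)
```

The loss shrinks as the positive's similarity grows relative to the negatives', which is why the choice of negatives directly shapes what the encoder learns.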
no code implementations • 2 May 2020 • Haishan Ye, Luo Luo, Ziang Zhou, Tong Zhang
This paper considers the decentralized convex optimization problem, which has a wide range of applications in large-scale machine learning, sensor networks, and control theory.
1 code implementation • 7 Oct 2019 • Ziang Zhou, Jieming Shi, Shengzhong Zhang, Zengfeng Huang, Qing Li
However, in extreme cases where very few labels are available (e.g., 1 labeled node per class), GNNs suffer severe performance degradation.
no code implementations • ICLR 2019 • Shengzhong Zhang, Ziang Zhou, Zengfeng Huang, Zhongyu Wei
We consider the fundamental problem of semi-supervised node classification in attributed graphs, with a focus on few-shot learning.