no code implementations • ECCV 2020 • Jian Gao, Yang Hua, Guosheng Hu, Chi Wang, Neil M. Robertson
Distributional uncertainty arises broadly in real-world applications, one form of which is domain discrepancy.
1 code implementation • 23 Mar 2024 • Jianqing Zhang, Yang Liu, Yang Hua, Jian Cao
Heterogeneous Federated Learning (HtFL) enables collaborative learning on multiple clients with different model architectures while preserving privacy.
no code implementations • 17 Mar 2024 • Xiaoyu Wu, Yang Hua, Chumeng Liang, Jiaru Zhang, Hao Wang, Tao Song, Haibing Guan
In response, we present Contrasting Gradient Inversion for Diffusion Models (CGI-DM), a novel method featuring vivid visual representations for digital copyright authentication.
1 code implementation • 14 Feb 2024 • Guanxiong Sun, Yang Hua, Guosheng Hu, Neil Robertson
Based on the analysis, we present a simple yet efficient framework to address the computational bottlenecks and achieve efficient one-stage VOD by exploiting the temporal consistency in video frames.
1 code implementation • 14 Feb 2024 • Guanxiong Sun, Yang Hua, Guosheng Hu, Neil Robertson
Deep video models, for example, 3D CNNs or video transformers, have achieved promising performance on sparse video tasks, i.e., predicting one result per video.
2 code implementations • ICCV 2023 • Guanxiong Sun, Chi Wang, Zhaoyu Zhang, Jiankang Deng, Stefanos Zafeiriou, Yang Hua
Then, these video prompts are prepended to the patch embeddings of the current frame as the updated input for video feature extraction.
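The prompt-prepending step described above can be sketched as follows; the tensor shapes, prompt count, and function name are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def prepend_video_prompts(patch_embeddings, video_prompts):
    """Concatenate video prompts before the current frame's patch
    embeddings to form the updated transformer input."""
    return np.concatenate([video_prompts, patch_embeddings], axis=0)

patches = np.random.randn(196, 768)  # e.g. 14x14 patches of one frame
prompts = np.random.randn(8, 768)    # prompts carrying temporal context
updated = prepend_video_prompts(patches, prompts)  # shape (204, 768)
```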
1 code implementation • 18 Jan 2024 • Guanxiong Sun, Yang Hua, Guosheng Hu, Neil Robertson
However, we argue that these memory structures are not efficient or sufficient because of two implied operations: (1) concatenating all features in memory for enhancement, leading to a heavy computational cost; (2) frame-wise memory updating, preventing the memory from capturing more temporal information.
1 code implementation • 6 Jan 2024 • Jianqing Zhang, Yang Liu, Yang Hua, Jian Cao
To reduce the high communication cost of transmitting model parameters, a major challenge in HtFL, prototype-based HtFL methods are proposed to share only class representatives, a.k.a. prototypes, among heterogeneous clients while maintaining the privacy of clients' models.
no code implementations • 19 Dec 2023 • Peishen Yan, Hao Wang, Tao Song, Yang Hua, Ruhui Ma, Ningxin Hu, Mohammad R. Haghighat, Haibing Guan
Specifically, the FL server applies parameter-level masks to model updates uploaded by clients and trains the masks over a small clean dataset (i.e., a root dataset) to learn the subtle difference between benign and malicious model updates in a high-dimensional space.
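A minimal sketch of what a parameter-level mask applied during aggregation might look like; the sigmoid gating, the toy values, and the aggregation rule are assumptions for illustration, not the paper's actual defence.

```python
import numpy as np

def masked_aggregate(client_updates, mask_logits):
    """Apply a learned parameter-level mask (sigmoid of trainable logits)
    to every client's model update, then average the masked updates."""
    mask = 1.0 / (1.0 + np.exp(-mask_logits))  # one value per parameter, in (0, 1)
    return np.mean([u * mask for u in client_updates], axis=0)

# Toy example: the second client's update looks poisoned in coordinate 2,
# and a mask (here hand-set, in practice trained on the root dataset)
# has learned to suppress that coordinate.
updates = [np.ones(4), np.array([1.0, 1.0, 10.0, 1.0])]
logits = np.array([5.0, 5.0, -5.0, 5.0])
agg = masked_aggregate(updates, logits)  # coordinate 2 is strongly damped
```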
1 code implementation • 8 Dec 2023 • Jianqing Zhang, Yang Liu, Yang Hua, Hao Wang, Tao Song, Zhengui Xue, Ruhui Ma, Jian Cao
Amid the ongoing advancements in Federated Learning (FL), a machine learning paradigm that allows collaborative learning with data privacy protection, personalized FL (pFL) has gained significant prominence as a research direction within the FL domain.
2 code implementations • NeurIPS 2023 • Jianqing Zhang, Yang Hua, Jian Cao, Hao Wang, Tao Song, Zhengui Xue, Ruhui Ma, Haibing Guan
Recently, federated learning (FL) is popular for its privacy-preserving and collaborative learning abilities.
3 code implementations • ICCV 2023 • Jianqing Zhang, Yang Hua, Hao Wang, Tao Song, Zhengui Xue, Ruhui Ma, Jian Cao, Haibing Guan
Federated Learning (FL) is popular for its privacy-preserving and collaborative learning capabilities.
no code implementations • 8 Aug 2023 • Haomin Zhuang, Mingxian Yu, Hao Wang, Yang Hua, Jian Li, Xu Yuan
Federated learning (FL) has been widely deployed to enable machine learning training on sensitive data across distributed devices.
3 code implementations • 1 Jul 2023 • Jianqing Zhang, Yang Hua, Hao Wang, Tao Song, Zhengui Xue, Ruhui Ma, Haibing Guan
To address this, we propose the Federated Conditional Policy (FedCP) method, which generates a conditional policy for each sample to separate the global information and personalized information in its features and then processes them by a global head and a personalized head, respectively.
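The per-sample split into global and personalized information could be gated as in the following sketch; the soft sigmoid gate and the example values are assumptions, since the entry does not specify how FedCP's conditional policy is parameterized.

```python
import numpy as np

def conditional_route(feature, policy_scores):
    """Soft per-sample policy: gate each feature dimension into a global
    part and a personalized part (their sum recovers the feature)."""
    gate = 1.0 / (1.0 + np.exp(-policy_scores))  # values in (0, 1)
    return gate * feature, (1.0 - gate) * feature

feat = np.array([0.5, -1.2, 3.0, 0.1])
scores = np.array([2.0, -2.0, 0.0, 4.0])  # hypothetical policy output for one sample
g, p = conditional_route(feat, scores)
# g would be processed by the global head, p by the personalized head
```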
1 code implementation • 9 Feb 2023 • Chumeng Liang, Xiaoyu Wu, Yang Hua, Jiaru Zhang, Yiming Xue, Tao Song, Zhengui Xue, Ruhui Ma, Haibing Guan
Recently, Diffusion Models (DMs) have sparked a wave in AI for Art yet raise new copyright concerns, where infringers benefit from using unauthorized paintings to train DMs to generate novel paintings in a similar style.
2 code implementations • 2 Dec 2022 • Jianqing Zhang, Yang Hua, Hao Wang, Tao Song, Zhengui Xue, Ruhui Ma, Haibing Guan
A key challenge in federated learning (FL) is the statistical heterogeneity that impairs the generalization of the global model on each client.
1 code implementation • 30 Jun 2022 • Xinshao Wang, Yang Hua, Elyor Kodirov, Sankha Subhra Mukherjee, David A. Clifton, Neil M. Robertson
For issue (2), the effectiveness of ProSelfLC serves as a defence of entropy minimisation.
no code implementations • 18 Apr 2022 • Yanchao Yuan, Cancheng Li, Lu Xu, Ke Zhang, Yang Hua, Jicong Zhang
Test results show that the proposed method with the Dice loss function yields a Dice value of 0.820, an IoU of 0.701, an Acc of 0.969, and a modified Hausdorff distance (MHD) of 1.43 for 30 vulnerable plaque cases; it outperforms several conventional CNN-based methods on these metrics.
no code implementations • 30 Dec 2021 • JunKyu Lee, Lev Mukhanov, Amir Sabbagh Molahosseini, Umar Minhas, Yang Hua, Jesus Martinez del Rincon, Kiril Dichev, Cheol-Ho Hong, Hans Vandierendonck
Deep learning is pervasive in our daily life, including self-driving cars, virtual assistants, social network services, healthcare services, face recognition, etc.
1 code implementation • CVPR 2021 • Jiaru Zhang, Yang Hua, Zhengui Xue, Tao Song, Chengyu Zheng, Ruhui Ma, Haibing Guan
Bayesian neural networks have been widely used in many applications because of the distinctive probabilistic representation framework.
1 code implementation • 25 May 2021 • Yanran Wu, Xiangtai Li, Chen Shi, Yunhai Tong, Yang Hua, Tao Song, Ruhui Ma, Haibing Guan
Motivated by this, we propose a novel network by aligning two-path information into each other through a learned flow field.
1 code implementation • ICCV 2021 • Yuxin Ma, Yang Hua, Hanming Deng, Tao Song, Hao Wang, Zhengui Xue, Heng Cao, Ruhui Ma, Haibing Guan
Vessel segmentation is critically important for diagnosing a series of diseases, e.g., coronary artery disease and retinal disease.
no code implementations • 21 Jun 2020 • Shi-Yang Yan, Yang Hua, Neil M. Robertson
We tackle this problem by proposing an off-policy RL learning algorithm where a behaviour policy represented by GRUs performs the sampling.
5 code implementations • CVPR 2021 • Xinshao Wang, Yang Hua, Elyor Kodirov, David A. Clifton, Neil M. Robertson
Keywords: entropy minimisation, maximum entropy, confidence penalty, self knowledge distillation, label correction, label noise, semi-supervised learning, output regularisation
no code implementations • 21 Apr 2020 • Shi-Yang Yan, Yang Hua, Neil Robertson
Furthermore, to enable the ParaCNN to model paragraphs comprehensively, we also propose an adversarial twin net training scheme.
no code implementations • 7 Feb 2020 • Yihan Du, Yan Yan, Si Chen, Yang Hua
This strategy efficiently filters out some irrelevant proposals and avoids the redundant computation for feature extraction, which enables our method to operate faster than conventional classification-based tracking methods.
no code implementations • 22 Nov 2019 • Xinshao Wang, Elyor Kodirov, Yang Hua, Neil Robertson
Loss functions play a crucial role in deep metric learning; thus, a variety of them have been proposed.
1 code implementation • 20 Nov 2019 • Xinshao Wang, Elyor Kodirov, Yang Hua, Neil M. Robertson
This way it can prevent overfitting to trivial images, and alleviate the influence of outliers.
no code implementations • 25 Sep 2019 • Xinshao Wang, Yang Hua, Elyor Kodirov, Neil M. Robertson
It is fundamental and challenging to train robust and accurate Deep Neural Networks (DNNs) when semantically abnormal examples exist.
3 code implementations • 27 May 2019 • Xinshao Wang, Elyor Kodirov, Yang Hua, Neil M. Robertson
By DM, we connect the design of loss function and example weighting together.
Ranked #30 on Image Classification on Clothing1M (using extra training data)
3 code implementations • 28 Mar 2019 • Xinshao Wang, Yang Hua, Elyor Kodirov, Neil M. Robertson
In this work, we study robust deep learning against abnormal training data from the perspective of example weighting built into empirical loss functions, i.e., gradient magnitude with respect to logits, an angle that has not been thoroughly studied so far.
Ranked #33 on Image Classification on Clothing1M (using extra training data)
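The "gradient magnitude with respect to logits" view can be made concrete: for softmax cross-entropy, the gradient w.r.t. the logits is p - y, so its norm acts as an implicit per-example weight. The sketch below illustrates that standard fact; the function names and numbers are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def example_weight(logits, label):
    """Norm of the softmax cross-entropy gradient w.r.t. the logits,
    i.e. ||p - y||, read as the implicit weight of the example."""
    p = softmax(np.asarray(logits, dtype=float))
    y = np.zeros_like(p)
    y[label] = 1.0
    return np.linalg.norm(p - y)

w_clean = example_weight([10.0, 0.0, 0.0], 0)  # confident, correct: tiny gradient
w_noisy = example_weight([10.0, 0.0, 0.0], 1)  # confident, wrong label: large gradient
```

Under this lens, abnormal (e.g. mislabelled) examples that the model confidently contradicts receive large implicit weights, which is exactly what robust weighting schemes seek to control.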
no code implementations • 27 Mar 2019 • Alessandro Borgia, Yang Hua, Elyor Kodirov, Neil M. Robertson
Video-based person re-identification deals with the inherent difficulty of matching unregulated sequences with different lengths and with incomplete target pose/viewpoint structure.
2 code implementations • CVPR 2019 • Xinshao Wang, Yang Hua, Elyor Kodirov, Neil M. Robertson
To address this, we propose to build a set-based similarity structure by exploiting all instances in the gallery.
no code implementations • 22 Nov 2018 • Soumya Shubhra Ghosh, Yang Hua, Sankha Subhra Mukherjee, Neil Robertson
Despite breakthroughs in image enhancement quality, an end-to-end solution for simultaneously recovering finer texture details and sharpness in degraded low-resolution images remains unsolved.
3 code implementations • 4 Nov 2018 • Xinshao Wang, Yang Hua, Elyor Kodirov, Guosheng Hu, Neil M. Robertson
Therefore, we propose a novel sample mining method, called Online Soft Mining (OSM), which assigns one continuous score to each sample to make use of all samples in the mini-batch.
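A continuous per-sample score of the kind OSM assigns could look like the following sketch; the Gaussian-of-distance form and the parameter `sigma` are assumptions for illustration, not the paper's actual scoring function.

```python
import numpy as np

def soft_mining_scores(distances, sigma=1.0):
    """Continuous score in (0, 1] per sample: closer positive pairs get
    higher scores, but no sample in the mini-batch is discarded outright."""
    d = np.asarray(distances, dtype=float)
    return np.exp(-d ** 2 / sigma)

scores = soft_mining_scores([0.0, 1.0, 2.0])  # monotonically decreasing in distance
```

The key contrast with hard mining is visible here: every sample keeps a nonzero score, so all of them contribute to the loss.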
no code implementations • ECCV 2018 • Guosheng Hu, Li Liu, Yang Yuan, Zehao Yu, Yang Hua, Zhihong Zhang, Fumin Shen, Ling Shao, Timothy Hospedales, Neil Robertson, Yongxin Yang
To advance subtle expression recognition, we contribute a Large-scale Subtle Emotions and Mental States in the Wild database (LSEMSW).
no code implementations • ICCV 2017 • Guosheng Hu, Yang Hua, Yang Yuan, Zhihong Zhang, Zheng Lu, Sankha S. Mukherjee, Timothy M. Hospedales, Neil M. Robertson, Yongxin Yang
To solve this problem, we establish a theoretical equivalence between tensor optimisation and a two-stream gated neural network.
no code implementations • ICCV 2015 • Yang Hua, Karteek Alahari, Cordelia Schmid
Tracking-by-detection approaches are some of the most successful object trackers in recent years.