1 code implementation • 24 Apr 2024 • Zhixiong Yang, Jingyuan Xia, Shengxi Li, Xinghua Huang, Shuanghui Zhang, Zhen Liu, Yaowen Fu, Yongxiang Liu
This paper proposes an unsupervised kernel estimation model, named dynamic kernel prior (DKP), to realise an unsupervised, pre-training-free learning-based algorithm for solving the blind super-resolution (BSR) problem.
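As background (this is not the paper's code), the classical BSR degradation model that any kernel prior must account for is "blur with an unknown kernel, then downsample". A minimal sketch, where the isotropic Gaussian kernel, its size/sigma, and the scale factor are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(size=9, sigma=1.5):
    """Isotropic Gaussian blur kernel -- one common parametric kernel family in BSR."""
    ax = np.arange(size) - size // 2
    k = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def degrade(hr, kernel, scale=2):
    """Classical BSR forward model: convolve the HR image with the kernel, then downsample."""
    size = kernel.shape[0]
    pad = size // 2
    padded = np.pad(hr, pad, mode="edge")
    blurred = np.zeros_like(hr)
    h, w = hr.shape
    for i in range(h):
        for j in range(w):
            blurred[i, j] = (padded[i:i + size, j:j + size] * kernel).sum()
    return blurred[::scale, ::scale]

hr = rng.random((32, 32))       # toy high-resolution image
lr = degrade(hr, gaussian_kernel(), scale=2)
print(lr.shape)                 # (16, 16)
```

In BSR the kernel is unknown, so estimating it (here, the role of the dynamic kernel prior) is the crux of the problem.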
no code implementations • 27 Feb 2024 • Qunliang Xing, Mai Xu, Shengxi Li, Xin Deng, Meisong Zheng, Huaida Liu, Ying Chen
However, these methods exhibit a pervasive enhancement bias towards the compression domain, inadvertently regarding it as more realistic than the raw domain.
1 code implementation • ICCV 2023 • Shengxi Li, Jialu Zhang, Yifei Li, Mai Xu, Xin Deng, Li Li
The emergence of conditional generative adversarial networks (cGANs) has revolutionised the way we approach and control generation, by adversarially learning joint distributions of data and auxiliary information.
no code implementations • 16 Oct 2022 • Shengxi Li, Xinyi Zhao, Ljubisa Stankovic, Danilo Mandic
The success of convolutional neural networks (CNNs) has been revolutionising the way we approach and use intelligent machines in the Big Data era.
1 code implementation • CVPR 2022 • Lai Jiang, Yifei Li, Shengxi Li, Mai Xu, Se Lei, Yichen Guo, Bo Huang
E-commerce images play a central role in attracting people's attention in online retailing and shopping, and accurate attention prediction is of significant importance for both customers and retailers; however, research on this topic is yet to start.
Ranked #2 on Saliency Prediction on SALECI
1 code implementation • 18 Nov 2021 • Li Yang, Mai Xu, Shengxi Li, Yichen Guo, Zulin Wang
When assessing the quality of 360° video, humans tend to perceive quality degradation progressively, from the viewport-based spatial distortion of each spherical frame, to motion artifacts across adjacent frames, and finally to the video-level quality score, i.e., a progressive quality assessment paradigm.
no code implementations • 14 Mar 2021 • Shengxi Li, Danilo Mandic
A large class of modern probabilistic learning systems assumes symmetric distributions; however, real-world data tend to obey skewed distributions and are thus not always adequately modelled by symmetric ones.
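The mismatch is easy to see numerically: a symmetric model forces the third standardised moment (skewness) to zero, while skewed real-world data do not. A small illustrative check, with the exponential distribution standing in for skewed data:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_skewness(x):
    """Standardised third moment: zero in expectation for symmetric data."""
    x = np.asarray(x, dtype=float)
    return np.mean(((x - x.mean()) / x.std()) ** 3)

symmetric = rng.normal(size=100_000)        # symmetric: skewness ~ 0
skewed = rng.exponential(size=100_000)      # right-skewed: skewness ~ 2

# A symmetric model can match the first two moments of the exponential sample,
# but its skewness is structurally zero, so the asymmetry is never captured.
print(f"Gaussian skewness:    {sample_skewness(symmetric):+.3f}")
print(f"Exponential skewness: {sample_skewness(skewed):+.3f}")
```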
1 code implementation • 9 Sep 2020 • Jingyuan Xia, Shengxi Li, Jun-Jie Huang, Imad Jaimoukha, Deniz Gunduz
In this paper, we propose a novel solution for non-convex problems of multiple variables, especially those typically solved by an alternating minimisation (AM) strategy, which splits the original optimisation problem into a set of sub-problems, one per variable, and then iteratively optimises each sub-problem using a fixed updating rule.
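The AM strategy being improved upon can be illustrated on a standard textbook instance (not the paper's problem): low-rank matrix factorisation, which is jointly non-convex in (U, V) but convex in each block, so AM alternates two closed-form least-squares updates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-block non-convex problem: min_{U,V} ||M - U @ V.T||_F^2.
# Non-convex jointly, but convex in U for fixed V and vice versa, so AM
# alternates the two least-squares sub-problems with fixed updating rules.
m, n, r = 20, 15, 3
M = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))  # exactly rank-r target

U = rng.normal(size=(m, r))
V = rng.normal(size=(n, r))
for _ in range(50):
    # Sub-problem in U (V fixed): closed-form least-squares update.
    U = M @ V @ np.linalg.inv(V.T @ V)
    # Sub-problem in V (U fixed): closed-form least-squares update.
    V = M.T @ U @ np.linalg.inv(U.T @ U)

residual = np.linalg.norm(M - U @ V.T)
print(f"final residual: {residual:.2e}")
```

The fixed updating rule here is the exact least-squares solve; the paper's contribution concerns replacing such fixed rules, which this baseline does not show.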
1 code implementation • NeurIPS 2020 • Shengxi Li, Zeyang Yu, Min Xiang, Danilo Mandic
For rigour, we first establish the physical meaning of the phase and amplitude in CF, and show that this provides a feasible way of balancing the accuracy and diversity of generation.
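The phase/amplitude decomposition of the characteristic function (CF) can be sketched from samples. For a Gaussian N(mu, s^2), the CF is phi(t) = exp(i*mu*t - s^2*t^2/2): the amplitude encodes spread (diversity) and the phase encodes location (accuracy). A minimal empirical check (illustrative only, not the paper's training objective):

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_cf(x, t):
    """Empirical characteristic function phi(t) = mean of exp(i*t*x) over samples x."""
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

x = rng.normal(loc=1.0, scale=1.0, size=50_000)  # N(1, 1) samples
t = np.linspace(-3.0, 3.0, 61)

phi = empirical_cf(x, t)
amplitude = np.abs(phi)   # even in t; exp(-s^2 t^2 / 2) for a Gaussian
phase = np.angle(phi)     # odd in t; mu * t for a Gaussian

i0 = np.argmin(np.abs(t))          # index of t ~ 0
i1 = np.argmin(np.abs(t - 1.0))    # index of t ~ 1
print(amplitude[i0], phase[i1])    # phi(0) = 1 always; phase at t = 1 ~ mu = 1
```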
no code implementations • 2 Jan 2020 • Ljubisa Stankovic, Danilo Mandic, Milos Dakovic, Milos Brajovic, Bruno Scalzo, Shengxi Li, Anthony G. Constantinides
Many modern data analytics applications on graphs operate on domains where graph topology is not known a priori, and hence its determination becomes part of the problem definition, rather than serving as prior knowledge which aids the problem solution.
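A simple baseline for learning an unknown topology from vertex observations (a naive correlation-threshold estimate, not the paper's method; the threshold and toy signal model are assumptions) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# When the topology is unknown, a data-driven estimate can connect vertices
# whose observed signals are strongly correlated.
n_obs = 5_000
# Ground-truth structure: vertices 0-2 share one latent factor, 3-4 another.
z1 = rng.normal(size=n_obs)
z2 = rng.normal(size=n_obs)
signals = np.stack([
    z1 + 0.3 * rng.normal(size=n_obs),
    z1 + 0.3 * rng.normal(size=n_obs),
    z1 + 0.3 * rng.normal(size=n_obs),
    z2 + 0.3 * rng.normal(size=n_obs),
    z2 + 0.3 * rng.normal(size=n_obs),
])

corr = np.corrcoef(signals)                  # rows are vertices
adjacency = (np.abs(corr) > 0.5).astype(int) # threshold into edges
np.fill_diagonal(adjacency, 0)               # no self-loops
print(adjacency)
```

The recovered adjacency links vertices within each latent group and leaves the groups disconnected, i.e., the topology becomes an output of the analysis rather than a prior input.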
1 code implementation • 9 Jun 2019 • Shengxi Li, Zeyang Yu, Min Xiang, Danilo Mandic
To alleviate this issue, we introduce an efficient optimisation method on a statistical manifold defined under an approximate Wasserstein distance, which allows for explicit metrics and computable operations, thus significantly stabilising and improving the EMM estimation.
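To see why a Wasserstein-type distance can admit explicit metrics at all, recall the closed form between one-dimensional Gaussians (a standard result, shown here purely as illustration of "explicit and computable", not as the paper's construction):

```python
import numpy as np

def w2_gaussian_1d(mu1, s1, mu2, s2):
    """Closed-form 2-Wasserstein distance between N(mu1, s1^2) and N(mu2, s2^2):
    W2^2 = (mu1 - mu2)^2 + (s1 - s2)^2."""
    return np.sqrt((mu1 - mu2) ** 2 + (s1 - s2) ** 2)

# Unlike KL divergence, W2 stays finite and well-behaved even for distributions
# with essentially disjoint support, which makes it attractive as a metric on a
# statistical manifold of mixture components.
print(w2_gaussian_1d(0.0, 1.0, 5.0, 1.0))  # pure translation -> distance 5.0
print(w2_gaussian_1d(0.0, 1.0, 0.0, 3.0))  # pure scaling -> distance 2.0
```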
no code implementations • 5 Mar 2019 • Zeyang Yu, Shengxi Li, Danilo Mandic
To resolve this issue, we design a new cost function capable of controlling the balance between the phase and amplitude contributions to the solution.
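The general idea of such a balance can be sketched with a hypothetical weighted cost on complex-valued signals (this exact form and the weight `alpha` are illustrative assumptions, not the paper's cost function):

```python
import numpy as np

def balanced_cost(y_hat, y, alpha=0.5):
    """Hypothetical weighted combination of amplitude error and phase error
    between complex estimate y_hat and target y; alpha in [0, 1] shifts the
    emphasis between the two components."""
    amp_err = np.mean((np.abs(y_hat) - np.abs(y)) ** 2)
    # Phase mismatch via the cosine of the angle difference (amplitude-blind,
    # and insensitive to 2*pi wrapping).
    phase_err = np.mean(1.0 - np.cos(np.angle(y_hat) - np.angle(y)))
    return alpha * amp_err + (1.0 - alpha) * phase_err

y = np.exp(1j * np.linspace(0, np.pi, 8))   # unit-amplitude target
amp_only = 2.0 * y                          # correct phase, wrong amplitude
phase_only = y * np.exp(1j * 0.5)           # correct amplitude, phase offset 0.5

# alpha -> 1 penalises only the amplitude error; alpha -> 0 only the phase error.
print(balanced_cost(amp_only, y, alpha=1.0))    # amplitude error (2 - 1)^2 = 1
print(balanced_cost(phase_only, y, alpha=0.0))  # 1 - cos(0.5)
```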
no code implementations • 21 May 2018 • Shengxi Li, Zeyang Yu, Danilo Mandic
Mixture modelling using elliptical distributions promises enhanced robustness, flexibility and stability over the widely employed Gaussian mixture model (GMM).
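For reference, the GMM baseline that the elliptical mixture model generalises can be fitted with a minimal EM loop (standard textbook EM on toy one-dimensional data, not the paper's EMM estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated Gaussian clusters as toy data.
x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])

mu = np.array([-1.0, 1.0])    # initial means
var = np.array([1.0, 1.0])    # initial variances
pi = np.array([0.5, 0.5])     # initial mixing weights

for _ in range(100):
    # E-step: posterior responsibility of each component for each sample.
    dens = pi / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted maximum-likelihood updates.
    nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2) .sum(axis=0) / nk
    pi = nk / len(x)

print(np.sort(mu))  # recovers roughly [-2, 3]
```

The EMM replaces the Gaussian component densities with elliptical ones (e.g. heavier-tailed families), which is where the claimed robustness over the GMM comes from.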