no code implementations • 25 Nov 2021 • Xingzi Xu, Ali Hasan, Khalil Elkhalil, Jie Ding, Vahid Tarokh
While NODEs model the evolution of latent variables as the solution to an ODE, C-NODE models it as the solution of a family of first-order quasi-linear partial differential equations (PDEs) along curves, referred to as characteristic curves, on which the PDEs reduce to ODEs.
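The reduction the abstract refers to is the classical method of characteristics; a minimal sketch of that standard construction (not the paper's specific C-NODE parameterization) for a scalar quasi-linear PDE:

```latex
% First-order quasi-linear PDE in one spatial dimension:
%   u_t + a(x, t, u)\, u_x = c(x, t, u).
% Along a characteristic curve x(t) solving the ODE
\frac{dx}{dt} = a\big(x(t), t, u(x(t), t)\big),
% the PDE reduces to an ODE for z(t) = u(x(t), t):
\frac{dz}{dt} = u_t + u_x \frac{dx}{dt} = c\big(x(t), t, z(t)\big).
```

Solving the coupled ODE system for $(x(t), z(t))$ recovers the PDE solution along each curve.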
1 code implementation • 22 Feb 2021 • Yuting Ng, Ali Hasan, Khalil Elkhalil, Vahid Tarokh
We propose a new generative modeling technique for learning multidimensional cumulative distribution functions (CDFs) in the form of copulas.
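For context, a closed-form example of the object being learned: by Sklar's theorem a joint CDF factors as $F(x, y) = C(F_X(x), F_Y(y))$ for some copula $C$. A minimal sketch using the classical bivariate Gaussian copula (an illustrative baseline, not the paper's learned copula; the function name and `rho` parameter are mine):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula_cdf(u, v, rho=0.5):
    """Bivariate Gaussian copula C(u, v) = Phi_rho(Phi^-1(u), Phi^-1(v)),
    where Phi_rho is the standard bivariate normal CDF with correlation rho."""
    z = np.array([norm.ppf(u), norm.ppf(v)])
    cov = np.array([[1.0, rho], [rho, 1.0]])
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(z)
```

At `rho=0` the copula reduces to the independence copula `C(u, v) = u * v`.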
no code implementations • 17 Feb 2021 • Ali Hasan, Khalil Elkhalil, Yuting Ng, Joao M. Pereira, Sina Farsiu, Jose H. Blanchet, Vahid Tarokh
We propose a novel neural network architecture that enables non-parametric calibration and generation of multivariate extreme value distributions (MEVs).
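A concrete example of the parametric family such a network generalizes: the bivariate logistic max-stable CDF with unit Fréchet margins, a standard closed-form MEV model (shown here as background; the function name and `alpha` parameter are illustrative, not the paper's architecture):

```python
import numpy as np

def logistic_mev_cdf(z1, z2, alpha=0.5):
    """Bivariate logistic max-stable CDF with unit Frechet margins:
    G(z1, z2) = exp(-((1/z1)^(1/alpha) + (1/z2)^(1/alpha))^alpha),
    with 0 < alpha <= 1; alpha = 1 gives independent margins."""
    v = ((1.0 / z1) ** (1.0 / alpha) + (1.0 / z2) ** (1.0 / alpha)) ** alpha
    return np.exp(-v)
```

As `alpha -> 0` the components become completely dependent; at `alpha = 1` the CDF factors into the product of two unit Fréchet margins `exp(-1/z)`.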
no code implementations • 12 Jul 2020 • Khalil Elkhalil, Ali Hasan, Jie Ding, Sina Farsiu, Vahid Tarokh
It has been conjectured that the Fisher divergence is more robust to model uncertainty than the conventional Kullback-Leibler (KL) divergence.
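The Fisher divergence in question is $F(p\,\|\,q) = \mathbb{E}_p\big[\|\nabla_x \log p(x) - \nabla_x \log q(x)\|^2\big]$. A minimal Monte Carlo sketch for two 1-D Gaussians, whose score functions are available in closed form (function name and defaults are mine, for illustration only):

```python
import numpy as np

def fisher_divergence_mc(mu_p, mu_q, sigma_p=1.0, sigma_q=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of the Fisher divergence
    F(p || q) = E_p[(d/dx log p(x) - d/dx log q(x))^2]
    between N(mu_p, sigma_p^2) and N(mu_q, sigma_q^2)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu_p, sigma_p, size=n)
    score_p = -(x - mu_p) / sigma_p**2  # score of the sampling density p
    score_q = -(x - mu_q) / sigma_q**2  # score of the comparison density q
    return np.mean((score_p - score_q) ** 2)
```

When the variances are equal, the score difference is the constant `mu_p - mu_q`, so the divergence is exactly `(mu_p - mu_q)**2`.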
no code implementations • 11 Jun 2020 • Amine Bejaoui, Khalil Elkhalil, Abla Kammoun, Mohamed-Slim Alouini, Tareq Al-Naffouri

The use of quadratic discriminant analysis (QDA) or its regularized version (R-QDA) for classification is often not recommended, due to its well-documented high sensitivity to estimation noise in the covariance matrix.
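The regime described, where per-class covariance estimates are noisy because the sample size is small relative to the dimension, can be sketched with scikit-learn's off-the-shelf R-QDA (its `reg_param` shrinks each class covariance toward the identity); the synthetic data and parameter values below are illustrative, not from the paper:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two Gaussian classes with different means and covariances; only 60 samples
# per class in 20 dimensions, so plain QDA's covariance estimates are noisy.
n, d = 60, 20
X0 = rng.normal(0.0, 1.0, size=(n, d))
X1 = rng.normal(1.0, 2.0, size=(n, d))
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# reg_param in (0, 1] shrinks each class covariance toward the identity (R-QDA).
rqda = QuadraticDiscriminantAnalysis(reg_param=0.5).fit(X, y)
acc = rqda.score(X, y)
```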
no code implementations • 19 Apr 2019 • Khalil Elkhalil, Abla Kammoun, Xiangliang Zhang, Mohamed-Slim Alouini, Tareq Al-Naffouri
This paper carries out a large dimensional analysis of a variation of kernel ridge regression that we call \emph{centered kernel ridge regression} (CKRR), also known in the literature as kernel ridge regression with offset.
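A minimal numpy sketch of the estimator being analyzed, kernel ridge regression with an offset, implemented by double-centering the kernel matrix and solving the ridge system on centered targets; the function names, the RBF kernel choice, and the hyperparameters are my illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def ckrr_fit_predict(X, y, X_test, lam=1e-3, gamma=1.0):
    """Centered kernel ridge regression (KRR with offset): double-center the
    training kernel, ridge-solve on centered targets, add the mean back."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = H @ K @ H                           # double-centered kernel
    yc = y - y.mean()
    alpha = np.linalg.solve(Kc + n * lam * np.eye(n), yc)
    K_test = rbf_kernel(X_test, X, gamma)
    # Center the test kernel consistently with the training centering:
    K_test_c = (K_test - K_test.mean(axis=1, keepdims=True)
                - K.mean(axis=0) + K.mean())
    return K_test_c @ alpha + y.mean()       # offset = training-target mean
```

With a small ridge parameter the fit nearly interpolates smooth training data.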
1 code implementation • 1 Nov 2017 • Khalil Elkhalil, Abla Kammoun, Romain Couillet, Tareq Y. Al-Naffouri, Mohamed-Slim Alouini
This article carries out a large dimensional analysis of standard regularized discriminant analysis classifiers designed on the assumption that data arise from a Gaussian mixture model with different means and covariances.
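The classifier family under analysis assigns each point to the class with the largest regularized Gaussian discriminant score; a minimal numpy sketch of that rule (one common shrinkage form, $(1-\gamma)\hat\Sigma_k + \gamma I$; the function name and `gamma` default are illustrative, not the paper's exact regularization):

```python
import numpy as np

def rda_classify(X, means, covs, priors, gamma=0.1):
    """Regularized quadratic discriminant rule for a Gaussian mixture model:
    shrink each class covariance toward the identity, then assign each row
    of X to the class maximizing the Gaussian log-discriminant score."""
    d = X.shape[1]
    scores = []
    for mu, Sigma, pi in zip(means, covs, priors):
        S = (1.0 - gamma) * Sigma + gamma * np.eye(d)   # shrunk covariance
        S_inv = np.linalg.inv(S)
        _, logdet = np.linalg.slogdet(S)
        diff = X - mu
        quad = np.einsum("ij,jk,ik->i", diff, S_inv, diff)  # Mahalanobis terms
        scores.append(-0.5 * quad - 0.5 * logdet + np.log(pi))
    return np.argmax(np.stack(scores, axis=1), axis=1)
```

Points near a class mean (relative to that class's shrunk covariance) receive that class's label.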