no code implementations • 14 May 2021 • Ke Wang, Alexander Franks, Sang-Yun Oh
In this paper, we compare and contrast two strategies for inference in graphical models with latent confounders: Gaussian graphical models with latent variables (LVGGM) and PCA-based removal of confounding (PCA+GGM).
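The PCA+GGM strategy mentioned above can be illustrated with a short sketch: project out a few leading principal components (assumed to capture the latent confounding) and then fit a sparse Gaussian graphical model to the residuals. This is not the authors' implementation; the number of components `k` and the `GraphicalLasso` penalty `alpha` are illustrative choices, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
n, p, k = 200, 10, 2  # samples, variables, assumed number of latent confounders

# Synthetic data: independent noise plus a rank-k confounding component
X = rng.standard_normal((n, p)) + rng.standard_normal((n, k)) @ rng.standard_normal((k, p))

# Step 1 (PCA): remove the top-k principal components, assumed to absorb the confounders
pca = PCA(n_components=k)
scores = pca.fit_transform(X)
X_deconf = X - scores @ pca.components_ - pca.mean_ * 0  # residuals after deconfounding

# Step 2 (GGM): fit a sparse Gaussian graphical model on the deconfounded data
model = GraphicalLasso(alpha=0.1).fit(X_deconf)
precision = model.precision_  # nonzero off-diagonals encode the estimated conditional graph
```

In contrast, an LVGGM approach would estimate the sparse precision matrix and the low-rank confounding component jointly rather than in two stages.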
no code implementations • 25 Oct 2020 • Zhipu Zhou, Alexander Shkolnik, Sang-Yun Oh
Our results point to the possibility that most of the risk in equity markets may be explained by a sparse network of interacting assets (or their issuing firms).
1 code implementation • 7 Oct 2019 • Javier Zapata, Sang-Yun Oh, Alexander Petersen
Next, partial separability is shown to be particularly useful for defining a well-posed functional Gaussian graphical model that can be identified with a sequence of finite-dimensional graphical models, each of the same fixed dimension.
no code implementations • 23 Jul 2019 • Sang-Yun Oh, Hye-Jin S. Kim, Jongeun Lee, Junmo Kim
We introduce the Repetition-Reduction network (RRNet) for resource-constrained depth estimation, offering significantly improved efficiency in computation, memory, and energy consumption.
1 code implementation • 22 May 2019 • Pedro Cisneros-Velarde, Sang-Yun Oh, Alexander Petersen
As a consequence of this formulation, the radius of the Wasserstein ambiguity set is directly related to the regularization parameter in the estimation problem.
1 code implementation • 30 Oct 2017 • Penporn Koanantakool, Alnur Ali, Ariful Azad, Aydin Buluc, Dmitriy Morozov, Leonid Oliker, Katherine Yelick, Sang-Yun Oh
Across a variety of scientific disciplines, sparse inverse covariance estimation is a popular tool for capturing the underlying dependency relationships in multivariate data.
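As a small, self-contained illustration of sparse inverse covariance estimation (not the paper's distributed HP-CONCORD implementation): the estimated precision matrix can be converted to partial correlations, whose nonzero entries give the dependency graph. The toy data and the `alpha` penalty below are illustrative assumptions.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(1)
n = 500
# Toy data: x2 depends on x0; x1 is independent of both
x0 = rng.standard_normal(n)
x1 = rng.standard_normal(n)
x2 = x0 + 0.5 * rng.standard_normal(n)
X = np.column_stack([x0, x1, x2])

# Sparse inverse covariance (precision matrix) estimate via the graphical lasso
theta = GraphicalLasso(alpha=0.05).fit(X).precision_

# Partial correlations: rho_ij = -theta_ij / sqrt(theta_ii * theta_jj)
d = np.sqrt(np.diag(theta))
partial_corr = -theta / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)
```

Here `partial_corr[0, 2]` should be large (a true edge) while `partial_corr[0, 1]` is shrunk toward zero (no edge), which is the graph-recovery behavior the sparsity penalty is designed to produce.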
no code implementations • 28 Jan 2016 • Evan Racah, Seyoon Ko, Peter Sadowski, Wahid Bhimji, Craig Tull, Sang-Yun Oh, Pierre Baldi, Prabhat
Experiments in particle physics produce enormous quantities of data that must be analyzed and interpreted by teams of physicists.
no code implementations • NeurIPS 2014 • Sang-Yun Oh, Onkar Dalal, Kshitij Khare, Bala Rajaratnam
In direct contrast to the parallel work in the Gaussian setting, however, this new convex pseudo-likelihood framework has not leveraged the extensive array of methods that have been proposed in the machine learning literature for convex optimization.
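The kind of convex optimization machinery alluded to above can be sketched generically. Below is a minimal proximal-gradient (ISTA) routine for an ℓ1-penalized convex objective, shown on a lasso-type least-squares problem; this is a generic illustration, not the paper's algorithm or objective, and all names and parameter values are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (elementwise shrinkage)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(grad_f, x0, step, lam, n_iter=500):
    # Proximal gradient descent for min_x f(x) + lam * ||x||_1
    x = x0
    for _ in range(n_iter):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Illustrative problem: min 0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]  # sparse ground truth
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
x_hat = ista(lambda x: A.T @ (A @ x - b), np.zeros(20), step, lam=0.5)
```

The same proximal template applies to other ℓ1-penalized convex losses, which is why such methods are natural candidates for pseudo-likelihood objectives as well.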
no code implementations • 20 Jul 2013 • Kshitij Khare, Sang-Yun Oh, Bala Rajaratnam
Since none of the popular methods proposed for solving pseudo-likelihood-based objective functions has provable convergence guarantees, it is unclear whether the corresponding estimators exist or are even computable, or whether they actually yield correct partial correlation graphs.