no code implementations • 23 Jan 2024 • Scout Jarman, Zigfried Hampel-Arias, Adra Carr, Kevin R. Moon
Deep learning identification models have shown promise for identifying gas plumes in Longwave IR hyperspectral images of urban scenes, particularly when a large library of gases is being considered.
no code implementations • 23 Oct 2022 • Andres F. Duque, Myriam Lizotte, Guy Wolf, Kevin R. Moon
With this in mind, we present a novel manifold alignment method called MALI (Manifold alignment with label information) that learns a correspondence between two distinct domains.
no code implementations • 15 Jun 2022 • Andres F. Duque, Guy Wolf, Kevin R. Moon
The integration of multimodal data presents a challenge when the study of a given phenomenon by different instruments or under different conditions generates distinct but related domains.
3 code implementations • 29 Jan 2022 • Jake S. Rhodes, Adele Cutler, Kevin R. Moon
Random forests are considered one of the best out-of-the-box classification and regression algorithms due to their high level of predictive performance with relatively little tuning.
no code implementations • 22 Oct 2020 • Teresa White, Jesse Wheeler, Colton Lindstrom, Randall Christensen, Kevin R. Moon
This paper presents a method for determining the navigation errors present at the beginning of a GPS-denied period utilizing data from a synthetic aperture radar (SAR) system.
no code implementations • 14 Jul 2020 • Andrés F. Duque, Sacha Morin, Guy Wolf, Kevin R. Moon
Our regularization, based on the diffusion potential distances from the recently proposed PHATE visualization method, encourages the learned latent representation to follow intrinsic data geometry, similar to manifold learning algorithms, while still enabling faithful extension to new data and reconstruction of data in the original feature space from latent coordinates.
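The regularization described above pairs a reconstruction objective with a penalty that pulls pairwise latent distances toward precomputed diffusion-potential distances. Below is a minimal NumPy sketch of that idea; the function name, the squared-error form of the penalty, and the `lam` weight are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def geometry_regularized_loss(x, x_hat, z, potential_dists, lam=1.0):
    """Reconstruction error plus a geometry-matching penalty.

    Hedged sketch: `potential_dists` is assumed to be a precomputed
    (n, n) matrix of diffusion-potential distances; the penalty pushes
    pairwise Euclidean distances in the latent space `z` toward it.
    """
    recon = np.mean((x - x_hat) ** 2)
    # pairwise Euclidean distances between latent coordinates
    diff = z[:, None, :] - z[None, :, :]
    latent_d = np.sqrt((diff ** 2).sum(-1))
    geo = np.mean((latent_d - potential_dists) ** 2)
    return recon + lam * geo
```

In practice this scalar would be minimized with an autodiff framework; the sketch only shows how the two terms combine.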
no code implementations • 15 Jun 2020 • Jake S. Rhodes, Adele Cutler, Guy Wolf, Kevin R. Moon
We show, both qualitatively and quantitatively, the advantages of our approach in retaining local and global structures in data, while emphasizing important variables in the low-dimensional embedding.
1 code implementation • 10 Jul 2019 • Nathan Brugnone, Alex Gonopolskiy, Mark W. Moyle, Manik Kuchroo, David van Dijk, Kevin R. Moon, Daniel Colon-Ramos, Guy Wolf, Matthew J. Hirn, Smita Krishnaswamy
Here, we consider multiple levels of abstraction via a multiresolution geometry of data points at different granularities.
no code implementations • 25 Jun 2019 • Andrés F. Duque, Guy Wolf, Kevin R. Moon
Manifold learning techniques for dynamical systems and time series have shown their utility for a broad spectrum of applications in recent years.
no code implementations • 1 Oct 2018 • Salimeh Yasaei Sekeh, Morteza Noshad, Kevin R. Moon, Alfred O. Hero
We derive a bound on the convergence rate for the Friedman-Rafsky (FR) estimator of the HP-divergence, which is related to a multivariate runs statistic for testing between two distributions.
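The FR runs statistic mentioned above counts edges of the Euclidean minimum spanning tree of the pooled sample that join points from different samples. A minimal sketch, assuming Euclidean distances and SciPy's MST routine (the paper's bias-corrected HP-divergence estimate applies further constants to this count):

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def friedman_rafsky_runs(X, Y):
    """Count cross-sample edges in the Euclidean MST of the pooled data.

    Fewer cross edges than expected under mixing suggests the two
    samples come from different distributions (larger HP-divergence).
    """
    Z = np.vstack([X, Y])
    labels = np.concatenate([np.zeros(len(X)), np.ones(len(Y))])
    D = cdist(Z, Z)                  # dense pairwise distance matrix
    mst = minimum_spanning_tree(D)   # sparse matrix of MST edges
    rows, cols = mst.nonzero()
    # MST edges whose endpoints come from different samples
    return int(np.sum(labels[rows] != labels[cols]))
```

Well-separated samples yield very few cross edges, while samples from the same distribution mix freely in the tree.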
no code implementations • 27 Sep 2018 • Scott Gigante, David van Dijk, Kevin R. Moon, Alexander Strzalkowski, Katie Ferguson, Guy Wolf, Smita Krishnaswamy
DyMoN is well-suited to the idiosyncrasies of biological data, including noise, sparsity, and the lack of longitudinal measurements in many types of systems.
no code implementations • 17 Feb 2017 • Morteza Noshad, Kevin R. Moon, Salimeh Yasaei Sekeh, Alfred O. Hero III
Considering the $k$-nearest neighbor ($k$-NN) graph of $Y$ in the joint data set $(X, Y)$, we show that the average powered ratio of the number of $X$ points to the number of $Y$ points among all $k$-NN points is proportional to the Rényi divergence of the $X$ and $Y$ densities.
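The quantity described above can be sketched directly: for each $Y$ point, look at its $k$ nearest neighbors in the pooled sample, form the (size-corrected) ratio of $X$-neighbors to $Y$-neighbors, raise it to a power, and average. This is a hedged illustration of the idea only; the paper's estimator includes bias corrections and constants not shown here, and the function name and defaults are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_ratio_statistic(X, Y, k=10, alpha=2.0):
    """Average powered ratio of X- to Y-neighbors among the k-NN
    of each Y point within the pooled sample (X, Y)."""
    Z = np.vstack([X, Y])
    labels = np.concatenate([np.ones(len(X)), np.zeros(len(Y))])
    tree = cKDTree(Z)
    # k+1 neighbors so we can drop each query point itself
    _, idx = tree.query(Y, k=k + 1)
    neigh = idx[:, 1:]
    n_x = labels[neigh].sum(axis=1)      # X points among the k-NN
    n_y = k - n_x                        # Y points among the k-NN
    # size-corrected ratio; clamp n_y to avoid division by zero
    eta = (len(Y) / len(X)) * (n_x / np.maximum(n_y, 1))
    return float(np.mean(eta ** (alpha - 1)))
```

When $X$ and $Y$ share a distribution the neighborhoods mix and the statistic sits near 1; when the densities differ it moves away from 1.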
no code implementations • 13 Sep 2016 • Kevin R. Moon, Morteza Noshad, Salimeh Yasaei Sekeh, Alfred O. Hero III
Information theoretic measures (e.g., the Kullback-Leibler divergence and Shannon mutual information) have been used for exploring possibly nonlinear multivariate dependencies in high dimensions.
no code implementations • 13 Oct 2015 • Stephen V. Gliske, Kevin R. Moon, William C. Stacey, Alfred O. Hero III
High frequency oscillations (HFOs) are a promising biomarker of epileptic brain tissue and activity.
no code implementations • 27 Apr 2015 • Kevin R. Moon, Veronique Delouille, Alfred O. Hero III
For example, the Bayes error rate of a given feature space, if known, can be used to aid in choosing a classifier, as well as in feature selection and model selection for the base classifiers and the meta classifier.
no code implementations • 10 Apr 2015 • Kevin R. Moon, Veronique Delouille, Jimmy J. Li, Ruben De Visscher, Fraser Watson, Alfred O. Hero III
We also find that including data focused on the neutral line of an active region can result in an increased correspondence between our clustering results and other active region descriptors such as the Mount Wilson classifications and the $R$ value.
no code implementations • 13 Mar 2015 • Kevin R. Moon, Jimmy J. Li, Veronique Delouille, Ruben De Visscher, Fraser Watson, Alfred O. Hero III
We find the relationship between complexity of an active region as measured by Mount Wilson and the intrinsic dimension of its image patches.
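One common proxy for the intrinsic dimension of image patches is the number of principal components needed to capture most of their variance. The sketch below is a hedged illustration of that generic approach; the paper may use a different intrinsic-dimension estimator, and the function name and 95% threshold are assumptions.

```python
import numpy as np

def pca_intrinsic_dimension(patches, var_threshold=0.95):
    """Rough intrinsic-dimension proxy: components needed to explain
    `var_threshold` of the variance of flattened image patches."""
    X = patches - patches.mean(axis=0)
    # singular values of the centered data give component variances
    s = np.linalg.svd(X, compute_uv=False)
    var = s ** 2 / np.sum(s ** 2)
    return int(np.searchsorted(np.cumsum(var), var_threshold) + 1)
```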
no code implementations • 7 Nov 2014 • Kevin R. Moon, Alfred O. Hero III
The problem of f-divergence estimation is important in the fields of machine learning, information theory, and statistics.
no code implementations • 24 Jun 2014 • Kevin R. Moon, Jimmy J. Li, Veronique Delouille, Fraser Watson, Alfred O. Hero III
Sunspots, as seen in white light or continuum images, are associated with regions of high magnetic activity on the Sun, visible on magnetogram images.