no code implementations • 25 Jan 2024 • John A. Miller, Mohammed Aldosari, Farah Saeed, Nasid Habib Barna, Subas Rana, I. Budak Arpinar, Ninghao Liu
Furthermore, there is a vast amount of knowledge available that deep learning models can tap into, including Knowledge Graphs and Large Language Models fine-tuned with scientific domain knowledge.
1 code implementation • 19 May 2022 • Mohammadreza Iman, John A. Miller, Khaled Rasheed, Robert M. Branch, Hamid R. Arabnia
Deep transfer learning techniques try to tackle the limitations of deep learning, namely its dependence on extensive training data and its high training costs, by reusing previously obtained knowledge.
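The core idea of reusing obtained knowledge can be illustrated with a toy sketch (this is an illustrative example, not the paper's method or models): fit a linear model on a data-rich source task, then use its weights to warm-start fine-tuning on a related target task that has only a few samples.

```python
import numpy as np

# Illustrative transfer-learning sketch (toy linear models, not the
# paper's deep networks): warm-start a target task from source weights.
rng = np.random.default_rng(0)

# Source task: plenty of data, fit by ordinary least squares.
X_src = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y_src = X_src @ w_true + 0.1 * rng.normal(size=200)
w_src, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)

# Target task: related weights, but only a handful of samples.
X_tgt = rng.normal(size=(10, 5))
y_tgt = X_tgt @ (w_true + 0.1) + 0.1 * rng.normal(size=10)

# Fine-tune: start from w_src and take a few gradient steps on the
# target loss, instead of fitting from scratch on the tiny target set.
w = w_src.copy()
for _ in range(50):
    grad = X_tgt.T @ (X_tgt @ w - y_tgt) / len(y_tgt)  # d(MSE)/dw
    w -= 0.1 * grad
```

The warm start matters exactly when target data is scarce: the source solution already encodes most of the shared structure, so only a small correction has to be learned.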
1 code implementation • 2 Mar 2021 • Indrajeet Y. Javeri, Mohammadhossein Toutiaee, Ismailcem B. Arpinar, Tom W. Miller, John A. Miller
However, advances in machine learning research indicate that neural networks can be powerful data modeling techniques, as they can achieve higher accuracy on a wide range of learning problems and datasets.
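As a minimal illustration of neural networks as data models (a toy one-hidden-layer regression in plain numpy; the paper's actual time-series models are not reproduced here), gradient descent can fit a nonlinear function that a linear model cannot:

```python
import numpy as np

# Toy sketch, not the paper's models: a one-hidden-layer network
# trained by full-batch gradient descent on a nonlinear 1-D target.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(128, 1))
y = np.sin(3 * X)                         # nonlinear target function

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    return h, h @ W2 + b2                 # prediction

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)         # loss at initialization

lr = 0.1
for _ in range(500):
    h, pred = forward(X)
    g = 2 * (pred - y) / len(X)           # dL/dpred
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)          # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = np.mean((pred1 - y) ** 2)         # loss after training
```

The same mechanics (a parameterized nonlinear map plus gradient-based fitting) are what give neural networks their flexibility across learning problems.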
no code implementations • 29 Apr 2020 • Mohammadhossein Toutiaee, Abbas Keshavarzi, Abolfazl Farahani, John A. Miller
We propose a novel application of Transfer Learning to classify video-frame sequences over multiple classes.
no code implementations • 29 Apr 2020 • Mohammadhossein Toutiaee, Soheyla Amirian, John A. Miller, Sheng Li
The proposed approach aids in labeling new data (fictitious output images) by minimizing a penalized version of the least-squares cost function between realistic pictures and target pictures.
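A generic sketch of a penalized least-squares objective (this uses a ridge penalty on toy vectors purely for illustration; the paper's actual cost function and penalty are not reproduced here): minimize ||Xw - y||^2 + lam * ||w||^2, which has the closed form w = (X^T X + lam * I)^(-1) X^T y.

```python
import numpy as np

# Generic penalized least squares (ridge) sketch on toy data;
# NOT the paper's objective between realistic and target pictures.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))                             # toy features
y = X @ rng.normal(size=8) + 0.05 * rng.normal(size=50)  # toy targets

lam = 0.5                                  # penalty strength
d = X.shape[1]
# Closed-form minimizer of ||Xw - y||^2 + lam * ||w||^2:
w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# For comparison: the unpenalized least-squares solution.
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The penalty term shrinks the solution toward zero relative to plain least squares, trading a little bias for stability, which is the usual reason to penalize a least-squares cost.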
no code implementations • 28 Dec 2015 • Khalifeh AlJadda, Rene Ranzinger, Melody Porterfield, Brent Weatherly, Mohammed Korayem, John A. Miller, Khaled Rasheed, Krys J. Kochut, William S. York
The first is a free, semi-automated MSn data interpreter called the Glycomic Elucidation and Annotation Tool (GELATO).
no code implementations • 28 Dec 2015 • Khalifeh AlJadda, Mohammed Korayem, Camilo Ortiz, Trey Grainger, John A. Miller, Khaled Rasheed, Krys J. Kochut, William S. York, Rene Ranzinger, Melody Porterfield
In this paper we introduce an extension to Bayesian Networks to handle massive sets of hierarchical data in a reasonable amount of time and space.
no code implementations • 21 Jul 2014 • Khalifeh AlJadda, Mohammed Korayem, Camilo Ortiz, Trey Grainger, John A. Miller, William S. York
When modeling this kind of hierarchical data across large data sets, Bayesian networks become infeasible for representing the probability distributions, for the following reasons:
i) each level represents a single random variable with hundreds of thousands of values;
ii) the number of levels is usually small, so there are also few random variables; and
iii) the structure of the network is predefined, since the dependency is modeled top-down from each parent to each of its child nodes, so the network would contain a single linear path for the random variables from each parent to each child node.
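The single-linear-path structure can be sketched as follows (the hierarchy and probabilities below are hypothetical toy values, not the paper's data): each level is one random variable, dependencies run top-down parent to child, and the joint probability is the product along the path.

```python
# Toy 3-level hierarchy with a single parent -> child chain:
# P(country), P(state | country), P(city | state).
# All names and numbers are made up for illustration.
p_country = {"US": 0.6, "CA": 0.4}
p_state_given_country = {
    "US": {"GA": 0.7, "NY": 0.3},
    "CA": {"ON": 1.0},
}
p_city_given_state = {
    "GA": {"Athens": 0.5, "Atlanta": 0.5},
    "NY": {"NYC": 1.0},
    "ON": {"Toronto": 1.0},
}

def joint(country, state, city):
    """Joint probability along the single parent -> child path."""
    return (p_country[country]
            * p_state_given_country[country][state]
            * p_city_given_state[state][city])

prob = joint("US", "GA", "Athens")   # = 0.6 * 0.7 * 0.5
```

The difficulty the snippet above describes is one of scale, not structure: with hundreds of thousands of values per level, each conditional table becomes enormous even though the chain itself stays this simple.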