Search Results for author: Theodore P. Pavlic

Found 10 papers, 7 papers with code

Learning Decomposable and Debiased Representations via Attribute-Centric Information Bottlenecks

no code implementations21 Mar 2024 Jinyung Hong, Eun Som Jeon, Changhoon Kim, Keun Hee Park, Utkarsh Nath, Yezhou Yang, Pavan Turaga, Theodore P. Pavlic

Biased attributes, spuriously correlated with target labels in a dataset, can lead neural networks to learn improper shortcuts for classification, limiting their capacity for out-of-distribution (OOD) generalization.

Attribute Representation Learning

Randomly Weighted Neuromodulation in Neural Networks Facilitates Learning of Manifolds Common Across Tasks

1 code implementation17 Nov 2023 Jinyung Hong, Theodore P. Pavlic

Geometric Sensitive Hashing functions, a family of Locality-Sensitive Hashing functions, are neural network models that learn class-specific manifold geometry in supervised learning.

Learning to Modulate Random Weights: Neuromodulation-inspired Neural Networks For Efficient Continual Learning

1 code implementation8 Apr 2022 Jinyung Hong, Theodore P. Pavlic

Existing Continual Learning (CL) approaches have focused on addressing catastrophic forgetting by leveraging regularization methods, replay buffers, and task-specific components.

Computational Efficiency Continual Learning +1

Representing Prior Knowledge Using Randomly, Weighted Feature Networks for Visual Relationship Detection

2 code implementations AAAI Workshop CLeaR 2022 Jinyung Hong, Theodore P. Pavlic

Furthermore, background knowledge represented by RWFNs can be used to alleviate the incompleteness of training sets even though the space complexity of RWFNs is much smaller than LTNs (1:27 ratio).

Predicate Detection Relational Reasoning +4

An Insect-Inspired Randomly, Weighted Neural Network with Random Fourier Features For Neuro-Symbolic Relational Learning

3 code implementations11 Sep 2021 Jinyung Hong, Theodore P. Pavlic

We demonstrate that, compared to LTNs, RWFNs can achieve better or similar performance for both object classification and detection of the part-of relations between objects in SII tasks, while using far fewer learnable parameters (1:62 ratio) and a faster learning process (1:2 ratio of running speed).

Decoder Relational Reasoning

Beyond Tracking: Using Deep Learning to Discover Novel Interactions in Biological Swarms

1 code implementation20 Aug 2021 Taeyeong Choi, Benjamin Pyenson, Juergen Liebig, Theodore P. Pavlic

Because the resulting predictive models are not based on human-understood predictors, we use explanatory modules (e.g., Grad-CAM) that combine information hidden in the latent variables of the deep-network model with the video data itself to communicate to a human observer which aspects of observed individual behaviors are most informative in predicting group behavior.
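The Grad-CAM weighting scheme mentioned above can be sketched in a few lines. This is a minimal NumPy illustration of the general technique (gradient-weighted sum of feature maps followed by a ReLU), not the paper's own code; the `grad_cam` function name and the toy inputs are illustrative assumptions.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap from convolutional feature maps and the
    gradients of a target score with respect to those maps.

    feature_maps: (K, H, W) activations A_k from a chosen conv layer
    gradients:    (K, H, W) d(score)/dA_k obtained by backpropagation
    """
    # Channel importance weights alpha_k: global-average-pool the gradients
    weights = gradients.mean(axis=(1, 2))                       # shape (K,)
    # Weighted sum over channels, then ReLU to keep positive evidence only
    cam = np.maximum((weights[:, None, None] * feature_maps).sum(axis=0), 0.0)
    # Normalize to [0, 1] for visualization (guard against an all-zero map)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with random activations and gradients
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 7, 7))    # 8 channels of 7x7 feature maps
dA = rng.standard_normal((8, 7, 7))   # gradients of the same shape
heatmap = grad_cam(A, dA)
print(heatmap.shape)  # (7, 7)
```

In practice the feature maps and gradients would come from hooks on a trained deep network; the heatmap is then upsampled and overlaid on the input video frame to highlight the behaviorally informative regions.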

KCNet: An Insect-Inspired Single-Hidden-Layer Neural Network with Randomized Binary Weights for Prediction and Classification Tasks

no code implementations17 Aug 2021 Jinyung Hong, Theodore P. Pavlic

Fruit flies are established model systems for studying olfactory learning, as they readily learn to associate odors with either electric shock or sugar rewards.

Data Augmentation Image Classification

Randomly Weighted, Untrained Neural Tensor Networks Achieve Greater Relational Expressiveness

no code implementations1 Jun 2020 Jinyung Hong, Theodore P. Pavlic

Neural Tensor Networks (NTNs), which are structured to encode the degree of relationship among pairs of entities, are used in Logic Tensor Networks (LTNs) to facilitate Statistical Relational Learning (SRL) in first-order logic.
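The bilinear scoring layer that lets an NTN encode the degree of relationship between a pair of entities can be sketched as follows. This is a generic illustration of the standard NTN form (score = u^T tanh(e1^T W e2 + V[e1; e2] + b)), not code from the paper; all names and dimensions are illustrative.

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score the relationship between entity embeddings e1 and e2 (dim d)
    with one Neural Tensor Network layer.

    W: (k, d, d) bilinear tensor, one d x d slice per hidden unit
    V: (k, 2d)   standard feed-forward weights on [e1; e2]
    b: (k,)      bias
    u: (k,)      output weights combining the k hidden units
    """
    # Bilinear term: e1^T W_k e2 for each tensor slice k
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)
    # Standard linear term on the concatenated embeddings
    linear = V @ np.concatenate([e1, e2]) + b
    return float(u @ np.tanh(bilinear + linear))

# Toy example with random parameters (d = 4 embedding dim, k = 3 slices)
d, k = 4, 3
rng = np.random.default_rng(1)
s = ntn_score(rng.standard_normal(d), rng.standard_normal(d),
              rng.standard_normal((k, d, d)), rng.standard_normal((k, 2 * d)),
              rng.standard_normal(k), rng.standard_normal(k))
print(s)  # a scalar relational score
```

The tensor slices are what give the layer its relational expressiveness: each slice lets the two entity vectors interact multiplicatively rather than only additively.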

Decoder Relational Reasoning +1
