Search Results for author: Antonio Rodriguez-Sanchez

Found 9 papers, 5 papers with code

Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing

1 code implementation • 31 May 2021 • David Peer, Sebastian Stabinger, Stefan Engl, Antonio Rodriguez-Sanchez

Knowledge distillation maintains high performance and reaches high compression rates; nevertheless, the size of the student model is fixed after pre-training and cannot be changed individually for a given downstream task and use case to reach a desired performance/speedup ratio.

Knowledge Distillation • Unsupervised Pre-training
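
Greedy-layer pruning, as the title suggests, removes transformer layers one at a time, each time dropping the layer whose removal costs the least on the downstream metric. A minimal sketch of that greedy loop, assuming a Hugging Face BERT-style model exposing its blocks at `model.bert.encoder.layer` and a user-supplied `evaluate` function; both are assumptions of this sketch, not the paper's released code:

```python
import copy
import torch.nn as nn

def prune_one_layer(model, evaluate):
    """Greedily drop the encoder layer whose removal hurts the metric least.

    `evaluate(model) -> float` is a placeholder for whatever downstream
    evaluation is used, e.g. dev-set accuracy after a short fine-tune.
    """
    best_score, best_model = float("-inf"), None
    n_layers = len(model.bert.encoder.layer)
    for i in range(n_layers):
        candidate = copy.deepcopy(model)
        kept = [layer for j, layer in enumerate(candidate.bert.encoder.layer) if j != i]
        candidate.bert.encoder.layer = nn.ModuleList(kept)
        candidate.config.num_hidden_layers = len(kept)
        score = evaluate(candidate)
        if score > best_score:
            best_score, best_model = score, candidate
    return best_model, best_score

# Calling prune_one_layer repeatedly shrinks the encoder layer by layer
# until the desired performance/speedup trade-off is reached.
```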

Training Deep Capsule Networks with Residual Connections

1 code implementation • 15 Apr 2021 • Josef Gugglberger, David Peer, Antonio Rodriguez-Sanchez

Capsule networks are a type of neural network that has recently gained increased popularity.
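
The paper's contribution is making deeper capsule networks trainable by adding residual (skip) connections between capsule layers. As a generic illustration of that idea, not the authors' exact architecture, a skip connection around a capsule layer can be wrapped like this, assuming capsule tensors of shape (batch, n_capsules, capsule_dim) on both branches:

```python
import torch.nn as nn

class ResidualCapsuleBlock(nn.Module):
    """Skip connection around a capsule layer.

    `capsule_layer` is any module mapping a capsule tensor of shape
    (batch, n_capsules, capsule_dim) to a tensor of the same shape,
    e.g. a routing layer.  Shapes must match for the element-wise sum.
    """
    def __init__(self, capsule_layer):
        super().__init__()
        self.capsule_layer = capsule_layer

    def forward(self, x):
        # The residual sum lets gradients bypass the routing step,
        # which is what helps deeper capsule stacks train.
        return x + self.capsule_layer(x)
```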

Auto-tuning of Deep Neural Networks by Conflicting Layer Removal

1 code implementation • 7 Mar 2021 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez

We prove that, in the worst case, such a layer can lead to a network that cannot be trained at all.

Neural Architecture Search
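
The auto-tuning idea is to detect layers that harm training early on and remove them before spending the full training budget. A minimal sketch of the removal step only, assuming per-layer conflict scores have already been measured with a probe such as the one sketched under the "Conflicting Bundles" entry below; the threshold and score names are placeholders:

```python
import torch.nn as nn

def remove_flagged_layers(layers, conflict_scores, threshold=0.0):
    """Rebuild a sequential stack without the layers flagged as conflicting.

    `layers` is a list of nn.Module blocks and `conflict_scores` holds one
    score per layer, measured early in training.  Only the removal step is
    shown; retraining the reduced network afterwards is assumed.
    """
    kept = [layer for layer, score in zip(layers, conflict_scores)
            if score <= threshold]
    return nn.Sequential(*kept)
```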

Conflicting Bundles: Adapting Architectures Towards the Improved Training of Deep Neural Networks

1 code implementation • 5 Nov 2020 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez

In this paper, we introduce a novel theory and metric to identify layers that decrease the test accuracy of the trained model; this identification is done as early as the beginning of training.
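
Informally, a conflicting bundle arises when a layer maps samples with different labels onto numerically indistinguishable outputs, so their gradients can no longer separate the classes. A rough sketch of that check for one layer's batch output, assuming activations of shape (batch, features) and a user-chosen tolerance; this illustrates the intuition rather than reproducing the paper's exact metric:

```python
import torch

def conflict_fraction(activations, labels, atol=1e-6):
    """Fraction of samples whose layer output coincides (within `atol`)
    with that of at least one sample carrying a different label."""
    flat = activations.flatten(start_dim=1)
    conflicting = torch.zeros(flat.size(0), dtype=torch.bool)
    for i in range(flat.size(0)):
        same_output = torch.isclose(flat, flat[i], atol=atol).all(dim=1)
        other_label = labels != labels[i]
        conflicting[i] = bool((same_output & other_label).any())
    return conflicting.float().mean().item()
```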

Limitation of capsule networks

no code implementations • 21 May 2019 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez

A recently proposed method in deep learning groups multiple neurons into capsules such that each capsule represents an object or part of an object.
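
The excerpt summarizes the core capsule idea: neurons are grouped into vectors ("capsules") whose length encodes the presence of an object or part and whose orientation encodes its pose. A minimal illustration of the grouping plus the standard squashing non-linearity of Sabour et al. (2017), not specific to this paper:

```python
import torch

def to_capsules(features, capsule_dim=8):
    """Group a flat feature vector into capsule vectors of size `capsule_dim`.

    features: (batch, n_features) with n_features divisible by capsule_dim.
    Returns:  (batch, n_capsules, capsule_dim).
    """
    batch, n_features = features.shape
    return features.view(batch, n_features // capsule_dim, capsule_dim)

def squash(s, dim=-1, eps=1e-8):
    """Squashing non-linearity: the capsule's length lands in (0, 1) and acts
    as an existence probability, while its direction (the pose) is preserved."""
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)

capsules = squash(to_capsules(torch.randn(4, 64)))  # 4 samples -> 8 capsules of dim 8
```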

Increasing the adversarial robustness and explainability of capsule networks with $\gamma$-capsules

1 code implementation • 23 Dec 2018 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez

In this paper we introduce a new inductive bias for capsule networks and call networks that use this prior $\gamma$-capsule networks.

Adversarial Robustness • Inductive Bias
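
The excerpt names the inductive bias but not its form; the tags place the work in the adversarial-robustness setting. As a generic illustration of that setting only (a standard FGSM perturbation, not the $\gamma$-capsule mechanism itself), assuming image tensors normalized to [0, 1]:

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, images, labels, epsilon=0.03):
    """Fast Gradient Sign Method: a single gradient-sign perturbation that
    robustness evaluations, capsule-based or otherwise, are commonly run against."""
    images = images.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    adversarial = images + epsilon * images.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()  # assumes inputs in [0, 1]
```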

Guided Labeling using Convolutional Neural Networks

no code implementations • 6 Dec 2017 • Sebastian Stabinger, Antonio Rodriguez-Sanchez

Over the last couple of years, deep learning and especially convolutional neural networks have become one of the workhorses of computer vision.

Evaluation of Deep Learning on an Abstract Image Classification Dataset

no code implementations • 25 Aug 2017 • Sebastian Stabinger, Antonio Rodriguez-Sanchez

Convolutional Neural Networks have become state-of-the-art methods for image classification over the last couple of years.

Classification • General Classification • +1

Learning Abstract Classes using Deep Learning

no code implementations • 17 Jun 2016 • Sebastian Stabinger, Antonio Rodriguez-Sanchez, Justus Piater

Humans are generally good at learning abstract concepts about objects and scenes (e.g., spatial orientation, relative sizes, etc.).
