1 code implementation • 19 Jul 2023 • Boris Flach, Dmitrij Schlesinger, Alexander Shekhovtsov
The flexibility and simplicity of this approach allows its application to a wide range of learning scenarios and downstream tasks.
no code implementations • ICLR 2022 • Alexander Shekhovtsov, Dmitrij Schlesinger, Boris Flach
The importance of Variational Autoencoders reaches far beyond standalone generative models -- the approach is also used for learning latent representations and can be generalized to semi-supervised learning.
no code implementations • NeurIPS 2020 • Alexander Shekhovtsov, Viktor Yanush, Boris Flach
In neural networks with binary activations and/or binary weights, training by gradient descent is complicated because the model has a piecewise constant response.
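A common workaround for this piecewise-constant response (a standard technique, not necessarily the estimator proposed in this paper) is the straight-through estimator: binarize in the forward pass, but propagate the gradient as if the activation were the identity, often clipped to the region |x| ≤ 1. A minimal NumPy sketch, with function names of my own choosing:

```python
import numpy as np

def binary_forward(x):
    """Forward pass: piecewise-constant sign activation (+1 / -1)."""
    return np.where(x >= 0.0, 1.0, -1.0)

def ste_backward(x, grad_out):
    """Clipped straight-through estimator: pretend the activation was the
    identity inside |x| <= 1, and block the gradient outside that region."""
    pass_through = (np.abs(x) <= 1.0).astype(float)
    return grad_out * pass_through

x = np.array([-2.0, -0.5, 0.3, 1.7])
y = binary_forward(x)                  # binarized outputs
g = ste_backward(x, np.ones_like(x))   # surrogate gradient
```

The true gradient of `binary_forward` is zero almost everywhere, so the surrogate in `ste_backward` is what makes gradient descent possible at all.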
no code implementations • ICLR 2019 • Alexander Shekhovtsov, Boris Flach
Probabilistic Neural Networks deal with various sources of stochasticity: input noise, dropout, stochastic neurons, parameter uncertainties modeled as random variables, etc.
no code implementations • 1 Nov 2018 • Alexander Shekhovtsov, Boris Flach
In this work we investigate the reasons why Batch Normalization (BN) improves the generalization performance of deep networks.
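For reference, the BN operation under discussion normalizes each feature to zero mean and unit variance over the batch, then applies a learned scale and shift. A self-contained sketch of the standard formulation (the paper analyzes why this helps generalization, not the operation itself):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization of one feature across a batch:
    normalize to zero mean / unit variance, then scale by gamma
    and shift by beta."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]
```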
no code implementations • 28 Mar 2018 • Alexander Shekhovtsov, Boris Flach
We address the problem of estimating statistics of hidden units in a neural network using a method of analytic moment propagation.
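To illustrate the idea of analytic moment propagation (the helper names and the Gaussian/independence assumptions here are mine, not necessarily the paper's exact model): the mean and variance of a linear unit follow in closed form from the input moments, and the moments of a ReLU applied to a Gaussian input are also available analytically.

```python
import math

def linear_moments(w, b, mu, var):
    """Mean/variance of y = w . x + b, assuming independent inputs
    with per-coordinate means mu and variances var."""
    m = sum(wi * mi for wi, mi in zip(w, mu)) + b
    v = sum(wi * wi * vi for wi, vi in zip(w, var))
    return m, v

def relu_moments(mu, var):
    """Mean and variance of max(X, 0) for X ~ N(mu, var), in closed form."""
    s = math.sqrt(var)
    a = mu / s
    Phi = 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))       # P(X > 0)
    phi = math.exp(-0.5 * a * a) / math.sqrt(2.0 * math.pi)
    m = mu * Phi + s * phi                                  # E[max(X,0)]
    m2 = (mu * mu + var) * Phi + mu * s * phi               # E[max(X,0)^2]
    return m, m2 - m * m
```

Chaining `linear_moments` and `relu_moments` layer by layer propagates statistics through the network in a single feed-forward pass, instead of estimating them by sampling.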
no code implementations • 28 Mar 2018 • Alexander Shekhovtsov, Boris Flach, Michal Busta
We propose a feed-forward inference method applicable to belief and neural networks.
no code implementations • 25 Sep 2017 • Boris Flach, Alexander Shekhovtsov, Ondrej Fikar
Generative learning, which takes the full distribution of the data into account, is not feasible with standard deep neural networks (DNNs) because they model only the conditional distribution of the outputs given the inputs.
no code implementations • 23 Jul 2014 • Michail Schlesinger, Boris Flach, Evgeniy Vodolazskiy
The article studies the problem of finding the d most admissible solutions for a given d. A tractable subclass of these problems is defined by the concepts of invariants and polymorphisms, similar to the classic constraint satisfaction approach.
no code implementations • 10 Dec 2012 • Boris Flach
The aim of this short note is to draw attention to a method by which the partition function and marginal probabilities for a certain class of random fields on complete graphs can be computed in polynomial time.