Variational mean-field theory for training restricted Boltzmann machines with binary synapses

11 Nov 2019 · Haiping Huang

Unsupervised learning requiring only raw data is not only a fundamental function of the cerebral cortex, but also a foundation for the next generation of artificial neural networks. However, a unified theoretical framework treating sensory inputs, synapses and neural activity together is still lacking. The computational obstacle originates from the discrete nature of synapses and the complex interactions among these three essential elements of learning. Here, we propose a variational mean-field theory in which the distribution of synaptic weights is considered. Unsupervised learning can then be decomposed into two intertwined steps: a maximization step, carried out as gradient ascent on a lower bound of the data log-likelihood, in which the synaptic weight distribution is determined by updating the variational parameters, and an expectation step, carried out as a message-passing procedure on an equivalent or dual neural network whose parameters are specified by the variational parameters of the weight distribution. Our framework therefore provides insight into how data (or sensory inputs), synapses and neural activities interact with one another to achieve the goal of extracting statistical regularities from sensory inputs. This variational framework is verified on restricted Boltzmann machines with planted synaptic weights and on handwritten-digit learning.
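
To make the two-step decomposition concrete, here is a minimal NumPy sketch, not the paper's exact algorithm: each binary weight w_ij ∈ {−1, +1} is given a variational distribution q(w_ij) ∝ exp(θ_ij w_ij), so its mean is tanh(θ_ij). The paper's message-passing expectation step is replaced here by naive mean-field inference on the dual network of expected weights, and the model-side term of the maximization-step gradient is approximated by a single CD-1 reconstruction. All sizes, names and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n_v visible units, n_h hidden units.
n_v, n_h = 20, 10

# Variational parameters theta_ij: each binary weight w_ij in {-1, +1}
# follows q(w_ij) ∝ exp(theta_ij * w_ij), so E[w_ij] = tanh(theta_ij).
theta = 0.01 * rng.standard_normal((n_v, n_h))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(v_data, theta, lr=0.05):
    """One intertwined expectation/maximization step (sketch only).

    E-step: infer hidden activations on the dual network whose
    effective weights are the variational means m = tanh(theta);
    naive mean field stands in for the paper's message passing.
    M-step: gradient ascent on a CD-1 estimate of the lower bound,
    pushed through m = tanh(theta) by the chain rule.
    """
    m = np.tanh(theta)                    # expected binary weights
    # E-step: mean-field hidden activations given the data
    h_data = sigmoid(v_data @ m)
    # One reconstruction sweep (CD-1 stand-in for the model expectation)
    v_model = sigmoid(h_data @ m.T)
    h_model = sigmoid(v_model @ m)
    # Gradient of the bound w.r.t. m, then w.r.t. theta
    grad_m = (v_data.T @ h_data - v_model.T @ h_model) / len(v_data)
    theta += lr * grad_m * (1.0 - m**2)   # d tanh(theta)/d theta = 1 - tanh^2
    return theta

# Toy data: random binary patterns in {0, 1}
v_data = (rng.random((64, n_v)) < 0.5).astype(float)
for _ in range(100):
    theta = train_step(v_data, theta)
print("mean |E[w]| after training:", np.abs(np.tanh(theta)).mean())
```

Note the design point this sketch illustrates: the discrete weights never appear explicitly. Learning moves entirely in the continuous space of the variational parameters θ, and the chain-rule factor (1 − tanh²θ) is what lets a gradient-based M-step coexist with synapses that are binary in distribution.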
