Bayesian Tensor Network with Polynomial Complexity for Probabilistic Machine Learning

30 Dec 2019 · Shi-Ju Ran

It is known that describing or calculating the conditional probabilities of multiple events is exponentially expensive. In this work, a Bayesian tensor network (BTN) is proposed to efficiently capture the conditional probabilities of multiple sets of events with polynomial complexity. The BTN is a directed acyclic graphical model that forms a subset of tensor networks. To verify its validity for exponentially many events, the BTN is applied to image recognition, where classification is mapped to capturing conditional probabilities in an exponentially large sample space. Competitive performance is achieved by the BTN with simple tree network structures. Analogous to tensor network simulations of quantum systems, the validity of the simple-tree BTN implies an "area law" of fluctuations in image recognition problems.
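As a rough illustration of how a tree-structured tensor network can evaluate class probabilities for an image in polynomial (rather than exponential) time, the sketch below contracts a binary tree bottom-up over pixel feature vectors. The feature map, the squared-amplitude normalization, and all tensor shapes are illustrative assumptions for a generic tree tensor network classifier, not the paper's exact BTN parameterization.

```python
import numpy as np

def feature_map(x):
    """Map a pixel value x in [0, 1] to a 2-dimensional feature vector (assumed form)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def tree_class_probabilities(pixels, layers, top_tensor):
    """Contract a binary tree tensor network bottom-up and return normalized
    class probabilities. Cost grows linearly in the number of pixels and
    polynomially in the bond dimension.

    pixels     : 1D array of length 2**depth with values in [0, 1]
    layers     : list of layers; layers[l][i] is a rank-3 tensor of shape
                 (d_in, d_in, d_out) merging two child vectors into one parent vector
    top_tensor : array of shape (d, d, n_classes) carrying the class index
    """
    vectors = [feature_map(p) for p in pixels]
    for layer in layers:
        # Merge neighboring pairs of vectors through the layer's rank-3 tensors.
        vectors = [np.einsum('a,b,abc->c', vectors[2 * i], vectors[2 * i + 1], T)
                   for i, T in enumerate(layer)]
    left, right = vectors
    amplitudes = np.einsum('a,b,abc->c', left, right, top_tensor)
    probs = amplitudes ** 2          # illustrative normalization, not the BTN's rule
    return probs / probs.sum()

# Tiny example: 8 "pixels", a depth-3 binary tree, bond dimension 2, 3 classes.
rng = np.random.default_rng(0)
pixels = rng.random(8)
layers = [[rng.standard_normal((2, 2, 2)) for _ in range(4)],
          [rng.standard_normal((2, 2, 2)) for _ in range(2)]]
top = rng.standard_normal((2, 2, 3))
print(tree_class_probabilities(pixels, layers, top))
```

In a trained model the tensors would be learned so that the output approximates the conditional probability of each class given the pixels; here they are random and the output only demonstrates the polynomial-cost contraction pattern.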
