no code implementations • 25 Mar 2023 • Zhouzheng Li, Hao Liu
Beta-VAE is a classical model for disentangled representation learning; the use of an expanding bottleneck that gradually admits information into the decoder is key to both representation disentanglement and high-quality reconstruction.
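One common way to realize such an expanding bottleneck, which this abstract may or may not follow exactly, is capacity annealing in the style of Burgess et al.: the KL term is steered toward a capacity C that grows during training, so the latent code carries more information as optimization proceeds. The sketch below is a minimal illustration under that assumption; the schedule parameters (beta, c_max, anneal_steps) are illustrative, not taken from the paper.

```python
import torch

def expanding_bottleneck_loss(recon_loss: torch.Tensor,
                              kl: torch.Tensor,
                              step: int,
                              beta: float = 10.0,
                              c_max: float = 25.0,
                              anneal_steps: int = 100_000) -> torch.Tensor:
    # Capacity C grows linearly from 0 to c_max, then stays fixed,
    # letting information flow into the decoder gradually.
    c = c_max * min(step / anneal_steps, 1.0)
    # Penalize deviation of the KL term from the current capacity,
    # rather than pushing the KL toward zero as a plain beta-VAE does.
    return recon_loss + beta * torch.abs(kl - c)
```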
no code implementations • 25 Mar 2021 • Zhouzheng Li, Kun Feng
While the beta-VAE family aims to find disentangled representations and acquire human-interpretable generative factors, much as ICA does in the linear domain, we propose Full Encoder, a novel unified autoencoder framework that corresponds to PCA in the non-linear domain.
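The PCA correspondence invoked here rests on a well-known fact: a linear autoencoder trained with squared error learns the same subspace PCA recovers, and swapping in non-linear networks extends that dimensionality-reduction view to the non-linear domain. The sketch below only illustrates that background analogy, not the Full Encoder architecture itself; the layer sizes and the 784-dimensional input are illustrative assumptions.

```python
import torch
import torch.nn as nn

latent_dim = 2

# Linear autoencoder: its bottleneck spans the top PCA subspace.
linear_ae = nn.Sequential(
    nn.Linear(784, latent_dim),
    nn.Linear(latent_dim, 784),
)

# Non-linear autoencoder: the "non-linear PCA" analogue.
nonlinear_ae = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, latent_dim),      # bottleneck code
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, 784),
)

x = torch.randn(32, 784)             # stand-in data batch
loss = nn.functional.mse_loss(nonlinear_ae(x), x)
```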