Triple-GAN with Variable Fractional Order Gradient Descent Method and Mish Activation Function

23 Sep 2020 · Cancan Yang, Zan Yang, Shan Liao, Zheming Hong, Wei Nai

Generative Adversarial Networks (GANs) play an important role in image generation and classification, but they have a well-known shortcoming: the generator and discriminator cannot converge to the real data distribution at the same time. Triple-GAN was proposed to address this. Compared with a standard GAN, it consists of three parts, a classifier, a generator, and a discriminator, which guarantees that the data distributions of both the generator and the classifier converge to the real distribution. Yet traditional limitations of the activation function and the gradient method, such as vanishing and exploding gradients, remain. This paper therefore adopts the new Mish activation function together with variable fractional order gradient descent and ascent methods in place of the original activation function and gradient algorithms. A degree of sparsity keeps the gradient from exploding or vanishing, while the smooth curve for negative inputs near the origin compensates for the inactivation of neurons caused by large gradients, achieving better activation efficiency and greatly improving training efficiency.
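
For concreteness, here is a minimal NumPy sketch of the two ingredients named above: the Mish activation, Mish(x) = x · tanh(softplus(x)), and a single variable fractional order gradient descent step based on the first-order Caputo approximation common in the fractional-order optimization literature. The function names, order schedule, and all hyperparameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from math import gamma

def mish(x):
    """Mish activation: x * tanh(softplus(x)); smooth and non-monotonic."""
    # logaddexp(0, x) == log(1 + exp(x)) == softplus(x), computed stably.
    return x * np.tanh(np.logaddexp(0.0, x))

def variable_fractional_step(w, grad, w_prev, alpha, lr=0.01, eps=1e-8):
    """One variable fractional order gradient descent step (illustrative).

    Uses a common first-order Caputo approximation, expanded around the
    previous iterate w_prev:
        D^alpha L(w) ~ grad * |w - w_prev|**(1 - alpha) / Gamma(2 - alpha)
    with 0 < alpha <= 1. Varying alpha across iterations is what makes the
    order "variable"; alpha = 1 recovers ordinary gradient descent.
    """
    frac_grad = grad * (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * frac_grad

# Toy usage: minimize f(w) = w^2 while the order drifts toward 1.
w_prev, w = 0.0, 2.0
for k in range(50):
    alpha = 1.0 - 0.3 * 0.9 ** k        # hypothetical order schedule
    w, w_prev = variable_fractional_step(w, 2.0 * w, w_prev, alpha, lr=0.1), w
print(w)  # close to the minimizer 0
```

Expanding the fractional derivative around the previous iterate, rather than a fixed base point, is a standard choice in fractional-order gradient methods: it lets the iterates converge to the true extremum instead of a point offset from it.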
