
Toward gradient bandit-based selection of candidate architectures in AutoGAN

Journal contribution posted on 2024-11-17, authored by Yi Fan, Guoqiang Zhou, Jun Shen, Guilan Dai
The neural architecture search (NAS) method provides a new approach to designing the architecture of generative adversarial networks (GANs). The existing state-of-the-art algorithm, AutoGAN, is an example of discovering a GAN architecture using reinforcement-learning (RL)-based NAS. However, it does not take performance differences between the candidate architectures into consideration. In this paper, a new algorithm based on AutoGAN, ImprovedAutoGAN, is proposed. We found that the choice of candidate architectures can affect the performance of the final network. Therefore, during the architecture search, whereas AutoGAN selects candidate architectures randomly, our method uses a gradient bandit algorithm to increase the probability of selecting networks with better performance. We also introduce a temperature coefficient into the algorithm to prevent the search from getting trapped in a local optimum. The GANs are searched using the same search space as AutoGAN, and the discovered GAN achieves a Fréchet inception distance (FID) score of 11.60 on CIFAR-10, the best result among current RL-based NAS methods. Experiments also show that the transferability of this GAN is satisfactory.
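To illustrate the kind of selection rule the abstract describes, the following is a minimal sketch of a gradient bandit selector with a temperature-scaled softmax. It assumes candidate architectures are indexed and that a scalar reward (for example, a validation score of the evaluated candidate) is available after each search step; the class name, parameters (alpha, tau), and the placeholder reward are illustrative assumptions, not details taken from the paper.

import numpy as np

class GradientBanditSelector:
    """Softmax (gradient bandit) selection over candidate architectures.

    Hypothetical sketch: n_candidates, alpha (step size), and tau
    (temperature coefficient) are illustrative, not from the paper.
    """

    def __init__(self, n_candidates, alpha=0.1, tau=1.0):
        self.h = np.zeros(n_candidates)   # preference per candidate
        self.alpha = alpha                # preference step size
        self.tau = tau                    # temperature: higher -> more exploration
        self.baseline = 0.0               # running average of rewards
        self.t = 0                        # number of updates so far

    def probabilities(self):
        # Temperature-scaled softmax over preferences.
        z = self.h / self.tau
        z -= z.max()                      # numerical stability
        p = np.exp(z)
        return p / p.sum()

    def select(self, rng=np.random):
        # Sample a candidate index according to the softmax probabilities.
        return rng.choice(len(self.h), p=self.probabilities())

    def update(self, chosen, reward):
        # Gradient bandit update: raise the chosen candidate's preference
        # when the reward beats the running baseline, lower the others'.
        self.t += 1
        self.baseline += (reward - self.baseline) / self.t
        p = self.probabilities()
        advantage = reward - self.baseline
        one_hot = np.zeros_like(self.h)
        one_hot[chosen] = 1.0
        self.h += self.alpha * advantage * (one_hot - p)


# Usage sketch: the reward for each step would come from evaluating the
# selected candidate (e.g., a quality metric of the generated samples).
selector = GradientBanditSelector(n_candidates=5, alpha=0.1, tau=2.0)
idx = selector.select()
selector.update(idx, reward=0.7)   # placeholder reward value

A larger temperature tau flattens the softmax and keeps exploration broad early in the search, which is consistent with the abstract's stated purpose of avoiding local optima.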

Funding

National Natural Science Foundation of China (2019-2022)

History

Journal title

Soft Computing

Volume

25

Issue

6

Pagination

4367-4378

Language

English
