AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks

Abstract: The compression of Generative Adversarial Networks (GANs) has lately drawn attention, due to the increasing demand for deploying GANs onto mobile devices for numerous applications such as image translation, enhancement, and editing. However, compared with the substantial efforts in compressing other deep models, research on compressing GANs (usually the generators) remains in its infancy. Existing GAN compression algorithms are limited to handling specific GAN architectures and losses. Inspired by the recent success of AutoML in deep compression, we introduce AutoML to GAN compression and develop an AutoGAN-Distiller (AGD) framework. Starting from a specifically designed efficient search space, AGD performs an end-to-end discovery of new efficient generators, given the target computational resource constraints. The search is guided by the original GAN model via knowledge distillation, thereby fulfilling the compression. AGD is fully automatic, standalone (i.e., it needs no trained discriminators), and generically applicable to various GAN models. We evaluate AGD on two representative GAN tasks: image translation and super resolution. Without bells and whistles, AGD yields remarkably lightweight yet competitive compressed models that largely outperform existing alternatives. Our code and pretrained models are available at this https URL.
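The abstract describes a search guided purely by knowledge distillation from the original (teacher) generator, under a computational budget and without any trained discriminator. Below is a minimal PyTorch sketch of that kind of objective; the networks, loss choice, and parameter-count penalty are placeholders for illustration, not the actual AGD search space or constraint formulation.

    # Minimal sketch (not the authors' implementation) of a
    # discriminator-free, distillation-guided compression objective:
    # a candidate student generator is trained to mimic a frozen,
    # pretrained teacher generator under a resource penalty.
    import torch
    import torch.nn as nn

    # Hypothetical stand-ins for the pretrained generator and a
    # searched candidate; in AGD these would come from the search space.
    teacher = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(64, 3, 3, padding=1))
    student = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(16, 3, 3, padding=1))

    teacher.eval()                      # the teacher stays frozen
    for p in teacher.parameters():
        p.requires_grad_(False)

    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    distill_loss = nn.L1Loss()          # pixel-wise distillation term

    budget = 5_000                      # illustrative parameter budget
    n_params = sum(p.numel() for p in student.parameters())

    for step in range(100):
        x = torch.rand(4, 3, 64, 64)    # dummy input batch
        with torch.no_grad():
            target = teacher(x)         # teacher output as soft target
        out = student(x)
        # Distillation loss plus a soft penalty when the student
        # exceeds the budget (AGD targets computational constraints
        # such as FLOPs; a raw parameter count keeps the sketch short).
        loss = distill_loss(out, target) + 1e-6 * max(0, n_params - budget)
        opt.zero_grad()
        loss.backward()
        opt.step()

In the full framework, the student architecture is itself a search variable rather than a fixed network, and the resource term reflects the target deployment budget.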
