
An Introduction to NVAE

Author: 用户1908973 · Published 2020-07-14 · From the CreateAMind column

https://arxiv.org/pdf/2007.03898.pdf

1. Introduction

In this paper, we aim to make VAEs great again by architecture design. We propose Nouveau VAE (NVAE), a deep hierarchical VAE with a carefully designed network architecture that produces high-quality images. NVAE obtains the state-of-the-art results among non-autoregressive likelihood-based generative models, reducing the gap with autoregressive models. The main building block of our network is depthwise convolutions that rapidly increase the receptive field of the network without dramatically increasing the number of parameters.
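The parameter-count claim above is easy to verify with back-of-the-envelope arithmetic: a regular k×k convolution over C channels mixes all channel pairs, while a depthwise convolution applies one k×k filter per channel. The channel count and kernel size below are hypothetical, chosen only to illustrate the scaling.

```python
# Illustrative parameter counts (bias terms omitted).
# Regular conv: every output channel sees every input channel -> C * C * k * k.
# Depthwise conv: one k x k filter per channel -> C * k * k.

def regular_conv_params(channels: int, k: int) -> int:
    return channels * channels * k * k

def depthwise_conv_params(channels: int, k: int) -> int:
    return channels * k * k

C, k = 256, 5
print(regular_conv_params(C, k))    # 1638400
print(depthwise_conv_params(C, k))  # 6400
```

A factor of C fewer weights is what lets the network afford large kernels, and hence a rapidly growing receptive field.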

In summary, we make the following contributions:

i) We propose a novel deep hierarchical VAE, called NVAE, with depthwise convolutions in its generative model.

ii) We propose a new residual parameterization of the approximate posteriors.

iii) We stabilize training deep VAEs with spectral regularization.

iv) We provide practical solutions to reduce the memory burden of VAEs.

v) We show that deep hierarchical VAEs can obtain state-of-the-art results on several image datasets, and can produce high-quality samples even when trained with the original VAE objective. To the best of our knowledge, NVAE is the first successful application of VAEs to images as large as 256×256 pixels.
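Contribution (ii), the residual parameterization, can be sketched in a few lines: instead of predicting the approximate posterior's mean and standard deviation from scratch, the encoder predicts a shift (dmu, dsigma) relative to the prior's parameters. The helper names below are hypothetical; this is a minimal single-group Gaussian sketch, not the paper's implementation.

```python
import numpy as np

# Residual posterior sketch: the encoder outputs (dmu, dsigma) *relative to*
# the prior N(mu_p, sigma_p^2), giving q = N(mu_p + dmu, (sigma_p * dsigma)^2).
# When the encoder outputs dmu = 0, dsigma = 1, the posterior collapses onto
# the prior and the KL term is exactly zero.

def residual_posterior(mu_p, sigma_p, dmu, dsigma):
    """Combine prior parameters with the encoder's residual outputs."""
    return mu_p + dmu, sigma_p * dsigma

def kl_diag_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """KL(N(mu_q, diag sigma_q^2) || N(mu_p, diag sigma_p^2)), summed over dims."""
    return float(np.sum(
        np.log(sigma_p / sigma_q)
        + (sigma_q**2 + (mu_q - mu_p)**2) / (2 * sigma_p**2)
        - 0.5
    ))

mu_p, sigma_p = np.zeros(4), np.ones(4)
mu_q, sigma_q = residual_posterior(mu_p, sigma_p, np.full(4, 0.1), np.full(4, 1.0))
print(kl_diag_gaussians(mu_q, sigma_q, mu_p, sigma_p))  # 0.02
```

Because the KL depends only on the relative terms, small encoder outputs keep the KL small and well-conditioned, which is the point of the parameterization.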

2. Method

In this paper, we propose a deep hierarchical VAE called NVAE that generates large high-quality images. NVAE’s design focuses on tackling two main challenges: (i) designing expressive neural networks specifically for VAEs, and (ii) scaling up the training to a large number of hierarchical groups and image sizes while maintaining training stability.

Training objective: the variational lower bound
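The post only names the objective, so for reference: the standard variational lower bound for a hierarchical VAE with latent groups z = {z_1, …, z_L} (the form the paper optimizes) can be written as

```latex
\mathcal{L}_{\mathrm{VAE}}(x) =
  \mathbb{E}_{q(z \mid x)}\bigl[\log p(x \mid z)\bigr]
  - \mathrm{KL}\bigl(q(z_1 \mid x)\,\|\,p(z_1)\bigr)
  - \sum_{l=2}^{L} \mathbb{E}_{q(z_{<l} \mid x)}\,
      \mathrm{KL}\bigl(q(z_l \mid x, z_{<l})\,\|\,p(z_l \mid z_{<l})\bigr)
```

i.e. a reconstruction term plus one KL term per latent group, each group conditioned on the groups above it.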

Residual Cells for Encoder and Decoder:
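The depthwise convolution at the heart of these cells is simple enough to write out directly: each input channel is convolved with its own k×k filter, and channels never mix (a following 1×1 convolution would mix them). This is a minimal numpy sketch with "valid" padding, not the paper's actual cell.

```python
import numpy as np

def depthwise_conv2d(x, kernels):
    """Depthwise convolution: x is (C, H, W), kernels is (C, k, k).

    Each channel c of x is convolved with kernels[c] independently,
    returning an array of shape (C, H - k + 1, W - k + 1).
    """
    C, H, W = x.shape
    _, k, _ = kernels.shape
    out = np.zeros((C, H - k + 1, W - k + 1))
    for c in range(C):
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                out[c, i, j] = np.sum(x[c, i:i + k, j:j + k] * kernels[c])
    return out

x = np.random.randn(8, 16, 16)
kernels = np.random.randn(8, 3, 3)
y = depthwise_conv2d(x, kernels)
print(y.shape)  # (8, 14, 14)
```

In a real cell this per-channel convolution is sandwiched between 1×1 convolutions that expand and then re-mix channels, which is how the cell keeps a large receptive field cheap.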

Taming the Unbounded KL Term:
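The stabilization tool named in contribution (iii) is spectral regularization: penalizing the largest singular value of each layer's weight matrix to bound the layer's Lipschitz constant. The standard cheap estimator for that singular value is power iteration; the sketch below runs it to convergence for clarity, whereas in training one step per update is typical. Names here are illustrative, not the paper's code.

```python
import numpy as np

def spectral_norm_estimate(W, n_iters=50, seed=0):
    """Estimate the largest singular value of W by power iteration."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(W.shape[0])
    v = W.T @ u
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    # After convergence, u and v are the top singular vectors, so u^T W v
    # is the top singular value.
    return float(u @ W @ v)

W = np.array([[3.0, 0.0],
              [0.0, 1.0]])
print(spectral_norm_estimate(W))  # ~3.0
```

Adding the sum of these estimates (times a small coefficient) to the loss discourages any layer from amplifying perturbations, which in turn keeps the KL term from blowing up.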

3. Experiments

Hyper-parameters:

Performance:

Comparison:

More samples:

4. Conclusion

Engineering is magic... : )

Originally published 2020-07-13; shared from the CreateAMind WeChat official account via the Tencent Cloud self-media sharing program.
