
Spam Filters

Sam Holden 23 Aug 2003


Nonparametric VAE for Hierarchical Representation Learning

The recently developed variational autoencoders (VAEs) have proved to be an effective confluence of the rich representational power of neural networks with Bayesian methods. However, most work on VAEs uses a rather simple prior over the latent variables, such as the standard normal distribution, thereby restricting their application to relatively simple phenomena. In this work, we propose hierarchical nonparametric variational autoencoders, which combine tree-structured Bayesian nonparametric priors with VAEs to enable infinite flexibility of the latent representation space. Both the neural parameters and the Bayesian priors are learned jointly using tailored variational inference. The resulting model induces a hierarchical structure of latent semantic concepts underlying the data corpus and infers accurate representations of data instances. We apply our model to video representation learning. Our method is able to discover highly interpretable activity hierarchies and obtains improved clustering accuracy and generalization capacity based on the learned rich representations.
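To make the baseline the abstract contrasts against concrete, here is a minimal sketch (not the paper's method) of a plain VAE whose latent prior is the standard normal N(0, I). All layer sizes and names are illustrative assumptions; the paper replaces this simple prior with a tree-structured Bayesian nonparametric one.

```python
# Minimal sketch of a standard VAE with an N(0, I) prior over the latents.
# This is the "simple prior" baseline, NOT the hierarchical nonparametric
# model proposed in the abstract. Dimensions below are arbitrary assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)        # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)    # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def elbo_loss(x, x_logits, mu, logvar):
    # Negative ELBO: reconstruction term plus KL(q(z|x) || N(0, I)),
    # using the closed-form KL between a diagonal Gaussian and N(0, I).
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```

Because the prior here is a single fixed Gaussian, the latent space has no hierarchical structure; swapping it for a learned tree-structured nonparametric prior, as the abstract proposes, is what allows the model to organize latents into an activity hierarchy.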
