tensorpack

By CreateAMind · Published 2018-07-24

Neural Network Toolbox on TensorFlow.

See some examples to learn about the framework:

Vision:

  • DoReFa-Net: train binary / low-bitwidth CNN on ImageNet
  • Train ResNet on ImageNet / Cifar10 / SVHN
  • Generative Adversarial Network (GAN) variants, including DCGAN, InfoGAN, Conditional GAN, WGAN, BEGAN, DiscoGAN, Image-to-Image, CycleGAN.
  • Fully-convolutional Network for Holistically-Nested Edge Detection (HED)
  • Spatial Transformer Networks on MNIST addition
  • Visualize CNN saliency maps
  • Similarity learning on MNIST

Reinforcement Learning:

  • Deep Q-Network (DQN) variants on Atari games, including DQN, DoubleDQN, DuelingDQN.
  • Asynchronous Advantage Actor-Critic (A3C) with demos on OpenAI Gym

Speech / NLP:

  • LSTM-CTC for speech recognition
  • char-rnn for fun
  • LSTM language model on PennTreebank

The examples are not only demonstrations of the framework: you can train them and reproduce the results reported in the papers.

Features:

It's Yet Another TF wrapper, but different in:

  1. Not focused on models.
    • There are already too many symbolic function wrappers. Tensorpack includes only a few common models, and helpful tools such as LinearWrap to simplify large models. But you can use any other wrappers within tensorpack, such as sonnet/Keras/slim/tflearn/tensorlayer/....
  2. Focus on training speed.
    • The tensorpack trainer is almost always faster than feed_dict-based wrappers. Even on a tiny CNN example, training runs 2x faster than the equivalent Keras code.
    • Data-parallel multi-GPU training is available off the shelf. It is as fast as Google's benchmark code.
    • Data-parallel distributed training is available off the shelf. It is as fast as Google's benchmark code.
  3. Focus on large datasets.
    • It's painful to read/preprocess data from TF. Use DataFlow to load large datasets (e.g. ImageNet) in pure Python with multi-process prefetch.
    • DataFlow has a unified interface, so you can compose and reuse them to perform complex preprocessing.
  4. An extensible Callback interface. Write a callback to implement anything you want to do apart from the training iterations, and enable it with one line of code. Common examples include:
    • Change hyperparameters during training
    • Print some tensors of interest
    • Run inference on a test dataset
    • Run some operations once in a while
    • Send loss to your phone
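Feature 1 mentions LinearWrap as a tool that flattens nested layer calls into a linear chain. Here is a rough sketch of that idea in plain Python; it is a simplified stand-in, not tensorpack's actual implementation, and the `layers` lookup table is hypothetical:

```python
# A toy illustration of linear chaining (NOT tensorpack's actual implementation).
class LinearWrap:
    def __init__(self, value, layers):
        self._value = value      # the value threaded through the chain
        self._layers = layers    # name -> function lookup table

    def __getattr__(self, name):
        # Resolve an unknown attribute as a "layer": return a function that
        # applies it and re-wraps the result so calls keep chaining.
        layer = self._layers[name]
        def apply(*args, **kwargs):
            return LinearWrap(layer(self._value, *args, **kwargs), self._layers)
        return apply

    def __call__(self):
        # A trailing () unwraps the final value.
        return self._value

# Hypothetical "layers" on plain numbers, standing in for symbolic layer functions:
layers = {"Add": lambda x, k: x + k, "Mul": lambda x, k: x * k}
out = LinearWrap(2, layers).Add(3).Mul(4)()   # (2 + 3) * 4 = 20
```

In tensorpack itself, the chained names resolve to registered symbolic layers (e.g. Conv2D, MaxPooling) and the trailing `()` returns the final tensor.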
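The "unified interface" claim in feature 3 can be sketched in plain Python. The classes below are toy stand-ins, not tensorpack's real DataFlows (which add multi-process prefetching, augmentation, and more); they only illustrate why a common iterator interface lets stages compose and be reused:

```python
# A toy illustration of DataFlow composition (NOT tensorpack's real classes).
class NumberSource:
    """Hypothetical source dataflow: yields datapoints [0], [1], ..., [n-1]."""
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        for i in range(self.n):
            yield [i]

class MapData:
    """Apply a function to every datapoint of an upstream dataflow."""
    def __init__(self, df, func):
        self.df, self.func = df, func
    def __iter__(self):
        for dp in self.df:
            yield self.func(dp)

class BatchData:
    """Group datapoints into fixed-size batches (incomplete last batch dropped)."""
    def __init__(self, df, batch_size):
        self.df, self.batch_size = df, batch_size
    def __iter__(self):
        batch = []
        for dp in self.df:
            batch.append(dp)
            if len(batch) == self.batch_size:
                yield batch
                batch = []

# Because every stage exposes the same iterator interface, stages compose freely:
df = BatchData(MapData(NumberSource(6), lambda dp: [dp[0] * 2]), 3)
batches = list(df)   # [[[0], [2], [4]], [[6], [8], [10]]]
```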
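The callback mechanism in feature 4 can likewise be sketched. These are hypothetical simplified classes, not tensorpack's actual Trainer/Callback API, although tensorpack does ship a similarly named ScheduledHyperParamSetter:

```python
# A toy illustration of the Callback idea (NOT tensorpack's real API).
class Callback:
    """Base hook interface: subclasses override only what they need."""
    def before_train(self):
        pass
    def trigger_epoch(self, epoch):
        pass

class ScheduledHyperParamSetter(Callback):
    """Simplified stand-in: set a hyperparameter at scheduled epochs."""
    def __init__(self, params, name, schedule):
        self.params, self.name, self.schedule = params, name, dict(schedule)
    def trigger_epoch(self, epoch):
        if epoch in self.schedule:
            self.params[self.name] = self.schedule[epoch]

class Trainer:
    """Minimal loop that invokes callbacks around the (elided) iterations."""
    def __init__(self, callbacks):
        self.callbacks = callbacks
    def train(self, epochs):
        for cb in self.callbacks:
            cb.before_train()
        for epoch in range(1, epochs + 1):
            # ... training iterations would run here ...
            for cb in self.callbacks:
                cb.trigger_epoch(epoch)

params = {"learning_rate": 0.1}
Trainer([ScheduledHyperParamSetter(params, "learning_rate", {3: 0.01})]).train(5)
# params["learning_rate"] is now 0.01
```

Enabling a behavior is one line: append another Callback instance to the trainer's list, without touching the training loop itself.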

Install:

Dependencies:

  • Python 2 or 3
  • TensorFlow >= 1.0.0 (>=1.1.0 for Multi-GPU)
  • Python bindings for OpenCV (optional, but required by many features)
pip install -U git+https://github.com/ppwwyyxx/tensorpack.git
# or add `--user` to avoid system-wide installation.
Originally published 2017-07-17 by the CreateAMind WeChat official account, shared via the Tencent Cloud self-media sync program.
