
TensorFlow in Practice at a Glance: the Logistic Regression Model

Author: AI异构 (WeChat public account) | Originally published 2018-04-04; synced to the Tencent Cloud column "AI异构" on 2020-07-29

Introduction to Logistic Regression

The logistic model

[Figure: diagram of the logistic model]
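The original diagram is lost in this copy. As a standard recap (my summary, not the original figure): the logistic model passes a linear combination of the inputs through the sigmoid function,

$$\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad \hat{y} = \sigma(w^{\top}x + b),$$

so the output $\hat{y} \in (0, 1)$ can be read as the probability of the positive class.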

Loss function (cross-entropy loss)

[Figure: cross-entropy]
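The cross-entropy figure is likewise unavailable; as a textbook recap, for a binary label $y \in \{0, 1\}$ and prediction $\hat{y}$ the loss is

$$L(y, \hat{y}) = -\big[\, y \log \hat{y} + (1 - y)\log(1 - \hat{y}) \,\big],$$

averaged over the training set. It punishes confident wrong predictions hard, since $-\log \hat{y} \to \infty$ as $\hat{y} \to 0$.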

Softmax for multi-class classification

[Figure: softmax]
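For more than two classes, the sigmoid is replaced by the softmax function (again a standard definition, stated here because the figure is missing):

$$\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j} e^{z_j}},$$

which turns a vector of raw class scores into a probability distribution. The matching loss is the multi-class cross-entropy $-\sum_i y_i \log \hat{y}_i$, which is exactly what the TensorFlow code below computes.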

Logistic Regression in TensorFlow

Note: the code below uses the TensorFlow 1.x API (tf.placeholder, tf.Session) and will not run unmodified on TensorFlow 2.x.

Import the MNIST dataset
import tensorflow as tf

# Import MNIST data
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("./data/", one_hot=True)

Extracting ./data/train-images-idx3-ubyte.gz
Extracting ./data/train-labels-idx1-ubyte.gz
Extracting ./data/t10k-images-idx3-ubyte.gz
Extracting ./data/t10k-labels-idx1-ubyte.gz
Set the training parameters
# Parameters
learning_rate = 0.01
training_epochs = 25
batch_size = 100
display_step = 1
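A quick sanity check on these numbers (assuming the standard MNIST split of 55,000 training images): with batch_size = 100, each epoch runs 55,000 / 100 = 550 gradient updates, so 25 epochs perform 13,750 updates in total.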
Build the model
# tf Graph Input
x = tf.placeholder(tf.float32, [None, 784]) # mnist data image of shape 28*28=784
y = tf.placeholder(tf.float32, [None, 10]) # 0-9 digits recognition => 10 classes
# Set model weights
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
# Construct model
pred = tf.nn.softmax(tf.matmul(x, W) + b) # Softmax
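To make the shapes explicit (implied by the placeholders above, not spelled out in the original): x is [None, 784], W is [784, 10], and b is [10], so tf.matmul(x, W) + b produces a [None, 10] matrix of class scores, and tf.nn.softmax normalizes each row into a probability distribution over the ten digits.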
Define the loss function (cross-entropy)
# Minimize error using cross entropy
cost = tf.reduce_mean(-tf.reduce_sum(y * tf.log(pred), axis=1))
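A side note not in the original: tf.log(pred) returns NaN whenever a predicted probability underflows to zero. A numerically safer TF 1.x variant (a sketch; logits and cost_stable are names I introduce here) computes the loss directly from the raw logits:

# Raw class scores, before the softmax
logits = tf.matmul(x, W) + b
# Fuses softmax and cross-entropy in a numerically stable way
cost_stable = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))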
Set the optimizer (SGD)
# Gradient Descent
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
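GradientDescentOptimizer implements the plain gradient-descent update $\theta \leftarrow \theta - \eta \, \nabla_{\theta} J(\theta)$ with step size $\eta$ = learning_rate; since each step is fed a batch of 100 examples rather than the full training set, the overall procedure is mini-batch stochastic gradient descent.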
Training
# Initialize the variables (i.e. assign their default value)
init = tf.global_variables_initializer()
# Start training
with tf.Session() as sess:
    sess.run(init)

    # Training cycle
    for epoch in range(training_epochs):
        avg_cost = 0.
        total_batch = int(mnist.train.num_examples/batch_size)
        # Loop over all batches
        for i in range(total_batch):
            batch_xs, batch_ys = mnist.train.next_batch(batch_size)
            # Fit training using batch data
            _, c = sess.run([optimizer, cost], feed_dict={x: batch_xs,
                                                          y: batch_ys})
            # Compute average loss
            avg_cost += c / total_batch
        # Display logs per epoch step
        if (epoch+1) % display_step == 0:
            print ("Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(avg_cost))

    print ("Optimization Finished!")

    # Test model
    correct_prediction = tf.equal(tf.argmax(pred, 1), tf.argmax(y, 1))
    # Calculate accuracy for 3000 examples
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    print ("Accuracy:", accuracy.eval({x: mnist.test.images[:3000], y: mnist.test.labels[:3000]}))

Epoch: 0001 cost= 1.184004518
Epoch: 0002 cost= 0.665706925
Epoch: 0003 cost= 0.552722147
Epoch: 0004 cost= 0.498915088
Epoch: 0005 cost= 0.465621155
Epoch: 0006 cost= 0.442525349
Epoch: 0007 cost= 0.425387941
Epoch: 0008 cost= 0.412269829
Epoch: 0009 cost= 0.401506593
Epoch: 0010 cost= 0.392485674
Epoch: 0011 cost= 0.384779438
Epoch: 0012 cost= 0.378020378
Epoch: 0013 cost= 0.372379096
Epoch: 0014 cost= 0.367407406
Epoch: 0015 cost= 0.362790742
Epoch: 0016 cost= 0.358502308
Epoch: 0017 cost= 0.354814395
Epoch: 0018 cost= 0.351517630
Epoch: 0019 cost= 0.348201127
Epoch: 0020 cost= 0.345595738
Epoch: 0021 cost= 0.342694492
Epoch: 0022 cost= 0.340168105
Epoch: 0023 cost= 0.338062184
Epoch: 0024 cost= 0.335568684
Epoch: 0025 cost= 0.333836752
Optimization Finished!
Accuracy: 0.8886667
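The model ends up at about 88.9% accuracy on the first 3,000 test images. For comparison, here is a minimal sketch of the same softmax-regression model in TensorFlow 2.x with tf.keras (my addition, not part of the original tutorial):

import tensorflow as tf

# Load MNIST and flatten each 28x28 image into a 784-dim vector
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# A single Dense layer with softmax is exactly multinomial logistic regression
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, batch_size=100, epochs=25)
model.evaluate(x_test, y_test)

With the same learning rate, batch size, and epoch count, this should land in the same accuracy range as the TF 1.x script above.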

References

李宏毅 (Hung-yi Lee), Machine Learning course.

TensorFlow-Examples: https://github.com/aymericdamien/TensorFlow-Examples
