Today I'll show you how to run Jupyter Notebook on a server.
Windows 10 is not recommended for this.
mkvirtualenv -p /usr/bin/python3.6 deeplearn
workon deeplearn
pip install tensorflow
pip install jupyter
Then edit the Jupyter config file:
jupyter notebook --generate-config   # creates the file if it doesn't exist yet
vim ~/.jupyter/jupyter_notebook_config.py
Change the following settings:
c.NotebookApp.ip = '0.0.0.0'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 8888
c.NotebookApp.password = ''
Then turn off the firewall so the connection can get through:
root@VM-0-5-ubuntu:~# firewall-cmd --state
running
root@VM-0-5-ubuntu:~# systemctl stop firewalld.service
root@VM-0-5-ubuntu:~# systemctl disable firewalld.service
Synchronizing state of firewalld.service with SysV service script with /lib/systemd/systemd-sysv-install.
Executing: /lib/systemd/systemd-sysv-install disable firewalld
Removed /etc/systemd/system/dbus-org.fedoraproject.FirewallD1.service.
Now start jupyter notebook on the server.
Open it in a browser and it's up and running.
You can also access it as if it were local by setting up an SSH tunnel to the VM. The configuration is simple, and it then opens fine in a local browser.
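The tunnel step above can be sketched like this (the username and host below are placeholders, not from the original setup):

```shell
# Forward local port 8888 to port 8888 on the server,
# so Jupyter is reachable at http://localhost:8888 locally.
# "ubuntu@your-server-ip" is a placeholder login.
ssh -N -L 8888:localhost:8888 ubuntu@your-server-ip
```

With the tunnel open, the notebook behaves exactly as if it were running locally, and you don't need to expose port 8888 to the internet at all.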
With that done, let's get back to reviewing TensorFlow. TF has moved on to 2.0, which isn't backward compatible with 1.x, but 2.0 is simpler to use.
Website: http://www.tensorfly.cn/
# TensorFlow
import tensorflow as tf
print(tf.__version__)
2.0.0
# Load the MNIST handwritten-digit dataset
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Show the first training image
from matplotlib import pyplot as plt
%matplotlib inline
plt.imshow(x_train[0])
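As a reminder of what `load_data()` returns: the training images are a uint8 array of shape (60000, 28, 28) with pixel values 0–255. A quick sketch using a synthetic stand-in array of the same shape and dtype (MNIST itself needs a download):

```python
import numpy as np

# Stand-in for mnist x_train: 60000 grayscale 28x28 images, uint8 pixels 0..255
x_train = np.zeros((60000, 28, 28), dtype=np.uint8)

print(x_train.shape)  # (60000, 28, 28)
print(x_train.dtype)  # uint8
```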
# Scale pixel values from 0-255 down to 0-1
x_train, x_test = x_train / 255.0, x_test / 255.0
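Why divide by 255: the raw pixels are uint8 in the range 0–255, and true division promotes them to floats in [0, 1], which keeps the network's inputs in a small, well-behaved range. A quick NumPy check:

```python
import numpy as np

pixels = np.array([0, 128, 255], dtype=np.uint8)
scaled = pixels / 255.0  # true division promotes uint8 to float64

print(scaled.min(), scaled.max())  # 0.0 1.0
```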
# Build the model with the tf.keras API
model = tf.keras.models.Sequential([
    # (28, 28) image -> flat vector of 784 values
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    # fully connected layer with 128 neurons
    tf.keras.layers.Dense(128, activation='relu'),
    # dropout to reduce overfitting
    tf.keras.layers.Dropout(0.2),
    # classify into 10 classes
    tf.keras.layers.Dense(10, activation='softmax')
])

# sparse_categorical_crossentropy: multi-class loss for integer labels
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
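Each Dense layer holds a weight matrix of shape (inputs × units) plus one bias per unit; Flatten and Dropout have no parameters. For the layers above, the total parameter count works out with plain arithmetic, no TF needed:

```python
# Flatten: 28*28 = 784 inputs, no parameters of its own
flat = 28 * 28

# Dense(128): 784x128 weight matrix plus 128 biases
d1 = flat * 128 + 128   # 100480

# Dropout: no parameters; Dense(10): 128x10 weights plus 10 biases
d2 = 128 * 10 + 10      # 1290

print(d1 + d2)          # 101770 trainable parameters
```

This matches what `model.summary()` would report for this architecture.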
# Train for 5 epochs, then evaluate on the test set
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test, verbose=2)
# [0.07285336476690137, 0.9783]
Train on 60000 samples
Epoch 1/5
60000/60000 [==============================] - 8s 132us/sample - loss: 0.3024 - accuracy: 0.9125
Epoch 2/5
60000/60000 [==============================] - 6s 92us/sample - loss: 0.1457 - accuracy: 0.9564
Epoch 3/5
60000/60000 [==============================] - 5s 88us/sample - loss: 0.1095 - accuracy: 0.9672
Epoch 4/5
60000/60000 [==============================] - 5s 88us/sample - loss: 0.0900 - accuracy: 0.9730
Epoch 5/5
60000/60000 [==============================] - 6s 92us/sample - loss: 0.0745 - accuracy: 0.9769
10000/1 - 1s - loss: 0.0372 - accuracy: 0.9783
[0.07285336476690137, 0.9783]
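The loss reported above, sparse_categorical_crossentropy, takes integer labels and is just the negative log of the softmax probability assigned to the true class, averaged over examples. A hand-rolled NumPy version (my own sketch, not the Keras implementation):

```python
import numpy as np

def sparse_cce(y_true, y_pred):
    """y_true: integer class labels; y_pred: rows of softmax probabilities."""
    # Pick out the probability of the correct class for each example,
    # then average the negative logs.
    return -np.mean(np.log(y_pred[np.arange(len(y_true)), y_true]))

probs = np.array([[0.1, 0.8, 0.1],    # model is fairly sure: class 1
                  [0.3, 0.3, 0.4]])   # model is unsure: class 2
labels = np.array([1, 2])

print(round(sparse_cce(labels, probs), 4))  # 0.5697
```

Confident correct predictions (probability near 1 on the true class) drive the loss toward 0, which is why the training loss falls as accuracy climbs in the log above.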