
# TensorFlow and Keras: A Simple Example Tutorial for Beginners

Each categorical feature in our weather dataset is encoded as a number:

| Feature | Encoding |
| --- | --- |
| Outlook | Sunny = 1, Overcast = 0, Rain = -1 |
| Humidity | Normal = 0, High = 1 |
| Wind | Weak = 0, Strong = 1 |
| Play (label) | Yes = 1, No = 0 |
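These encodings can be expressed as plain Python lookup tables; a minimal sketch (the `encode` helper is illustrative, not from the original article):

```python
# Lookup tables for the categorical encodings above.
OUTLOOK = {"Sunny": 1, "Overcast": 0, "Rain": -1}
HUMIDITY = {"Normal": 0, "High": 1}
WIND = {"Weak": 0, "Strong": 1}
PLAY = {"Yes": 1, "No": 0}

def encode(outlook, humidity, wind):
    """Turn one weather observation into a numeric [outlook, humidity, wind] vector."""
    return [OUTLOOK[outlook], HUMIDITY[humidity], WIND[wind]]

print(encode("Sunny", "High", "Weak"))  # first training row: [1, 1, 0]
```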

The 14 training examples, with inputs `[outlook, humidity, wind]` and output `[play]`:

| Outlook | Humidity | Wind | Play |
| --- | --- | --- | --- |
| 1 | 1 | 0 | 0 |
| 1 | 1 | 1 | 0 |
| 0 | 1 | 0 | 1 |
| -1 | 1 | 0 | 1 |
| -1 | 0 | 0 | 1 |
| -1 | 0 | 1 | 0 |
| 0 | 0 | 1 | 1 |
| 1 | 1 | 0 | 0 |
| 1 | 0 | 0 | 1 |
| -1 | 0 | 0 | 1 |
| 1 | 0 | 1 | 1 |
| 0 | 1 | 1 | 1 |
| 0 | 0 | 0 | 1 |
| -1 | 1 | 1 | 0 |

First, import TensorFlow and Keras:

```python
import tensorflow as tf
from tensorflow import keras
```

1. First, we define the machine learning model and create the input layer. The Python code is as follows:

```python
model = keras.Sequential()
input_layer = keras.layers.Dense(3, input_shape=[3], activation='tanh')
model.add(input_layer)
```

2. Now let's create the output layer.
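The code for this step is missing from the text. A plausible sketch, assuming a single sigmoid unit (an assumption, but it matches the 0/1 labels and the predictions between 0 and 1 shown later):

```python
from tensorflow import keras

# Rebuild the model from step 1 so this snippet runs on its own.
model = keras.Sequential()
model.add(keras.layers.Dense(3, input_shape=[3], activation='tanh'))

# Output layer: one sigmoid unit, so each prediction lands in (0, 1)
# and can be compared against the 0/1 "play" label.
output_layer = keras.layers.Dense(1, activation='sigmoid')
model.add(output_layer)
```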

3. Now we need to set up the optimizer.
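The snippet defining `gd` is also missing, although the compile call below uses it. A plausible sketch using plain gradient descent (SGD); the learning rate here is a guess, not from the article:

```python
import tensorflow as tf

# Plain gradient-descent optimizer; the name `gd` matches the compile
# call that follows. learning_rate=0.1 is an assumption.
gd = tf.keras.optimizers.SGD(learning_rate=0.1)
```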

4. Now we compile the machine learning model.

```python
model.compile(optimizer=gd, loss='mean_squared_error')
```
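Mean squared error simply averages the squared differences between predictions and targets; a small worked example (the numbers are invented for illustration):

```python
import numpy as np

y_true = np.array([0.0, 1.0, 1.0])   # target "play" labels
y_pred = np.array([0.2, 0.7, 0.9])   # model scores
mse = np.mean((y_true - y_pred) ** 2)
print(mse)  # (0.04 + 0.09 + 0.01) / 3 = 0.0466...
```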

5. Prepare our training data. We format the data as TensorFlow variables; you could also format them as NumPy arrays.

```python
x = tf.Variable([[1,1,0],[1,1,1],[0,1,0],[-1,1,0],[-1,0,0],[-1,0,1],[0,0,1],
                 [1,1,0],[1,0,0],[-1,0,0],[1,0,1],[0,1,1],[0,0,0],[-1,1,1]])
y = tf.Variable([[0],[0],[1],[1],[1],[0],[1],[0],[1],[1],[1],[1],[1],[0]])
```

6. Next, we fit our data.

```python
model.fit(x, y, epochs=1000, steps_per_epoch=10)
```

7. After training, we can test the model on the existing training inputs and compare its predictions against the training outputs.

```python
results = model.predict(x, verbose=0, steps=1)
print(results)
```

```
[[0.15947242]
 [0.2009128 ]
 [0.6640448 ]
 [0.74036646]
 [0.82375276]
 [0.8218212 ]
 [0.8796288 ]
 [0.15947242]
 [0.817388  ]
 [0.82375276]
 [0.8464085 ]
 [0.7018454 ]
 [0.8803827 ]
 [0.75258946]]
```

After further training, the predictions move much closer to the 0/1 targets:

```
[[0.04279386]
 [0.05731023]
 [0.97331977]
 [0.9714603 ]
 [0.9792363 ]
 [0.06504408]
 [0.9524578 ]
 [0.04279386]
 [0.9649155 ]
 [0.9792363 ]
 [0.958375  ]
 [0.9327749 ]
 [0.9992924 ]
 [0.06461356]]
```

We can also run a prediction on a single input:

```python
test = tf.Variable([[1,0,0]])
result = model.predict(test, verbose=0, steps=1)
```
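The raw score is a number between 0 and 1; to map it back to a Yes/No answer you can threshold it. The 0.5 cutoff here is a common convention, not something the article specifies:

```python
def to_label(score, threshold=0.5):
    """Map a model score in (0, 1) back to the Play labels."""
    return "Yes" if score >= threshold else "No"

print(to_label(0.9649155))  # the score for [1,0,0] above -> Yes
print(to_label(0.0427938))  # -> No
```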

Finally, we save the trained weights to a file:

```python
model.save_weights('model_weights.h5')
```

Depending on how your TensorFlow binary was built, you may also see a log message like:

```
I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
```

Original link: https://kuaibao.qq.com/s/20181008A1B24V00?refer=cp_1026
