Neural Networks, Part 2

1: Inputs and Outputs

1.1 Weight w(i,j) — the connection strength from neuron i in one layer to neuron j in the next.

1.2 Threshold (bias) B — shifts a neuron's activation; in the code below it is folded into the weight matrices as an extra input fixed at 1.

1.3 Activation function — introduces non-linearity; this post uses tanh.

```python
import numpy as np

def tanh(x):
    return np.tanh(x)

def tanh_deriv(x):
    # derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) * np.tanh(x)
```
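As a quick sanity check (not part of the original post), the analytic derivative tanh_deriv can be compared against a central finite-difference approximation:

```python
import numpy as np

def tanh_deriv(x):
    return 1.0 - np.tanh(x) * np.tanh(x)

# the analytic derivative should match the numeric slope at any point
h = 1e-6
for x in [-2.0, -0.5, 0.0, 1.5]:
    numeric = (np.tanh(x + h) - np.tanh(x - h)) / (2 * h)
    assert abs(numeric - tanh_deriv(x)) < 1e-6
```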

```python
def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def logistic_derivative(x):
    return logistic(x) * (1.0 - logistic(x))

class NeuralNetwork:
    def __init__(self, layers, activation='tanh'):
        if activation == 'logistic':
            self.activation = logistic
            self.activation_deriv = logistic_derivative
        elif activation == 'tanh':
            self.activation = tanh
            self.activation_deriv = tanh_deriv

        self.weights = []
        # hidden layers: +1 on both sides to carry a bias unit forward
        for i in range(1, len(layers) - 1):
            self.weights.append(
                (2 * np.random.random((layers[i - 1] + 1, layers[i] + 1)) - 1) * 0.25)
        # output layer: bias on the input side only
        self.weights.append(
            (2 * np.random.random((layers[-2] + 1, layers[-1])) - 1) * 0.25)
```

self.weights stores both the weights and the thresholds (biases) of these neurons: the extra +1 in each matrix dimension makes room for a bias unit. The for loop initializes the hidden-layer weight matrices with small random numbers, and the final append after the loop adds the output-layer weights. Together this forms a simple neural-network model.
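As a concrete illustration (the layer sizes here are just an example, not from the original post), for layers = [2, 3, 1] the loop produces one hidden weight matrix and the final append adds the output matrix:

```python
import numpy as np

layers = [2, 3, 1]  # example: 2 inputs, 3 hidden units, 1 output
weights = []
for i in range(1, len(layers) - 1):
    # (inputs + bias) x (hidden units + bias slot)
    weights.append((2 * np.random.random((layers[i - 1] + 1, layers[i] + 1)) - 1) * 0.25)
# (hidden units + bias) x outputs
weights.append((2 * np.random.random((layers[-2] + 1, layers[-1])) - 1) * 0.25)

print([w.shape for w in weights])  # → [(3, 4), (4, 1)]
```

The +1s are what make room for the bias units described in section 1.2.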

```python
    def predict(self, x):
        x = np.array(x)
        # append a constant 1 to the input to act as the bias term
        temp = np.ones(x.shape[0] + 1)
        temp[0:-1] = x
        a = temp
        # propagate through each layer: weighted sum, then activation
        for l in range(0, len(self.weights)):
            a = self.activation(np.dot(a, self.weights[l]))
        return a
```
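To see how the bias-augmented vector flows through predict, here is a minimal standalone trace (a sketch using the same [2, 3, 1] shapes as above; the weight values are arbitrary):

```python
import numpy as np

np.random.seed(0)
w0 = np.random.random((3, 4))  # hidden weights: 2 inputs + bias -> 3 units + bias slot
w1 = np.random.random((4, 1))  # output weights: 3 hidden + bias -> 1 output

x = np.array([0, 1])
temp = np.ones(x.shape[0] + 1)  # [x0, x1, 1]: the trailing 1 is the bias input
temp[0:-1] = x
a = np.tanh(np.dot(temp, w0))   # hidden activations, shape (4,)
out = np.tanh(np.dot(a, w1))    # network output, shape (1,)
print(temp.shape, a.shape, out.shape)  # → (3,) (4,) (1,)
```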

```python
import numpy as np

def tanh(x):
    return np.tanh(x)

def tanh_deriv(x):
    return 1.0 - np.tanh(x) * np.tanh(x)

class NeuralNetwork:
    def __init__(self, layers):
        self.activation = tanh
        self.activation_deriv = tanh_deriv

        self.weights = []
        for i in range(1, len(layers) - 1):
            self.weights.append(
                (2 * np.random.random((layers[i - 1] + 1, layers[i] + 1)) - 1) * 0.25)
        self.weights.append(
            (2 * np.random.random((layers[-2] + 1, layers[-1])) - 1) * 0.25)

    def predict(self, x):
        x = np.array(x)
        temp = np.ones(x.shape[0] + 1)
        temp[0:-1] = x
        a = temp
        for l in range(0, len(self.weights)):
            a = self.activation(np.dot(a, self.weights[l]))
        return a

# XOR inputs and targets; with no training yet, the printed outputs are
# just the random-weight network's guesses
nn = NeuralNetwork([2, 3, 1])
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])
for i in [[0, 0], [0, 1], [1, 0], [1, 1]]:
    print(i, nn.predict(i))
```
