# Let's get started!


```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

a = -2
f = sigmoid(a)
print(f)  # outputs 0.1192
```
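As a quick sanity check (my addition, not part of the original post), the analytic derivative `sigmoid(x) * (1 - sigmoid(x))` that appears later can be compared against a central-difference approximation:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

a = -2.0
h = 1e-5
# central-difference approximation of the derivative at a
numeric = (sigmoid(a + h) - sigmoid(a - h)) / (2 * h)
# analytic form: sigmoid(a) * (1 - sigmoid(a))
analytic = sigmoid(a) * (1 - sigmoid(a))
print(numeric, analytic)  # both ~0.105
```

The two values agree to many decimal places, which is why the analytic form is used in the update steps below.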

# Aim


```python
import numpy as np

def sigmoid(x):
    return 1. / (1 + np.exp(-x))

def derivative_sigmoid(x):
    return sigmoid(x) * (1 - sigmoid(x))

a = -2
h = 0.1
a = a + h * derivative_sigmoid(a)
f = sigmoid(a)
print(f)  # outputs 0.1203
```
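For reference, the derivative used above can be derived directly (a reconstruction here, since the original equations were embedded as images):

$$
\sigma(x) = \frac{1}{1+e^{-x}}, \qquad
\sigma'(x) = \frac{e^{-x}}{(1+e^{-x})^{2}} = \sigma(x)\,\bigl(1-\sigma(x)\bigr)
$$

Stepping $a$ in the direction of $\sigma'(a)$ therefore increases $f$, which is why the output rises from 0.1192 to 0.1203.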


```python
import numpy as np

def add(x, y):
    return x + y

def product(x, y):
    return x * y

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

a = 1
b = -2
c = -3
d = add(a, b)
e = product(c, d)
f = sigmoid(e)
print(f)  # outputs 0.952574
```
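Writing the circuit as $d = a + b$, $e = c \cdot d$, $f = \sigma(e)$, the chain rule gives the gradients of $f$ with respect to each input (reconstructed from the code, since the original equations were images):

$$
\frac{\partial f}{\partial a} = \sigma'(e)\cdot c, \qquad
\frac{\partial f}{\partial b} = \sigma'(e)\cdot c, \qquad
\frac{\partial f}{\partial c} = \sigma'(e)\cdot d
$$

Each factor is a local derivative: $\partial e/\partial d = c$, $\partial e/\partial c = d$, and $\partial d/\partial a = \partial d/\partial b = 1$.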


```python
import numpy as np

def add(x, y):
    return x + y

def product(x, y):
    return x * y

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def derivative_sigmoid(x):
    return sigmoid(x) * (1 - sigmoid(x))

# initialization
a = 1
b = -2
c = -3
# forward propagation
d = add(a, b)
e = product(c, d)
# step size
h = 0.1
# local derivatives
derivative_f_e = derivative_sigmoid(e)
derivative_e_d = c
derivative_e_c = d
derivative_d_a = 1
derivative_d_b = 1
# backward propagation (chain rule)
derivative_f_a = derivative_f_e * derivative_e_d * derivative_d_a
derivative_f_b = derivative_f_e * derivative_e_d * derivative_d_b
derivative_f_c = derivative_f_e * derivative_e_c
# update parameters
a = a + h * derivative_f_a
b = b + h * derivative_f_b
c = c + h * derivative_f_c
# forward propagation again with the updated parameters
d = add(a, b)
e = product(c, d)
f = sigmoid(e)
print(f)  # prints 0.9563
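To double-check the backward pass, the analytic gradients can be verified with a finite-difference gradient check (my addition; the `forward` helper below is a hypothetical wrapper around the same circuit):

```python
import numpy as np

def add(x, y):
    return x + y

def product(x, y):
    return x * y

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward(a, b, c):
    # full circuit: f = sigmoid(c * (a + b))
    return sigmoid(product(c, add(a, b)))

a, b, c = 1.0, -2.0, -3.0
eps = 1e-6
# numerical gradients via central differences
num_a = (forward(a + eps, b, c) - forward(a - eps, b, c)) / (2 * eps)
num_b = (forward(a, b + eps, c) - forward(a, b - eps, c)) / (2 * eps)
num_c = (forward(a, b, c + eps) - forward(a, b, c - eps)) / (2 * eps)
# analytic gradients from the chain rule
e = product(c, add(a, b))
s = sigmoid(e) * (1 - sigmoid(e))
ana_a = s * c          # df/da = sigma'(e) * c
ana_b = s * c          # df/db = sigma'(e) * c
ana_c = s * add(a, b)  # df/dc = sigma'(e) * d
print(num_a, ana_a)
```

The numerical and analytic values match closely, confirming the chain-rule derivatives used in the update step.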

# To be continued
