Derivation for a 3-layer neural network: work out its parameter update rules, and count the parameters each layer needs.
Note: the figures in this summary come from 周志华 (Zhou Zhihua)'s lecture slides.
A single neuron node

The figure shows how the inputs feed into a single hidden-layer node.

A complete neural network structure looks like this:
Given a dataset $\{(x_{1},y_{1}),(x_{2},y_{2}),\dots,(x_{n},y_{n})\}$,
the output of the $j$-th output-layer node for the $k$-th sample is:

$$\hat{y}^{k}_{j}$$
Then, for a single sample, the overall squared error is:

$$E_{k} = \frac{1}{2} \sum_{j=1}^{l} (\hat{y}^{k}_{j} - y^{k}_{j})^{2}$$
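This per-sample error is easy to sketch in Python (the function name here is illustrative, not from the original):

```python
# Half sum-of-squares error for one sample:
# E_k = 1/2 * sum_j (y_hat_j - y_j)^2
def sample_error(y_hat, y):
    return 0.5 * sum((yh - yt) ** 2 for yh, yt in zip(y_hat, y))
```

For example, `sample_error([0.9, 0.1], [1.0, 0.0])` gives $\frac{1}{2}(0.01 + 0.01) = 0.01$.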
The updates are based on gradient descent. The activation function is denoted $f$; here $f$ is a placeholder symbol that can stand for any suitable activation function, but this post fixes it to the sigmoid, $f(x) = \frac{1}{1+e^{-x}}$. The learning rate is $\eta$.
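The sigmoid and its derivative identity $f'(x) = f(x)\,(1-f(x))$, which is what makes the algebra below compact, can be written as (a minimal sketch; names are illustrative):

```python
import math

# Sigmoid activation f(x) = 1 / (1 + e^{-x}).
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Its derivative satisfies f'(x) = f(x) * (1 - f(x)).
def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)
```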
The weights $w$ and $v$ are updated in the order $w$ first, then $v$: the weights closer to the output layer are updated first, and $w$ is the hidden-to-output weight.
$$w \leftarrow w + \Delta w$$

$$v \leftarrow v + \Delta v$$
Here, $\Delta w_{hj} = -\eta \frac{\partial E_{k}}{\partial w_{hj}}$.
Since $w_{hj}$ first affects the input $\beta_{j}$ of the $j$-th output-layer neuron, then its output $\hat{y}^{k}_{j}$, and finally $E_{k}$,
by the chain rule,

$$\frac{\partial E_{k}}{\partial w_{hj}} = \frac{\partial E_{k}}{\partial \hat{y}^{k}_{j}} \cdot \frac{\partial \hat{y}^{k}_{j}}{\partial \beta_{j}} \cdot \frac{\partial \beta_{j}}{\partial w_{hj}}$$
Also, since $\beta_{j} = \sum_{h=1}^{q} w_{hj} b_{h}$,

$$\frac{\partial \beta_{j}}{\partial w_{hj}} = b_{h}$$
Define

$$g_{j} = -\frac{\partial E_{k}}{\partial \hat{y}^{k}_{j}} \cdot \frac{\partial \hat{y}^{k}_{j}}{\partial \beta_{j}}$$
Then, using the sigmoid identity $f'(x) = f(x)(1-f(x))$,

$$g_{j} = -(\hat{y}^{k}_{j} - y^{k}_{j}) f'(\beta_{j} - \theta_{j}) = \hat{y}^{k}_{j} (1-\hat{y}^{k}_{j}) (y^{k}_{j} - \hat{y}^{k}_{j})$$
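The output-layer term $g_{j}$ can be computed directly from the network output and the target (a sketch, assuming a sigmoid output unit; the function name is made up for illustration):

```python
# g_j = y_hat_j * (1 - y_hat_j) * (y_j - y_hat_j),
# valid when the output unit uses the sigmoid activation.
def output_gradient(y_hat_j, y_j):
    return y_hat_j * (1.0 - y_hat_j) * (y_j - y_hat_j)
```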
Further,

$$\frac{\partial E_{k}}{\partial w_{hj}} = -g_{j} \cdot b_{h}$$
Therefore,

$$\Delta w_{hj} = \eta \cdot g_{j} \cdot b_{h}$$
Update:

$$w_{hj} \leftarrow w_{hj} + \eta \cdot g_{j} \cdot b_{h}$$
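The update above is a one-line gradient step (illustrative names; this is just the formula transcribed):

```python
# w_hj <- w_hj + eta * g_j * b_h
def update_w(w_hj, eta, g_j, b_h):
    return w_hj + eta * g_j * b_h
```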
The update rule for the threshold $\theta$:
$$\theta \leftarrow \theta + \Delta \theta$$
Here,

$$\Delta \theta_{j} = -\eta \frac{\partial E_{k}}{\partial \theta_{j}}$$
Since $\hat{y}^{k}_{j} = f(\beta_{j} - \theta_{j})$,

$$\frac{\partial E_{k}}{\partial \theta_{j}} = \frac{\partial E_{k}}{\partial \hat{y}^{k}_{j}} \frac{\partial \hat{y}^{k}_{j}}{\partial \theta_{j}}$$
Further,

$$\frac{\partial E_{k}}{\partial \theta_{j}} = \frac{1}{2} \cdot 2 \cdot (\hat{y}^{k}_{j} - y^{k}_{j}) \cdot \hat{y}^{k}_{j} \cdot (-1) \cdot (1-\hat{y}^{k}_{j}) = -\hat{y}^{k}_{j} (1 - \hat{y}^{k}_{j}) (\hat{y}^{k}_{j} - y^{k}_{j}) = g_{j}$$
Therefore $\Delta \theta_{j} = -\eta g_{j}$ (note the minus sign: the threshold moves opposite to the weights), and the update is

$$\theta_{j} \leftarrow \theta_{j} - \eta \cdot \hat{y}^{k}_{j} (1 - \hat{y}^{k}_{j}) (y^{k}_{j} - \hat{y}^{k}_{j})$$
The update rule:

$$v \leftarrow v + \left( -\eta \frac{\partial E_{k}}{\partial v} \right) = v + \Delta v$$
For

$$\Delta v_{ih} = -\eta \frac{\partial E_{k}}{\partial v_{ih}}$$
further,

$$\frac{\partial E_{k}}{\partial v_{ih}} = \sum^{l}_{j=1} \frac{\partial E_{k}}{\partial \hat{y}^{k}_{j}} \frac{\partial \hat{y}^{k}_{j}}{\partial b_{h}} \frac{\partial b_{h}}{\partial v_{ih}}$$
From

$$\frac{\partial \hat{y}^{k}_{j}}{\partial b_{h}} = \frac{\partial \hat{y}^{k}_{j}}{\partial \beta_{j}} \frac{\partial \beta_{j}}{\partial b_{h}} = \hat{y}^{k}_{j} (1-\hat{y}^{k}_{j}) w_{hj}$$
and, with $b_{h} = f(\alpha_{h} - \gamma_{h})$ where $\alpha_{h} = \sum_{i=1}^{d} v_{ih} x_{i}$, so that $\frac{\partial b_{h}}{\partial v_{ih}} = b_{h}(1-b_{h}) x_{i}$, it follows that

$$\frac{\partial E_{k}}{\partial v_{ih}} = -x_{i} \, b_{h}(1 - b_{h}) \sum_{j=1}^{l} w_{hj} \hat{y}^{k}_{j}(1-\hat{y}^{k}_{j})(y^{k}_{j} - \hat{y}^{k}_{j})$$
The update for $v$ is:

$$v_{ih} \leftarrow v_{ih} + \eta \, x_{i} \, b_{h}(1 - b_{h}) \sum_{j=1}^{l} w_{hj} \hat{y}^{k}_{j}(1-\hat{y}^{k}_{j})(y^{k}_{j} - \hat{y}^{k}_{j})$$
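Putting the update rules together, a single-sample training step might look like the sketch below (pure Python, illustrative names following the derivation's notation: `v` is the $d \times q$ input-to-hidden weights, `w` the $q \times l$ hidden-to-output weights, `gamma`/`theta` the hidden/output thresholds; the `gamma` update is not derived above but mirrors the `theta` one symmetrically, which is an assumption of this sketch):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_step(x, y, v, w, gamma, theta, eta):
    """One gradient-descent step on a single sample (x, y); mutates the
    parameters in place and returns the pre-update prediction y_hat."""
    d, q, l = len(x), len(gamma), len(theta)
    # Forward pass: b_h = f(alpha_h - gamma_h), y_hat_j = f(beta_j - theta_j).
    b = [sigmoid(sum(v[i][h] * x[i] for i in range(d)) - gamma[h])
         for h in range(q)]
    y_hat = [sigmoid(sum(w[h][j] * b[h] for h in range(q)) - theta[j])
             for j in range(l)]
    # Output-layer terms g_j and hidden-layer terms e_h.
    g = [y_hat[j] * (1 - y_hat[j]) * (y[j] - y_hat[j]) for j in range(l)]
    e = [b[h] * (1 - b[h]) * sum(w[h][j] * g[j] for j in range(l))
         for h in range(q)]
    # Updates; thresholds move opposite to the weights.
    for h in range(q):
        for j in range(l):
            w[h][j] += eta * g[j] * b[h]
    for j in range(l):
        theta[j] -= eta * g[j]
    for i in range(d):
        for h in range(q):
            v[i][h] += eta * e[h] * x[i]
    for h in range(q):
        gamma[h] -= eta * e[h]  # assumed symmetric to the theta update
    return y_hat
```

Repeatedly calling `backprop_step` on the same sample should drive the squared error down, which is a quick sanity check of the signs derived above.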
The parameters (with $d$ input nodes, $q$ hidden nodes, and $l$ output nodes):

- weights $v_{ih}$: $d \times q$; weights $w_{hj}$: $q \times l$
- hidden-layer thresholds: $q$; output-layer thresholds: $l$

Total: $(d+l+1) \cdot q + l$
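The count is a one-liner worth sanity-checking against the individual terms (illustrative helper name):

```python
# Total parameters of a d-q-l network:
# d*q + q*l weights, plus q + l thresholds = (d + l + 1)*q + l.
def param_count(d, q, l):
    return (d + l + 1) * q + l
```

For instance, a 2-3-1 network has $2{\cdot}3 + 3{\cdot}1 + 3 + 1 = 13$ parameters.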