# Andrew Ng Deep Learning Course Assignments, Part 2

4 – Neural Network model

Logistic regression did not work well on the “flower dataset”. You are going to train a Neural Network with a single hidden layer.

Reminder: The general methodology to build a Neural Network is to:

1. Define the neural network structure (# of input units, # of hidden units, etc.).

2. Initialize the model's parameters.

3. Loop:

– Implement forward propagation.

– Compute the loss.

– Implement backward propagation to get the gradients.

You often build helper functions to compute steps 1-3 and then merge them into one model function. Once you've built the model and learnt the right parameters, you can make predictions on new data.

4.1 – Defining the neural network structure

Exercise: Define three variables:

– n_x: the size of the input layer

– n_h: the size of the hidden layer (set this to 4)

– n_y: the size of the output layer

Hint: Use shapes of X and Y to find n_x and n_y. Also, hard code the hidden layer size to be 4.
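As a minimal sketch of how these three variables might be defined (assuming, as is conventional in this course, that `X` has shape `(n_x, m)` and `Y` has shape `(n_y, m)`, with `m` examples stacked as columns; the function name `layer_sizes` is illustrative):

```python
import numpy as np

def layer_sizes(X, Y):
    """Return the sizes of the input, hidden, and output layers.

    X -- input data of shape (n_x, m)
    Y -- labels of shape (n_y, m)
    """
    n_x = X.shape[0]  # size of the input layer
    n_h = 4           # hidden layer size, hard-coded to 4 as instructed
    n_y = Y.shape[0]  # size of the output layer
    return (n_x, n_h, n_y)

# Example: 2 input features, 5 examples, 1 output unit
X = np.random.randn(2, 5)
Y = np.random.randn(1, 5)
print(layer_sizes(X, Y))  # (2, 4, 1)
```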

4.2 – Initialize the model’s parameters

Exercise: Implement the function.

Instructions:

– Make sure your parameters' sizes are right. Refer to the neural network figure above if needed.

– You will initialize the weights matrices with random values.

– Use `np.random.randn(a, b)` to randomly initialize a matrix of shape (a, b).

– You will initialize the bias vectors as zeros.

– Use `np.zeros((a, b))` to initialize a matrix of shape (a, b) with zeros.
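Putting the instructions above together, one possible sketch (the function name `initialize_parameters`, the dictionary keys, and the 0.01 scaling factor — a common choice to keep initial activations small — are assumptions, not prescribed by the text):

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    """Initialize weights with small random values and biases with zeros."""
    W1 = np.random.randn(n_h, n_x) * 0.01  # weights for layer 1: (n_h, n_x)
    b1 = np.zeros((n_h, 1))                # bias for layer 1:    (n_h, 1)
    W2 = np.random.randn(n_y, n_h) * 0.01  # weights for layer 2: (n_y, n_h)
    b2 = np.zeros((n_y, 1))                # bias for layer 2:    (n_y, 1)
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}
```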

4.3 – The Loop

Question: Implement forward propagation.

Instructions:

– Look above at the mathematical representation of your classifier.

– You can use the function sigmoid(). It is built-in (imported) in the notebook.

– You can use the function np.tanh(). It is part of the numpy library.

– The steps you have to implement are:

1. Retrieve each parameter from the dictionary "parameters" (which is the output of the parameter initialization in Section 4.2) by using dictionary indexing, e.g. `parameters["W1"]`.

2. Implement forward propagation. Compute Z[1], A[1], Z[2] and A[2] (the vector of all your predictions on all the examples in the training set).

– Values needed in the backpropagation are stored in "cache". The cache will be given as an input to the backpropagation function.
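The forward-propagation steps above could be sketched as follows (assuming the tanh hidden layer and sigmoid output described in the instructions; the function and cache key names are illustrative, and `sigmoid` is defined here since the notebook's imported version is not shown):

```python
import numpy as np

def sigmoid(z):
    """Numerically plain sigmoid activation."""
    return 1 / (1 + np.exp(-z))

def forward_propagation(X, parameters):
    """Compute Z1, A1, Z2, A2 and cache the values needed for backprop."""
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]

    Z1 = W1 @ X + b1   # pre-activation of the hidden layer, shape (n_h, m)
    A1 = np.tanh(Z1)   # tanh activation of the hidden layer
    Z2 = W2 @ A1 + b2  # pre-activation of the output layer, shape (n_y, m)
    A2 = sigmoid(Z2)   # predictions for all m examples

    cache = {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}
    return A2, cache
```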

Exercise: Implement a function to compute the value of the cost J.

Instructions:

– There are many ways to implement the cross-entropy loss. To help you, we give you how we would have implemented the term ∑_{i=1}^{m} y^{(i)} log(a^{[2](i)}).

Question: Implement the update rule. Use gradient descent. You have to use (dW1, db1, dW2, db2) in order to update (W1, b1, W2, b2).

θ = θ − α ∂J/∂θ, where α is the learning rate and θ represents a parameter.
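Applying that rule to each of the four parameters gives a sketch like the following (the `grads` key names match the gradients listed above; the function name and the default learning rate of 1.2 are assumptions — the right rate depends on the problem, as the illustration below shows):

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate=1.2):
    """One gradient-descent step: theta = theta - alpha * dJ/dtheta."""
    return {
        "W1": parameters["W1"] - learning_rate * grads["dW1"],
        "b1": parameters["b1"] - learning_rate * grads["db1"],
        "W2": parameters["W2"] - learning_rate * grads["dW2"],
        "b2": parameters["b2"] - learning_rate * grads["db2"],
    }
```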

Illustration: The gradient descent algorithm with a good learning rate (converging) and a bad learning rate (diverging). Images courtesy of Adam Harley.

• Original link: https://kuaibao.qq.com/s/20180711G0ET7P00?refer=cp_1026
• Republished via Tencent Cloud+ Community, an official Tencent content platform (Penguin) account.
