
Question:


Consider the neural network shown below:

[Figure: network diagram — input x feeding a layer of hidden units (z₁, z₂, …), which feed a single output unit; edges are labeled with weights (w₁, w₂, …).]

The weight matrix, W, is: [1, 1, -1, 0.5, 1, 2]. Assume that the hidden layer uses ReLU and the output layer uses a Sigmoid activation function. Assume squared error loss, i.e.,

Loss = (y − ŷ)².

The input x = 4, and the target output y = 0.

Using this information, answer the questions below:

(Show all work, and all answers should be rounded to 2 decimal places OR POINTS WILL BE TAKEN OFF!)

(a) [2 points] Use forward propagation to compute the predicted output.

(b) [1 point] What is the loss or error value?

(c) [4 points] Using backpropagation, compute the gradient of the weight vector, that is, compute the partial derivative of the error with respect to all of the weights.

Expert Answer:


Part (a): Forward propagation. Compute the weighted sum of the inputs in the hidden layer: z₁ = 4 × 1 = 4, …
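The answer above is truncated, so here is a minimal Python sketch of all three parts. It assumes a topology the garbled figure does not fully confirm: one input x feeding three ReLU hidden units (weights 1, 1, -1) whose activations feed one sigmoid output unit (weights 0.5, 1, 2) — consistent with a six-element W and with z₁ = 4 × 1 = 4 in the answer fragment, but an assumption nonetheless.

```python
import math

# Assumed wiring (see lead-in): input->hidden weights w[0:3],
# hidden->output weights w[3:6].
w = [1.0, 1.0, -1.0, 0.5, 1.0, 2.0]
x, y = 4.0, 0.0

def relu(z):
    return max(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# (a) Forward propagation.
z = [w[i] * x for i in range(3)]                 # hidden pre-activations
a = [relu(zi) for zi in z]                       # hidden activations
z_out = sum(w[3 + i] * a[i] for i in range(3))   # output pre-activation
y_hat = sigmoid(z_out)                           # predicted output

# (b) Squared-error loss.
loss = (y - y_hat) ** 2

# (c) Backpropagation: chain rule through sigmoid and ReLU.
dL_dyhat = -2.0 * (y - y_hat)           # d/dŷ of (y - ŷ)²
dyhat_dzout = y_hat * (1.0 - y_hat)     # sigmoid derivative
delta_out = dL_dyhat * dyhat_dzout
grad = [0.0] * 6
for i in range(3):
    grad[3 + i] = delta_out * a[i]                 # output-layer weights
    drelu = 1.0 if z[i] > 0 else 0.0               # ReLU gate
    grad[i] = delta_out * w[3 + i] * drelu * x     # input-layer weights

print(round(y_hat, 2), round(loss, 2))
print([round(g, 2) for g in grad])
```

Note that under this assumed wiring the third hidden unit has z₃ = -4, so ReLU zeroes it out and the gradients through that unit (for w₃ and w₆) vanish.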
