[SOLVED] EEL5840-Homework 5

  1. Consider the Neural Network below.

All weights are initialized to the values shown (there are no biases, for simplicity). Consider the data point x = [1, 1]^T with desired output vector d = [1, 0]^T. Complete one iteration of backpropagation by hand, assuming a learning rate of η = 0.1. What are all the weight values after the one backpropagation iteration? Show your work. Use the following activation function:

(1)
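The hand computation above can be cross-checked numerically. The network diagram and equation (1) are not reproduced in this listing, so the sketch below assumes a fully connected 2-2-2 network with no biases, a logistic sigmoid activation, and placeholder initial weights of 0.5; substitute the actual diagram's values before comparing against your hand calculation.

```python
import numpy as np

# Assumed setup: 2-2-2 network, no biases, logistic sigmoid activation.
# The 0.5 initial weights are placeholders for the diagram's values.
def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

x = np.array([1.0, 1.0])    # input vector
d = np.array([1.0, 0.0])    # desired output vector
eta = 0.1                   # learning rate

W1 = np.full((2, 2), 0.5)   # hidden-layer weights (placeholder)
W2 = np.full((2, 2), 0.5)   # output-layer weights (placeholder)

# Forward pass
v1 = W1 @ x                 # hidden induced local fields
y1 = sigmoid(v1)            # hidden outputs
v2 = W2 @ y1                # output induced local fields
o = sigmoid(v2)             # network outputs

# Backward pass, squared-error cost E = 1/2 * sum((d - o)^2)
delta2 = (d - o) * o * (1 - o)            # output local gradients
delta1 = (W2.T @ delta2) * y1 * (1 - y1)  # hidden local gradients

# Delta-rule updates: w <- w + eta * delta * (input to that weight)
W2 += eta * np.outer(delta2, y1)
W1 += eta * np.outer(delta1, x)
```

Since d = [1, 0]^T, one iteration pushes the weights into the first output neuron up and those into the second output neuron down, which is a quick sanity check on the signs in your hand calculation.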

  2. Derive the update equation for output-layer neurons if the activation function used is the hyperbolic tangent, ϕ(v) = tanh(v) (instead of the activation function used in the notes). Show your work.
  3. Derive the update equation for output-layer neurons if the activation function used is the softmax function,

ϕ(v_i) = e^{v_i} / Σ_{j=1}^{O} e^{v_j},

where O is the number of output neurons and v_i is the induced local field of the ith neuron. Note: in this case the output is multi-dimensional (i.e., d ∈ R^O) and there is a specific d_i for each output neuron. Show your work.
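For the tanh question, a sketch of the derivation, assuming the squared-error cost and the usual delta-rule notation (δ_j the local gradient of output neuron j, y_i the input carried by weight w_ji):

```latex
% Output-layer update with \varphi(v) = \tanh(v).
% Assumes squared-error cost E = \tfrac12 \sum_j (d_j - o_j)^2.
\[
  \varphi'(v_j) = 1 - \tanh^2(v_j) = 1 - o_j^2
\]
\[
  \delta_j = -\frac{\partial E}{\partial v_j} = (d_j - o_j)\bigl(1 - o_j^2\bigr)
\]
\[
  w_{ji} \leftarrow w_{ji} + \eta\,\delta_j\,y_i
\]
```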
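For the softmax question, each output o_i depends on every induced local field v_j, so the chain rule sums over all outputs. A sketch, again assuming the squared-error cost (δ_{ij} below is the Kronecker delta, distinct from the local gradient δ_j):

```latex
% Softmax Jacobian: each output depends on every local field.
\[
  \frac{\partial o_i}{\partial v_j} = o_i\,(\delta_{ij} - o_j)
\]
% Local gradient under E = \tfrac12 \sum_i (d_i - o_i)^2:
\[
  \delta_j = -\frac{\partial E}{\partial v_j}
           = \sum_{i=1}^{O} (d_i - o_i)\,o_i\,(\delta_{ij} - o_j)
           = o_j\Bigl[(d_j - o_j) - \sum_{i=1}^{O} (d_i - o_i)\,o_i\Bigr]
\]
\[
  w_{ji} \leftarrow w_{ji} + \eta\,\delta_j\,y_i
\]
```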