Question:

# Your task is to replace the ??? placeholders with working code.
import numpy as np

# First we set the state of the network
sigma = np.tanh
w1 = 1.3   # example weight (assumed value)
b1 = -0.1  # example bias (assumed value)

# Then we define the neuron activation.
def a1(a0):
    z = w1 * a0 + b1
    return sigma(z)

# Experiment with different values of x below.
x = 0  # example input (assumed value)
print(a1(x))
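As a quick check that the neuron behaves sensibly, the sketch below sweeps a few arbitrary inputs; the test values are assumptions, not part of the exercise. Since sigma is tanh, every output lies in (-1, 1).

# Evaluate the neuron on a few arbitrary test inputs (assumed values).
for test_x in [-2.0, -1.0, 0.0, 1.0, 2.0]:
    print(test_x, a1(test_x))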
# First define our sigma function.
sigma = np.tanh

# Next define the feedforward equation.
def a1(w1, b1, a0):
    z = w1 * a0 + b1
    return sigma(z)

# The individual cost function is the square of the difference between
# the network output and the training data output.
def C(w1, b1, x, y):
    return (a1(w1, b1, x) - y) ** 2

# This function returns the derivative of the cost function with
# respect to the weight.
def dCdw(w1, b1, x, y):
    z = w1 * x + b1
    dCda = 2 * (a1(w1, b1, x) - y)  # Derivative of cost with activation
    dadz = 1 / np.cosh(z) ** 2      # Derivative of activation with weighted sum z
    dzdw = x                        # Derivative of weighted sum z with weight
    return dCda * dadz * dzdw       # Return the chain rule product.

# This function returns the derivative of the cost function with
# respect to the bias.
# It is very similar to the previous function.
# You should complete this function.
def dCdb(w1, b1, x, y):
    z = w1 * x + b1
    dCda = 2 * (a1(w1, b1, x) - y)
    dadz = 1 / np.cosh(z) ** 2
    # Change the next line to give the derivative of
    # the weighted sum, z, with respect to the bias, b.
    dzdb = 1  # z = w1 * x + b1, so dz/db = 1
    return dCda * dadz * dzdb
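With dCdw and dCdb defined, the neuron can be trained by gradient descent. The loop below is a minimal sketch, not part of the original exercise; the starting point, training pair, learning rate, and iteration count are all assumed values:

# Gradient descent sketch (all values below are assumed examples).
w1, b1 = 2.3, -1.2   # arbitrary starting weight and bias
x, y = 0.5, 0.8      # one assumed training example
mu = 0.1             # assumed learning rate
for _ in range(100):
    grad_w = dCdw(w1, b1, x, y)  # compute both gradients at the
    grad_b = dCdb(w1, b1, x, y)  # current (w1, b1) before stepping
    w1 = w1 - mu * grad_w
    b1 = b1 - mu * grad_b
print(C(w1, b1, x, y))  # the cost should be much smaller than at the start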
# Define the activation function.
sigma = np.tanh

# Let's use a random initial weight and bias.
W = np.array([[-0.95, -0.27, -0.91],
              [ 2.06,  1.22,  0.23]])  # example 2x3 weight matrix (assumed values)
b = np.array([0.61, 1.64])             # example bias vector (assumed values)

# Define our feed forward function.
def a1(a0):
    # Notice the next line is almost the same as previously,
    # except we are using matrix multiplication rather than scalar multiplication.
    z = W @ a0 + b
    # Everything else is the same though,
    return sigma(z)

# Next, if a training example is
x = np.array([0.7, 0.6, 0.2])  # example input (assumed values)
y = np.array([0.9, 0.6])       # example target (assumed values)

# Then the cost function is
d = a1(x) - y  # Vector difference between observed and expected activation
C = d @ d      # Absolute value squared of the difference.
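Note that d @ d is the dot product of d with itself, so C equals the sum of the squared components of d. A one-line check, not part of the exercise:

# The dot product of d with itself equals the sum of squared components.
assert np.isclose(C, np.sum(d ** 2))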
# First define our sigma function.
sigma = np.tanh

# Next define the feedforward equation.
def a1(w1, b1, a0):
    z = w1 @ a0 + b1
    return sigma(z)

# This function returns the derivative of the cost function with
# respect to the weight.
def dCdw(w1, b1, x, y):
    dCda = 2 * (a1(w1, b1, x) - y)        # Derivative of cost with activation
    dadz = 1 / np.cosh(w1 @ x + b1) ** 2  # Derivative of activation with weighted sum z
    J = dCda * dadz                       # Chain rule product so far
    dzdw = x                              # Derivative of weighted sum z with weight
    J = np.outer(J, dzdw)                 # Each weight entry W_ij gets its own derivative
    return J                              # Return the chain rule product.

# This function returns the derivative of the cost function with
# respect to the bias.
# It is very similar to the previous function.
# You should complete this function.
def dCdb(w1, b1, x, y):
    dCda = 2 * (a1(w1, b1, x) - y)
    dadz = 1 / np.cosh(w1 @ x + b1) ** 2
    # Change the next line to give the derivative of
    # the weighted sum, z, with respect to the bias, b.
    dzdb = 1  # z = w1 @ x + b1, so dz/db is the identity; 1 works elementwise
    return dCda * dadz * dzdb
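A standard way to validate hand-coded derivatives like these is a finite-difference check. The sketch below is not from the exercise: cost is a hypothetical helper, eps is an assumed step size, and W, b, x, y are reused from the earlier block.

# Finite-difference check of dCdw (cost and eps are assumed helpers/values).
def cost(w1, b1, x, y):
    d = a1(w1, b1, x) - y
    return d @ d

eps = 1e-6
i, j = 0, 1              # arbitrary entry of W to test
W_plus = W.copy()
W_plus[i, j] += eps      # perturb a single weight
numeric = (cost(W_plus, b, x, y) - cost(W, b, x, y)) / eps
analytic = dCdw(W, b, x, y)[i, j]
print(numeric, analytic)  # these should agree to several decimal places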
