Question: Exercise 01

Instructions: Implement the Siamese function below. You should be using all the functions explained before.

# GRADED FUNCTION: Siamese
def Siamese(text_vectorizer, vocab_size, d_feature):
    """Returns a Siamese model.

    Args:
        text_vectorizer (TextVectorization): TextVectorization instance, already adapted to your training data.
        vocab_size (int): Length of the vocabulary.
        d_feature (int): Depth of the model.

    Returns:
        tf.keras.Model: A Siamese model.
    """
    ### START CODE HERE ###

    branch = tf.keras.models.Sequential(name='sequential')
    # Add the text_vectorizer layer. This is the text_vectorizer you instantiated and trained before.
    branch.add(text_vectorizer)
    # Add the Embedding layer. Remember to call it 'embedding' using the name parameter.
    branch.add(tf.keras.layers.Embedding(
        input_dim=vocab_size,
        output_dim=d_feature,
        embeddings_initializer='uniform',
        embeddings_regularizer=None,
        activity_regularizer=None,
        embeddings_constraint=None,
        mask_zero=False,
        input_length=None,
        sparse=False,
        name='embedding'))
    # Add the LSTM layer. Recall from the lectures that you want the LSTM layer to return sequences, not just one value.
    # Remember to call it 'LSTM' using the name parameter.
    branch.add(tf.keras.layers.LSTM(
        units=d_feature,
        activation='tanh',
        recurrent_activation='sigmoid',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        recurrent_initializer='orthogonal',
        bias_initializer='zeros',
        unit_forget_bias=True,
        kernel_regularizer=None,
        recurrent_regularizer=None,
        bias_regularizer=None,
        activity_regularizer=None,
        kernel_constraint=None,
        recurrent_constraint=None,
        bias_constraint=None,
        dropout=0.0,
        recurrent_dropout=0.0,
        return_sequences=True,
        return_state=False,
        go_backwards=False,
        stateful=False,
        time_major=False,
        unroll=False,
        name='LSTM'))
    # Add the GlobalAveragePooling1D layer. Remember to call it 'mean' using the name parameter.
    branch.add(tf.keras.layers.GlobalAveragePooling1D(
        data_format='channels_last',
        name='mean'))
    # Add the normalizing layer using the Lambda function. Remember to call it 'out' using the name parameter.
    branch.add(tf.keras.layers.Lambda(
        lambda x: tf.math.l2_normalize(x),
        name='out'))

    # Define both inputs. Remember to call them 'input_1' and 'input_2' using the name parameter.
    # Be mindful of the data type and size.
    input1 = tf.keras.layers.Input(None, dtype=None, name='input_1')
    input2 = tf.keras.layers.Input(None, dtype=None, name='input_2')
    # Define the output of each branch of your Siamese network. Remember that both branches have the same coefficients,
    # but they each receive different inputs.
    branch1 = None
    branch2 = None
    # Define the Concatenate layer. You should concatenate columns; you can fix this using the axis parameter.
    # This layer is applied over the outputs of each branch of the Siamese network.
    conc = tf.keras.layers.Concatenate(axis=None, name='conc')([None, None])

    ### END CODE HERE ###

    return tf.keras.models.Model(inputs=[input1, input2], outputs=conc, name="SiameseModel")
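To build intuition for the layers above, it can help to see what each one computes, stripped of the Keras machinery. The Embedding layer is essentially a trainable lookup table: each integer token id indexes one row of a (vocab_size x d_feature) weight matrix. This is a toy pure-Python illustration of that lookup, not the Keras implementation (the real layer's table is learned during training, and the sizes here are made up):

```python
import random

random.seed(0)
vocab_size, d_feature = 5, 3  # toy sizes, not the assignment's values
# One d_feature-dimensional row per vocabulary id, randomly initialized
# (mimicking the 'uniform' initializer's small random values).
table = [[random.uniform(-0.05, 0.05) for _ in range(d_feature)]
         for _ in range(vocab_size)]

def embed(token_ids):
    """Map a sequence of integer token ids to their embedding vectors."""
    return [table[i] for i in token_ids]

out = embed([0, 3, 3])
print(len(out), len(out[0]))  # 3 3: three time steps, depth d_feature
print(out[1] == out[2])       # True: the same id always maps to the same row
```

Note that the output keeps one vector per time step, which is why the LSTM that follows can consume it as a sequence.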
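Because the LSTM returns sequences, its output is one d_feature-dimensional vector per time step; the GlobalAveragePooling1D layer (named 'mean') collapses that sequence into a single vector by averaging over the time axis. A minimal pure-Python sketch of that reduction (illustrative only, with a tiny hand-made sequence):

```python
def global_average_pool_1d(sequence):
    """Average a list of equal-length vectors over the time axis.

    sequence: T vectors of length d  ->  one vector of length d.
    """
    T = len(sequence)
    d = len(sequence[0])
    return [sum(step[j] for step in sequence) / T for j in range(d)]

# Three time steps, feature depth 2:
seq = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(global_average_pool_1d(seq))  # [3.0, 4.0]
```

This is what turns a variable-length question into a fixed-size vector that can be compared with the other branch's output.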
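The final Lambda layer applies tf.math.l2_normalize, scaling each pooled vector to unit Euclidean length: x / sqrt(sum(x_i^2)). As a sanity check of the math (no TensorFlow needed), here is that operation for a single vector:

```python
import math

def l2_normalize(x):
    """Scale a vector to unit Euclidean length: x / sqrt(sum(x_i ** 2))."""
    norm = math.sqrt(sum(v * v for v in x))
    return [v / norm for v in x]

vec = [3.0, 4.0]            # Euclidean length 5
unit = l2_normalize(vec)
print(unit)                  # [0.6, 0.8]
print(sum(v * v for v in unit))  # ~1.0: unit length
```

Normalizing both branch outputs is what makes their dot product equal their cosine similarity, which is convenient for the loss used downstream.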
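Finally, the Concatenate layer joins the two branch outputs column-wise, so each row of the model's output holds v1 followed by v2, and a loss function can split them apart again. Since both halves are L2-normalized, their cosine similarity is just a dot product. A small pure-Python sketch of that layout (the axis value in the exercise is still yours to fill in; this only illustrates column-wise concatenation):

```python
def concat_columns(v1, v2):
    """Column-wise concatenation of two row vectors (axis=1 over a batch)."""
    return v1 + v2

def cosine_similarity_unit(v1, v2):
    """Dot product; equals cosine similarity when both inputs are unit-length."""
    return sum(a * b for a, b in zip(v1, v2))

v1 = [0.6, 0.8]
v2 = [0.8, 0.6]
row = concat_columns(v1, v2)
print(row)  # [0.6, 0.8, 0.8, 0.6]

# Split the row back into its two halves and compare them:
half = len(row) // 2
print(cosine_similarity_unit(row[:half], row[half:]))  # ~0.96
```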
