Question: Complete the translate function below by replacing all None placeholders with proper parameters.
This function will take care of the following steps:
1. Process the sentence to translate and encode it.
2. Set the initial state of the decoder.
3. Get predictions of the next token (starting with the SOS token) for a maximum of max_length iterations (in case the EOS token is never returned).
4. Return the translated text (as a string), the logit of the last iteration (this helps measure how certain the model was that the sequence was translated in its totality), and the translation in token format.
def translate(model, text, max_length=50, temperature=0.0):
    """Translate a given sentence from English to Portuguese

    Args:
        model (tf.keras.Model): The trained translator
        text (string): The sentence to translate
        max_length (int, optional): The maximum length of the translation. Defaults to 50.
        temperature (float, optional): The temperature that controls the randomness of the predicted tokens. Defaults to 0.0.

    Returns:
        tuple(str, np.float, tf.Tensor): The translation, the logit of the last predicted token, and the tokenized translation
    """
    # Lists to save tokens and logits
    tokens, logits = [], []

    ### START CODE HERE ###

    # PROCESS THE SENTENCE TO TRANSLATE

    # Convert the original string into a tensor
    text = tf.None(None)[tf.None]

    # Vectorize the text using the correct vectorizer
    context = None(None).to_tensor()

    # Get the encoded context (pass the context through the encoder)
    # Hint: Remember you can get the encoder by using model.encoder
    context = None.None(None)

    # INITIAL STATE OF THE DECODER

    # First token should be the SOS token with shape (1, 1)
    next_token = tf.None((None, None), None)

    # Initial hidden and cell states should be tensors of zeros with shape (1, UNITS)
    state = [tf.None((None, None)), tf.None((None, None))]

    # You are done when you draw an EOS token as the next token (initial state is False)
    done = None

    # Iterate for max_length iterations
    for None in None(None):
        # Generate the next token
        try:
            next_token, logit, state, done = None(
                decoder=None,
                context=None,
                next_token=None,
                done=None,
                state=None,
                temperature=None
            )
        except:
            raise Exception("Problem generating the next token")

        # If done, break out of the loop
        if None:
            None

        # Add next_token to the list of tokens
        None.None(None)

        # Add logit to the list of logits
        None.None(None)

    ### END CODE HERE ###

    # Concatenate all tokens into a tensor
    tokens = tf.concat(tokens, axis=-1)

    # Convert the translated tokens into text
    translation = tf.squeeze(tokens_to_text(tokens, id_to_word))
    translation = translation.numpy().decode()

    return translation, logits[-1], tokens
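For reference, here is one plausible way the None placeholders could be filled in. This is a sketch, not the course's official solution: the helpers `english_vectorizer`, `generate_next_token`, `tokens_to_text`, `id_to_word`, `UNITS`, and the SOS/EOS token ids are provided by the assignment, so toy stand-ins (clearly marked below) replace them here only to make the example self-contained and runnable.

```python
import tensorflow as tf

# --- Toy stand-ins (assumptions, NOT the assignment's real helpers) ---------
# In the notebook, english_vectorizer, generate_next_token, tokens_to_text,
# id_to_word and UNITS already exist; they are mocked here only so the
# sketch runs on its own.
UNITS = 4
SOS_ID, EOS_ID = 1, 2
id_to_word = {3: "ola", 4: "mundo"}
_NEXT = {SOS_ID: 3, 3: 4, 4: EOS_ID}  # fixed token sequence for the demo

class ToyModel:
    """Mimics just enough of the trained translator to drive the loop."""
    def __init__(self):
        self.encoder = lambda ctx: tf.cast(ctx, tf.float32)
        self.decoder = None  # the toy generator below ignores it

def english_vectorizer(text):
    # The real vectorizer is a TextVectorization layer; return fixed ids here.
    return tf.ragged.constant([[5, 6]])

def generate_next_token(decoder, context, next_token, done, state, temperature):
    # The real helper samples from the decoder's logits (argmax when
    # temperature == 0.0); this toy just walks the fixed sequence above.
    tok = _NEXT[int(tf.squeeze(next_token))]
    return tf.fill((1, 1), tok), tf.constant(0.0), state, tok == EOS_ID

def tokens_to_text(tokens, id_to_word):
    words = [id_to_word[int(t)] for t in tf.reshape(tokens, [-1])]
    return tf.constant(" ".join(words))

# --- One plausible completion of the placeholders ---------------------------
def translate(model, text, max_length=50, temperature=0.0):
    tokens, logits = [], []
    # Convert the original string into a (1,) string tensor
    text = tf.constant(text)[tf.newaxis]
    # Vectorize the text using the English vectorizer
    context = english_vectorizer(text).to_tensor()
    # Pass the context through the encoder
    context = model.encoder(context)
    # First token is the SOS token, with shape (1, 1)
    next_token = tf.fill((1, 1), SOS_ID)
    # Initial hidden and cell states: zeros of shape (1, UNITS)
    state = [tf.zeros((1, UNITS)), tf.zeros((1, UNITS))]
    done = False
    for _ in range(max_length):
        next_token, logit, state, done = generate_next_token(
            decoder=model.decoder, context=context, next_token=next_token,
            done=done, state=state, temperature=temperature)
        if done:
            break
        tokens.append(next_token)
        logits.append(logit)
    # Concatenate all tokens into a tensor and convert them back to text
    tokens = tf.concat(tokens, axis=-1)
    translation = tf.squeeze(tokens_to_text(tokens, id_to_word))
    return translation.numpy().decode(), logits[-1], tokens

translation, last_logit, tokens = translate(ToyModel(), "hello world")
print(translation)  # -> ola mundo
```

With the notebook's real helpers in scope, the toy definitions above would simply be dropped and the `translate` body kept as-is; the only exercise-specific choices are `tf.constant(text)[tf.newaxis]` for the input tensor, `tf.fill((1, 1), SOS_ID)` for the first token, and `tf.zeros((1, UNITS))` for the initial states.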
