Question: Problem 5 [10 points]

Before the invention of the Transformer and LLMs, the RNN-based Seq2Seq model was a very popular architecture for tasks with sequences as both input and output, such as machine translation, summarization, etc. Read the Seq2Seq paper and answer the following questions.
(a) [points] How does the encoder pass all the information of the input sequence to the decoder?
(b) [points] How does the decoder know when to stop generation?
(c) [points] What are the benefits of the encoder-decoder Transformer compared to the Seq2Seq model? Explain at least two benefits.
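As a reference point for the first two questions, here is a minimal sketch of an RNN-based Seq2Seq model. It is not the paper's implementation: it assumes PyTorch, a single-layer LSTM encoder and decoder, greedy decoding, and illustrative values for the vocabulary size, hidden size, and the SOS_ID/EOS_ID token indices. It shows the encoder compressing the input into its final hidden state (which initializes the decoder) and the decoder halting when it emits the end-of-sequence token.

```python
# Minimal Seq2Seq sketch (assumed hyperparameters and token ids, not from the paper).
import torch
import torch.nn as nn

VOCAB_SIZE, EMBED_SIZE, HIDDEN_SIZE = 1000, 64, 128
SOS_ID, EOS_ID = 1, 2  # hypothetical start- and end-of-sequence token ids

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_SIZE)
        self.encoder = nn.LSTM(EMBED_SIZE, HIDDEN_SIZE, batch_first=True)
        self.decoder = nn.LSTM(EMBED_SIZE, HIDDEN_SIZE, batch_first=True)
        self.out = nn.Linear(HIDDEN_SIZE, VOCAB_SIZE)

    @torch.no_grad()
    def generate(self, src_ids, max_len=50):
        # The encoder compresses the entire input sequence into its final
        # (hidden, cell) state: a fixed-size vector summary of the input.
        _, state = self.encoder(self.embed(src_ids))
        # That state initializes the decoder; this is the only channel through
        # which input information reaches the decoder in the basic model.
        tok = torch.tensor([[SOS_ID]])
        result = []
        for _ in range(max_len):
            dec_out, state = self.decoder(self.embed(tok), state)
            tok = self.out(dec_out[:, -1]).argmax(-1, keepdim=True)
            if tok.item() == EOS_ID:  # generation stops when EOS is emitted
                break
            result.append(tok.item())
        return result

model = Seq2Seq()
print(model.generate(torch.randint(3, VOCAB_SIZE, (1, 7))))
```

Note that the fixed-size state is exactly the bottleneck that attention-based encoder-decoder Transformers avoid, which is relevant to question (c).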
