Question:

1. [5 pts] In LSTM, the activation functions of the gates (forget, input, and output) are sigmoid functions. Explain what will happen if we use ReLU instead.
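For reference, a minimal sketch of the standard LSTM gate equations (assuming one common parameterization; W, U, and b are generic weight and bias names):

```latex
f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f) \quad \text{(forget gate)}
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i) \quad \text{(input gate)}
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o) \quad \text{(output gate)}
c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c)
h_t = o_t \odot \tanh(c_t)
```

Since \sigma(\cdot) \in (0, 1), each gate acts as a bounded soft switch between "fully closed" and "fully open"; ReLU's range is [0, \infty), so it would not confine gate values to that interval.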
2. [pts] What are the problems with using vocabulary indexing in text processing? Explain using the example given in the slides.
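The slide example is not reproduced here; as a hypothetical illustration of the usual issue, vocabulary indexing assigns each word an arbitrary integer, which implies an ordering and magnitude that carry no meaning:

```python
# Hypothetical vocabulary (a stand-in for the slide example).
vocab = {"cat": 1, "dog": 2, "pizza": 3}

sentence = ["cat", "dog", "pizza"]
encoded = [vocab[w] for w in sentence]
print(encoded)  # [1, 2, 3]

# The numeric values are arbitrary: "dog" (2) is not "twice" "cat" (1),
# and a model fed these raw indices would treat "pizza" (3) as closer
# to "dog" (2) than to "cat" (1), a relationship the data never stated.
```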
3. In TF-IDF:
   (a) [pts] A document contains the term "cat" ___ times; compute the TF value of 'cat'.
   (b) [pts] The size of the corpus is ___ million documents. Assume ___ million documents contain the term "cat". Compute the IDF using the log form.
   (c) [1 pt] Compute the TF-IDF of 'cat'.
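Since the question's numeric values are missing, here is a minimal worked sketch with hypothetical stand-in numbers, using the raw-count TF and base-10 log-form IDF conventions (the course slides may normalize TF differently):

```python
import math

# Hypothetical stand-in values for the blanks in the question.
tf_count = 3               # times "cat" appears in the document
corpus_size = 10_000_000   # corpus size: 10 million documents
docs_with_cat = 1_000_000  # documents containing "cat": 1 million

# Raw-count TF and base-10 log-form IDF; some texts use ln or
# normalize TF by document length instead.
tf = tf_count
idf = math.log10(corpus_size / docs_with_cat)  # log10(10) = 1.0
tfidf = tf * idf

print(tf, idf, tfidf)  # 3 1.0 3.0
```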
4. In CBOW:
   (a) [pts] Can we use a vocabulary-index method instead of a one-hot vector as the input to CBOW? Explain.
   (b) [pts] The activation function of the output layer should be softmax. Explain the reason.
   (c) [pts] After training is done in CBOW, how do we extract the word-embedding vector for a certain word?
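A minimal NumPy sketch of the usual CBOW formulation, with hypothetical sizes, showing where the one-hot input, the softmax output, and the extracted embedding each appear:

```python
import numpy as np

V, D = 5, 3  # hypothetical vocabulary size and embedding dimension
rng = np.random.default_rng(0)
W_in = rng.normal(size=(V, D))   # input->hidden weights (the embedding table)
W_out = rng.normal(size=(D, V))  # hidden->output weights

def one_hot(i, size=V):
    v = np.zeros(size)
    v[i] = 1.0
    return v

# Forward pass: average the projected context words, score every
# vocabulary word, and apply softmax so the output is a probability
# distribution over the center word.
context = [1, 3]  # hypothetical context word indices
h = np.mean([one_hot(i) @ W_in for i in context], axis=0)
scores = h @ W_out
probs = np.exp(scores - scores.max())
probs /= probs.sum()  # softmax

# A one-hot input x_i makes x_i @ W_in select row i of W_in, so after
# training, row i of W_in is the embedding vector of word i.
embedding = W_in[2]  # embedding of the (hypothetical) word with index 2
print(probs, embedding)
```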
