Question: a) (15 points) Take the pretrained BERT BertForSequenceClassification model and extract BERT's output for each token in a sentence. For example, if you have the sentence "Hello how are you doing today?" you will get a vector corresponding to each token in the sentence. To get the embeddings, take the BERT output and do embeddings = output.last_hidden_state.
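The scrape drops the code itself, so here is a minimal sketch of the per-token extraction, assuming bert-base-uncased and the Hugging Face transformers API. Note the question names BertForSequenceClassification, which returns sentence-level logits; the per-token last_hidden_state used below comes from the base BertModel:

```python
import torch
from transformers import AutoTokenizer, BertModel

# Assumed checkpoint: bert-base-uncased (the question does not name one).
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Hello how are you doing today?"
inputs = tok(sentence, return_tensors="pt")

with torch.no_grad():
    output = model(**inputs)

# One 768-dim vector per token (including [CLS] and [SEP]):
# shape is (1, seq_len, 768).
embeddings = output.last_hidden_state
```

Each row of `embeddings[0]` is the contextual vector for one wordpiece token of the input sentence.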
b) Extract BERT embeddings for all positive and negative sentences in the train.tsv file we used in class for sentiment classification.
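A sketch of the dataset pass, assuming train.tsv has one "sentence<TAB>label" pair per line with labels 1 (positive) and 0 (negative); that layout is an assumption, since the scrape does not show the file:

```python
import csv
import torch
from transformers import AutoTokenizer, BertModel

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(sentence):
    """Return the per-token BERT embeddings for one sentence: (seq_len, 768)."""
    inputs = tok(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**inputs)
    return out.last_hidden_state.squeeze(0)

def load_embeddings(path):
    """Embed every sentence in a TSV file of sentence/label pairs
    (assumed layout: sentence<TAB>label)."""
    features, labels = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for sentence, label in csv.reader(f, delimiter="\t"):
            features.append(embed(sentence))
            labels.append(int(label))
    return features, labels
```

Sequences have different lengths, so the features are kept as a list of (seq_len, 768) tensors and padded later when batching for the LSTM.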
c) Train a single-layer LSTM network with a hidden dimension of ___ to do sentiment classification from the BERT outputs. Use a learning rate of ___ and a batch size of ___.
You don't have to tune the number of iterations; just run a few iterations and report the accuracy.
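The hyperparameter values were lost in the scrape, so the sketch below fills them with placeholders (hidden dimension 128, learning rate 1e-3, batch size 4); substitute the values from the original assignment. It runs one toy training step on random stand-ins for the BERT embeddings:

```python
import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    """Single-layer LSTM mapping a sequence of BERT token embeddings
    (dim 768) to a positive/negative prediction."""
    def __init__(self, embed_dim=768, hidden_dim=128, num_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=1, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                 # x: (batch, seq_len, 768)
        _, (h_n, _) = self.lstm(x)        # final hidden state: (1, batch, hidden_dim)
        return self.fc(h_n.squeeze(0))    # class logits: (batch, num_classes)

model = SentimentLSTM()
# lr=1e-3 is a placeholder; the assignment's value was lost in the scrape.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on random stand-in embeddings (batch of 4, seq_len 10).
x = torch.randn(4, 10, 768)
y = torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```

For the real task, replace the random tensors with padded batches of the embeddings from part (b), loop for a few iterations as the question asks, and report accuracy as the fraction of argmax predictions matching the labels.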
