Question:

Positional encoding. Consider the sentence "I love Machine Learning". Here, we treat each word as a token, where "I" has an index value of 1, "love" has an index value of 2, and so on. To construct the positional encoding matrix $W_p$ for this sentence, we fix the embedding/concept space to be 4-dimensional, so that $W_p \in \mathbb{R}^{4 \times 4}$, and use a modified version of the positional encoding function proposed in the original paper "Attention Is All You Need" by Vaswani et al., namely, for any $i, j \in \{1, \dots, 4\}$,

$$(W_p)_{ij} = \begin{cases} \cos\!\left(\dfrac{i}{10000^{\lceil j/2 \rceil / 2}}\right), & \text{if } j \text{ is odd}, \\ \sin\!\left(\dfrac{i}{10000^{\lceil j/2 \rceil / 2}}\right), & \text{if } j \text{ is even}, \end{cases}$$

where $\lceil z \rceil$ is the smallest integer greater than or equal to $z$. Using this encoding function, write down the positional encoding matrix.
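
A minimal Python sketch of the computation, assuming the formula reconstruction above (the positional_encoding helper and its parameter names are illustrative, not from the source). Note that for $j \in \{1, 2\}$ the denominator is $10000^{1/2} = 100$, while for $j \in \{3, 4\}$ it is $10000^{1} = 10000$:

import numpy as np

def positional_encoding(n_tokens=4, d=4, base=10000.0):
    # (W_p)_{ij} = cos(i / base^(ceil(j/2)/2)) if j is odd,
    #              sin(i / base^(ceil(j/2)/2)) if j is even,
    # with 1-indexed token position i and dimension j.
    W = np.zeros((n_tokens, d))
    for i in range(1, n_tokens + 1):
        for j in range(1, d + 1):
            angle = i / base ** (np.ceil(j / 2) / 2)
            W[i - 1, j - 1] = np.cos(angle) if j % 2 == 1 else np.sin(angle)
    return W

print(np.round(positional_encoding(), 4))

Each row $i$ of the printed matrix is the encoding of token $i$. One design note: with 1-indexing, odd $j$ here uses cosine, whereas the original paper assigns sine to the first dimension of each pair, which is presumably why the question calls this a "modified" version.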