Question:

J(U, V) = \sum_{(a,i)\in D} \frac{\left(Y_{ai} - [UV^\top]_{ai}\right)^2}{2} + \frac{\lambda}{2}\,\|U\|_F^2 + \frac{\lambda}{2}\,\|V\|_F^2
In order to break a big optimization problem into smaller pieces that we know how to solve, we fix U and find the best V for that U. But a subtle and important point is that even if V is the best match for U, the U that is best for that V might not be the original U. It's like how we might be some lonely person's best friend, even though they are not our best friend. In light of this, we repeat, like this: we fix U and solve for V, then fix V to be the result from the previous step and solve for U, and repeat this alternating process until we find the solution. This is an example of iterative optimization, where we greedily take steps in a good direction, but as we do so the context shifts, so "the good direction" evolves. Gradient descent, which we've already seen, has the same structure.
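To make the alternating scheme concrete, here is a minimal runnable sketch in Python/NumPy. It assumes J is the regularized squared-error objective above; the names Y, mask, U, V, and lam are illustrative choices, not from the original question. Fixing one factor turns each row update into a small ridge-regression solve, so a full pass (update U, then V) never increases J.

import numpy as np

def als_step(Y, mask, U, V, lam):
    """One alternating pass: fix V and solve for each row of U in
    closed form, then fix the new U and solve for each row of V."""
    n, m = Y.shape
    k = U.shape[1]
    # Fix V, solve for U: each u_a is a ridge-regression solution.
    for a in range(n):
        idx = mask[a]                       # items i with (a, i) in D
        if idx.any():
            Vi = V[idx]                     # factors of observed items
            A = Vi.T @ Vi + lam * np.eye(k)
            b = Vi.T @ Y[a, idx]
            U[a] = np.linalg.solve(A, b)
    # Fix U, solve for V symmetrically.
    for i in range(m):
        idx = mask[:, i]                    # users a with (a, i) in D
        if idx.any():
            Ua = U[idx]
            A = Ua.T @ Ua + lam * np.eye(k)
            b = Ua.T @ Y[idx, i]
            V[i] = np.linalg.solve(A, b)
    return U, V

def objective(Y, mask, U, V, lam):
    R = (Y - U @ V.T) * mask                # residuals on observed entries only
    return 0.5 * np.sum(R**2) + 0.5 * lam * (np.sum(U**2) + np.sum(V**2))

# Tiny usage example: a 4x5 matrix with ~70% of entries observed.
rng = np.random.default_rng(0)
Y = rng.normal(size=(4, 5))
mask = rng.random((4, 5)) < 0.7             # D = set of observed entries
U = rng.normal(scale=0.1, size=(4, 2))
V = rng.normal(scale=0.1, size=(5, 2))
for _ in range(20):
    U, V = als_step(Y, mask, U, V, lam=0.1)
print(objective(Y, mask, U, V, lam=0.1))    # J decreases with each pass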
Consider the case k = 1. The matrices U and V reduce to vectors u and v, so that [UV^\top]_{ai} = u_a v_i. When v is fixed, finding the u that minimizes J becomes equivalent to finding, for each a separately, the u_a that minimizes

\frac{1}{2} \sum_{i:(a,i)\in D} (Y_{ai} - u_a v_i)^2 + \frac{\lambda}{2} u_a^2.
Symmetrically, when u is fixed, finding the v that minimizes J becomes equivalent to finding, for each i separately, the v_i that minimizes

\frac{1}{2} \sum_{a:(a,i)\in D} (Y_{ai} - u_a v_i)^2 + \frac{\lambda}{2} v_i^2.
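Each of these one-dimensional problems has a closed-form solution. The following is a sketch of the derivation for u_a with v fixed, assuming \lambda > 0 so the denominator is nonzero; the v_i update follows by swapping the roles of u and v:

\frac{\partial}{\partial u_a} \left[ \frac{1}{2} \sum_{i:(a,i)\in D} (Y_{ai} - u_a v_i)^2 + \frac{\lambda}{2} u_a^2 \right] = -\sum_{i:(a,i)\in D} (Y_{ai} - u_a v_i)\, v_i + \lambda u_a = 0

\Longrightarrow \quad u_a = \frac{\sum_{i:(a,i)\in D} Y_{ai}\, v_i}{\lambda + \sum_{i:(a,i)\in D} v_i^2}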