Question 2. (35 points) Effect of constraints on optimal solutions. A key result is that optimal classifiers pick the most probable class, which defines Bayes optimality. One consequence is that optimal decisions are pure, or crisp: they don't weight or mix decisions. But is this result always true? Here we show how constraints can make the answer to this question NO by understanding how they change the nature of an optimal solution.
One of the key ideas in the course is that the knowledge we have about the structure of our data also constitutes data that we can encode as constraints. A simple example is when a variable has upper and/or lower bounds. As a concrete example, we might want to choose the best product (e.g., pizza) for a discrete set of use cases (e.g., food-restriction types like vegetarian, gluten-free, dairy-free, etc.) at the lowest price, given a dataset with features $x = (x_{\text{item}}, x_{\text{cust}}, x_{\text{price}})$ labeled by the best use case $y$.
Consider an $N$-class classification problem with features $x \in \mathbb{R}^d$ and one-hot encoded labels $y = e_k$, where the $e_k$ are unit vectors with a $1$ at component $k$ and zeros elsewhere, and $k \in \{1, \dots, N\}$. Assume the data is $\mathcal{D}$-distributed, $(x, y) \sim \mathcal{D}$, where $\mathcal{D}$ is a fixed but unknown distribution on $\mathbb{R}^d \times \{e_1, \dots, e_N\}$. Assume $P(y = e_k) = a_k$ with $\sum_{k=1}^{N} a_k = 1$. Consider the classifier given by
$$f(x)_k := \mathbb{1}\!\left[\, P(y = e_k \mid x) \ge \beta_{jk}\, P(y = e_j \mid x) \quad \forall\, j \ne k \,\right].$$
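To make the decision rule concrete, here is a minimal Python sketch of the thresholded classifier above, assuming the posteriors $P(y = e_k \mid x)$ are already available as an array; the function name, the argmax fallback, and the toy values are illustrative choices, not part of the problem statement.

```python
import numpy as np

def threshold_classifier(posteriors: np.ndarray, beta: np.ndarray) -> int:
    """Pick the class k whose posterior beats every rival j scaled by beta[j, k].

    posteriors: length-N array whose k-th entry is P(y = e_k | x).
    beta:       N x N threshold matrix; an all-ones beta reduces the rule
                to the plain argmax (Bayes) classifier.
    """
    n = len(posteriors)
    for k in range(n):
        if all(posteriors[k] >= beta[j, k] * posteriors[j]
               for j in range(n) if j != k):
            return k
    # With a general beta, no class may satisfy every threshold; falling
    # back to the argmax is an illustrative tie-break, not specified above.
    return int(np.argmax(posteriors))

# With beta = 1 everywhere, the rule is the familiar argmax over posteriors.
posteriors = np.array([0.2, 0.5, 0.3])
print(threshold_classifier(posteriors, np.ones((3, 3))))  # -> 1
```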
The standard classification loss is the error rate, given by
$$L(f) = P(f(x) \ne y) = \mathbb{E}_{(x,y) \sim \mathcal{D}}\!\left[\mathbb{1}[f(x) \ne y]\right],$$ where $\mathbb{1}[\cdot]$ is the indicator function for the 0-1 loss and $L$ is the expected loss, or true error rate. Professor Bayes claims the following: for any other classifier function $g \ne f$, $g: \mathbb{R}^d \to \{e_1, \dots, e_N\}$, we have $L(f) \le L(g)$, which is the definition of Bayes optimality for the proper choice of $\beta_{jk}$. The result is standard and easy to find.
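For reference, the standard one-line argument behind the claim when $\beta_{jk} = 1$ (so that $f$ reduces to the argmax rule): the error rate decomposes pointwise over $x$, so it is minimized by maximizing the posterior at each $x$.

```latex
L(g) = \mathbb{E}_x\big[ 1 - P(y = g(x) \mid x) \big]
     \;\ge\; \mathbb{E}_x\big[ 1 - \max_k P(y = e_k \mid x) \big]
     \;=\; L(f).
```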
