Question: Consider a two-class, one-feature classification problem with the following Gaussian class-conditional densities: p(x|w1) = N(0, 1) and p(x|w2) = N(1, 2). Assume equal prior probabilities and a 0-1 loss function. Solve the following two questions and show your work.
(i) What is the Bayes decision boundary? (ii) Suppose the prior probabilities are changed to P(w1) = 0.6 and P(w2) = 0.4. How does the decision boundary change?
Step by Step Solution
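A worked derivation of the boundary is sketched below, assuming the usual convention that N(μ, σ²) gives the mean and the variance (if 2 were the standard deviation instead, the constants would change but the method would not).

```latex
% Part (i): with equal priors and 0-1 loss, the Bayes boundary is where
% P(w_1)\,p(x \mid w_1) = P(w_2)\,p(x \mid w_2), i.e. the densities are equal:
\[
\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}
  \;=\;
\frac{1}{\sqrt{4\pi}}\, e^{-(x-1)^2/4}.
\]
% Take logarithms and multiply through by 4:
\[
-2x^2 + (x-1)^2 = -2\ln 2
\quad\Longrightarrow\quad
x^2 + 2x - \bigl(1 + 2\ln 2\bigr) = 0,
\]
\[
x = -1 \pm \sqrt{2 + 2\ln 2} \;\approx\; 0.840 \ \text{and} \ {-2.840}.
\]
% Because the variances differ, the boundary is quadratic with two roots;
% w_1 is chosen on the interval between them.
%
% Part (ii): unequal priors add \ln\frac{P(w_1)}{P(w_2)} = \ln\frac{3}{2}
% to the w_1 side, shifting the constant term:
\[
x^2 + 2x - \Bigl(1 + 2\ln 2 + 4\ln\tfrac{3}{2}\Bigr) = 0
\quad\Longrightarrow\quad
x = -1 \pm \sqrt{2 + 2\ln 2 + 4\ln\tfrac{3}{2}}
\;\approx\; 1.238 \ \text{and} \ {-3.238}.
\]
% Both roots move away from \mu_1 = 0: raising P(w_1) enlarges the region
% assigned to w_1, as expected.
```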
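The algebra can be checked numerically. The sketch below (the helper name `gaussian_boundary` is my own, not from the problem) solves the quadratic obtained by equating the prior-weighted Gaussian log-densities, again reading N(μ, σ²) as mean and variance:

```python
import math

def gaussian_boundary(mu1, var1, mu2, var2, p1=0.5, p2=0.5):
    """Points where p1 * N(x; mu1, var1) equals p2 * N(x; mu2, var2).

    Equating the log posteriors gives a quadratic a*x^2 + b*x + c = 0;
    it degenerates to a single linear boundary when var1 == var2.
    """
    a = 0.5 / var2 - 0.5 / var1
    b = mu1 / var1 - mu2 / var2
    c = (mu2 ** 2 / var2 - mu1 ** 2 / var1) / 2.0 \
        + math.log(p1 / p2) + 0.5 * math.log(var2 / var1)
    if abs(a) < 1e-12:                      # equal variances: one root
        return [-c / b]
    disc = math.sqrt(b * b - 4.0 * a * c)
    return sorted([(-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)])

# (i) equal priors: x = -1 +/- sqrt(2 + 2 ln 2), roughly -2.840 and 0.840
print(gaussian_boundary(0.0, 1.0, 1.0, 2.0))
# (ii) P(w1) = 0.6, P(w2) = 0.4: roughly -3.238 and 1.238 -- both roots
# move away from mu1 = 0, enlarging the region assigned to w1
print(gaussian_boundary(0.0, 1.0, 1.0, 2.0, 0.6, 0.4))
```

Plotting the two weighted densities on one axis makes the same point visually: with unequal variances they cross twice, and scaling one curve up by a larger prior pushes both crossings outward.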
