Question: Consider a two-class, one-feature classification problem with the following Gaussian class-conditional densities: p(x|w1) = N(0, 1) and p(x|w2) = N(1, 2). Assume equal prior probabilities and a 0-1 loss function. Answer the following two questions and show your work.

(i) What is the Bayes decision boundary?
(ii) Suppose the prior probabilities are changed as follows: P(w1) = 0.6 and P(w2) = 0.4. How will the decision boundary change?
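
Below is a minimal sketch of the standard derivation, assuming N(μ, σ²) denotes a Gaussian with mean μ and variance σ² (so the two variances are 1 and 2); it is offered as one way to work the problem, not as the verified expert solution.

```latex
% (i) Equal priors, 0-1 loss: decide w_1 wherever p(x|w_1) > p(x|w_2);
% the boundary is where the two class-conditional densities are equal.
\[
\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \;=\; \frac{1}{\sqrt{4\pi}}\, e^{-(x-1)^2/4}
\]
% Taking logarithms and multiplying by 4:
\[
2\ln 2 - 2x^2 + (x-1)^2 = 0
\;\;\Longrightarrow\;\;
x^2 + 2x - \bigl(1 + 2\ln 2\bigr) = 0
\;\;\Longrightarrow\;\;
x = -1 \pm \sqrt{2 + 2\ln 2} \approx 0.84,\ -2.84 .
\]
% Decide w_1 on the interval between the two roots (the narrower density
% dominates near its mean) and w_2 outside it.

% (ii) With priors P(w_1) = 0.6, P(w_2) = 0.4: decide w_1 wherever
% P(w_1)\,p(x|w_1) > P(w_2)\,p(x|w_2). The same algebra gives
\[
x^2 + 2x - \Bigl(1 + 2\ln 2 + 4\ln\tfrac{P(w_1)}{P(w_2)}\Bigr) = 0
\;\;\Longrightarrow\;\;
x = -1 \pm \sqrt{2 + 2\ln 2 + 4\ln\tfrac{3}{2}} \approx 1.24,\ -3.24 .
\]
% Both boundary points move outward, so the region assigned to the now
% more probable class w_1 grows.
```

Note that there are two boundary points rather than a single threshold because the class variances differ; with equal variances the quadratic terms would cancel and a single boundary would remain.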

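As a quick numerical cross-check of those roots, here is a small sketch in Python; the helper names gauss and boundary_roots, and the variance-1/variance-2 reading of N(0, 1) and N(1, 2), are assumptions for illustration and not part of the original problem statement.

```python
import numpy as np

def gauss(x, mu, var):
    """Univariate Gaussian density with mean mu and variance var."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def boundary_roots(p1, p2):
    """Roots of P(w1) N(x; 0, 1) = P(w2) N(x; 1, 2), i.e. of
    x^2 + 2x - (1 + 2 ln 2 + 4 ln(P(w1)/P(w2))) = 0."""
    c = 1 + 2 * np.log(2) + 4 * np.log(p1 / p2)
    return -1 - np.sqrt(1 + c), -1 + np.sqrt(1 + c)

for p1, p2 in [(0.5, 0.5), (0.6, 0.4)]:
    lo, hi = boundary_roots(p1, p2)
    # At each root the two prior-weighted densities should agree (up to rounding).
    for x in (lo, hi):
        assert abs(p1 * gauss(x, 0, 1) - p2 * gauss(x, 1, 2)) < 1e-12
    print(f"P(w1)={p1}: decide w1 for {lo:.3f} < x < {hi:.3f}, w2 otherwise")
```

Running this prints boundaries of roughly (-2.840, 0.840) for equal priors and (-3.238, 1.238) for P(w1) = 0.6, matching the closed-form roots above.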