
3. Consider the following data set (PlayTennis training examples):

Day  Outlook   Temperature  Humidity  Wind    PlayTennis
D1   Sunny     Hot          High      Weak    No
D2   Sunny     Hot          High      Strong  No
D3   Overcast  Hot          High      Weak    Yes
D4   Rain      Mild         High      Weak    Yes
D5   Rain      Cool         Normal    Weak    Yes
D6   Rain      Cool         Normal    Strong  No
D7   Overcast  Cool         Normal    Strong  Yes
D8   Sunny     Mild         High      Weak    No
D9   Sunny     Cool         Normal    Weak    Yes
D10  Rain      Mild         Normal    Weak    Yes
D11  Sunny     Mild         Normal    Strong  Yes
D12  Overcast  Mild         High      Strong  Yes
D13  Overcast  Hot          Normal    Weak    Yes
D14  Rain      Mild         High      Strong  No

Answer the following questions with respect to Naive Bayes classification. Assume that PlayTennis is the binary target class.

a) What happens if we choose a prior on the class and no prior for the individual features? Assume a Beta prior with alpha = 5 and beta = 1 and the Bayesian averaging method discussed in class. Given this Beta prior, for a new instance x = (Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Weak), the probability of playing tennis is the same as the probability of not playing tennis.

b) For a new instance x = (Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Weak), the probability of playing tennis is the same as the probability of not playing tennis.

c) For a new instance x = (Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Weak), the probability of playing tennis is greater than the probability of not playing tennis.

d) For a new instance x = (Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Weak), the class-conditional probability P(x | PlayTennis = Yes) is 1/81.
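The quantities that options b) through d) compare can be checked with a short maximum-likelihood (no-prior, no-smoothing) Naive Bayes computation over the table above. This is an illustrative sketch, not part of the question; the helper names `nb_likelihood` and `nb_score` are my own:

```python
# PlayTennis training examples from the question, one tuple per day:
# (Outlook, Temperature, Humidity, Wind, PlayTennis)
DATA = [
    ("Sunny",    "Hot",  "High",   "Weak",   "No"),
    ("Sunny",    "Hot",  "High",   "Strong", "No"),
    ("Overcast", "Hot",  "High",   "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny",    "Mild", "High",   "Weak",   "No"),
    ("Sunny",    "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "Normal", "Weak",   "Yes"),
    ("Sunny",    "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High",   "Strong", "Yes"),
    ("Overcast", "Hot",  "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Strong", "No"),
]

def nb_likelihood(x, label):
    """Class-conditional P(x | label) under the Naive Bayes independence
    assumption, with maximum-likelihood (count-based) estimates."""
    rows = [r for r in DATA if r[4] == label]
    p = 1.0
    for i, value in enumerate(x):
        p *= sum(1 for r in rows if r[i] == value) / len(rows)
    return p

def nb_score(x, label):
    """Unnormalized posterior P(label) * P(x | label)."""
    prior = sum(1 for r in DATA if r[4] == label) / len(DATA)
    return prior * nb_likelihood(x, label)

x = ("Sunny", "Cool", "High", "Weak")
print(nb_likelihood(x, "Yes"))  # 2/9 * 3/9 * 3/9 * 6/9 = 4/243 ~ 0.0165
print(nb_score(x, "Yes"))       # (9/14) * 4/243 ~ 0.0106
print(nb_score(x, "No"))        # (5/14) * (3/5 * 1/5 * 4/5 * 2/5) ~ 0.0137
```

Under these maximum-likelihood estimates, the unnormalized posterior for No exceeds the one for Yes, which is the comparison options b) and c) ask about, and P(x | Yes) works out to 4/243 rather than the 1/81 stated in option d). For option a), if the Bayesian-averaged class prior is taken as P(Yes) = (9 + 5) / (14 + 5 + 1) = 0.7 (an assumption about what "Bayesian averaging" means here), the two unnormalized posteriors come out nearly equal.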
