Question: 6.6. Draw the Bayesian belief network that represents the conditional independence assumptions of the naive Bayes classifier for the PlayTennis problem of Section 6.9.1. Give the conditional probability table associated with the node Wind.
6.9.1 An Illustrative Example

Let us apply the naive Bayes classifier to a concept learning problem we considered during our discussion of decision tree learning: classifying days according to whether someone will play tennis. Table 3.2 from Chapter 3 provides a set of 14 training examples of the target concept PlayTennis, where each day is described by the attributes Outlook, Temperature, Humidity, and Wind. Here we use the naive Bayes classifier and the training data from this table to classify the following novel instance:

Outlook = sunny, Temperature = cool, Humidity = high, Wind = strong

Our task is to predict the target value (yes or no) of the target concept PlayTennis for this new instance. Instantiating Equation (6.20) to fit the current task, the target value vNB is given by

vNB = argmax over vj in {yes, no} of P(vj) * product over i of P(ai | vj)
    = argmax over vj in {yes, no} of P(vj) P(Outlook = sunny | vj) P(Temperature = cool | vj) P(Humidity = high | vj) P(Wind = strong | vj)

Notice in the final expression that ai has been instantiated using the particular attribute values of the new instance. To calculate vNB we now require 10 probabilities that can be estimated from the training data. First, the probabilities of the different target values can easily be estimated based on their frequencies over the 14 training examples:

P(PlayTennis = yes) = 9/14 = .64
P(PlayTennis = no) = 5/14 = .36

Similarly, we can estimate the conditional probabilities.
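The frequency-based estimates above can be sketched in Python. The rows below are a transcription of the PlayTennis training set as it is commonly reproduced from Table 3.2; treat the exact rows as an assumption rather than a quotation of the table.

```python
from collections import Counter

# PlayTennis training set as commonly reproduced from Table 3.2 (assumed rows).
# Attribute order: Outlook, Temperature, Humidity, Wind, PlayTennis.
data = [
    ("sunny",    "hot",  "high",   "weak",   "no"),
    ("sunny",    "hot",  "high",   "strong", "no"),
    ("overcast", "hot",  "high",   "weak",   "yes"),
    ("rain",     "mild", "high",   "weak",   "yes"),
    ("rain",     "cool", "normal", "weak",   "yes"),
    ("rain",     "cool", "normal", "strong", "no"),
    ("overcast", "cool", "normal", "strong", "yes"),
    ("sunny",    "mild", "high",   "weak",   "no"),
    ("sunny",    "cool", "normal", "weak",   "yes"),
    ("rain",     "mild", "normal", "weak",   "yes"),
    ("sunny",    "mild", "normal", "strong", "yes"),
    ("overcast", "mild", "high",   "strong", "yes"),
    ("overcast", "hot",  "normal", "weak",   "yes"),
    ("rain",     "mild", "high",   "strong", "no"),
]

label_counts = Counter(row[-1] for row in data)  # {'yes': 9, 'no': 5}

def prior(v):
    """P(PlayTennis = v), estimated by relative frequency."""
    return label_counts[v] / len(data)

def cond(attr_index, value, v):
    """P(attribute = value | PlayTennis = v), estimated by relative frequency."""
    matches = sum(1 for row in data if row[-1] == v and row[attr_index] == value)
    return matches / label_counts[v]
```

With these two helpers, `prior("yes")` reproduces 9/14 = .64 and `cond(3, "strong", "yes")` reproduces 3/9 = .33, matching the estimates in the text.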
For example, those for Wind = strong are

P(Wind = strong | PlayTennis = yes) = 3/9 = .33
P(Wind = strong | PlayTennis = no) = 3/5 = .60

Using these probability estimates and similar estimates for the remaining attribute values, we calculate vNB according to Equation (6.21) as follows (now omitting attribute names for brevity):

P(yes) P(sunny|yes) P(cool|yes) P(high|yes) P(strong|yes) = .0053
P(no) P(sunny|no) P(cool|no) P(high|no) P(strong|no) = .0206

Thus, the naive Bayes classifier assigns the target value PlayTennis = no to this new instance, based on the probability estimates learned from the training data. Furthermore, by normalizing the above quantities to sum to one, we can calculate the conditional probability that the target value is no, given the observed attribute values. For the current example, this probability is

.0206 / (.0206 + .0053) = .795
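The arithmetic above can be checked with a short Python sketch. Only the Wind estimates are listed explicitly in the text, so the other conditional fractions used here (e.g. P(sunny | yes) = 2/9) are assumptions read off the same training table.

```python
# Probability estimates for the instance <sunny, cool, high, strong>.
# Only the prior and Wind fractions appear in the text; the rest are
# assumed frequency estimates from the same 14-example training table.
p = {
    "yes": {"prior": 9/14, "sunny": 2/9, "cool": 3/9, "high": 3/9, "strong": 3/9},
    "no":  {"prior": 5/14, "sunny": 3/5, "cool": 1/5, "high": 4/5, "strong": 3/5},
}

def score(v):
    """Unnormalized P(v) * product of P(ai | v) for the new instance."""
    est = p[v]
    return est["prior"] * est["sunny"] * est["cool"] * est["high"] * est["strong"]

s_yes = score("yes")   # approximately .0053
s_no = score("no")     # approximately .0206
v_nb = "yes" if s_yes > s_no else "no"

# Normalizing the two unnormalized scores gives the posterior for "no".
posterior_no = s_no / (s_yes + s_no)
```

Running this reproduces the text's numbers: the score for no exceeds the score for yes, so vNB = no, and the normalized posterior for no comes out to about .795.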
