
Part 1. Text Reading:
Russell, S., & Norvig, P. Artificial Intelligence: A Modern Approach (3rd ed.). Pearson.
Decision Trees (Chap. 18, Sec. 18.3); Reasoning with Uncertainty (Chaps. 13 and 14)
Course Slides
Part 2. Problems:
(Note: Please cite any external reference materials other than the textbook, using the APA format where appropriate.)
Problem 2.1: Decision Tree
For this question you need to refer to the decision tree section in the Course Slides (pp. 64-77 of the PDF file) posted on Blackboard.
One major issue for any decision tree algorithm is how to choose the attribute on which to split the data set so that a well-balanced tree is created. The most traditional approach is the ID3 algorithm, proposed by Quinlan in 1986. The detailed ID3 algorithm is shown in the slides, and the textbook discusses it in Section 18.3. For this problem, please follow the ID3 algorithm and manually calculate the values based on a data set similar to (but not the same as) the one in the slides (p. 74). This exercise should help you gain deep insight into the execution of the ID3 algorithm. Please note that the concepts discussed here (for example, entropy and information gain) are also fundamental in information theory and signal processing. The new data set is shown below. In this example, row 9 has been removed from the original set and all other rows remain the same.
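For reference, the quantities you are asked to compute use the standard ID3 definitions (this restatement is consistent with Section 18.3 of the textbook, though the notation may differ slightly from the slides):

\mathrm{Entropy}(S) = -\sum_{c \in \mathit{Classes}} p_c \log_2 p_c

\mathrm{Gain}(S, A) = \mathrm{Entropy}(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, \mathrm{Entropy}(S_v)

where p_c is the fraction of examples in S whose decision is class c, and S_v is the subset of S for which attribute A takes the value v.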
Following the conventions used in the slides, please show the manual process and calculate the following values: Entropy(S), Entropy(S_weather=sunny), Entropy(S_weather=windy), Entropy(S_weather=rainy), Gain(S, weather), Gain(S, parents), and Gain(S, money). Based on the last three values, which attribute should be chosen to split on?
Please show the detailed process by which you obtain the solutions. (A short script for sanity-checking your numbers appears after the table.)
Weekend  Weather  Parents  Money  Decision (Category)
W1       Sunny    Yes      Rich   Cinema
W2       Sunny    No       Rich   Tennis
W3       Windy    Yes      Rich   Cinema
W4       Rainy    Yes      Poor   Cinema
W5       Rainy    No       Rich   Stay in
W6       Rainy    Yes      Poor   Cinema
W7       Windy    No       Poor   Cinema
W8       Windy    No       Rich   Shopping
W9       Sunny    No       Rich   Tennis
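As a sanity check only (the assignment asks for a manual calculation), here is a minimal Python sketch that computes the same quantities from the table above. The helper names entropy and gain are illustrative, not part of the assignment.

from collections import Counter
from math import log2

# Rows of the table above: (weather, parents, money, decision).
rows = [
    ("sunny", "yes", "rich", "cinema"),    # W1
    ("sunny", "no",  "rich", "tennis"),    # W2
    ("windy", "yes", "rich", "cinema"),    # W3
    ("rainy", "yes", "poor", "cinema"),    # W4
    ("rainy", "no",  "rich", "stay in"),   # W5
    ("rainy", "yes", "poor", "cinema"),    # W6
    ("windy", "no",  "poor", "cinema"),    # W7
    ("windy", "no",  "rich", "shopping"),  # W8
    ("sunny", "no",  "rich", "tennis"),    # W9
]
ATTRS = {"weather": 0, "parents": 1, "money": 2}

def entropy(subset):
    # Entropy of the decision labels (last column) in a list of rows.
    counts = Counter(r[-1] for r in subset)
    total = len(subset)
    return -sum((n / total) * log2(n / total) for n in counts.values())

def gain(data, attr):
    # Information gain of splitting the data on the named attribute.
    i = ATTRS[attr]
    remainder = 0.0
    for v in {r[i] for r in data}:
        sub = [r for r in data if r[i] == v]
        remainder += len(sub) / len(data) * entropy(sub)
    return entropy(data) - remainder

print("Entropy(S) =", entropy(rows))
for v in ("sunny", "windy", "rainy"):
    print(f"Entropy(S_weather={v}) =",
          entropy([r for r in rows if r[0] == v]))
for a in ATTRS:
    print(f"Gain(S, {a}) =", gain(rows, a))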
Problem 2.2: Decision Tree [30 points]
The Decision Tree inductive learning algorithm may be used to generate "IF ... THEN" rules that are consistent with a set of given examples. Consider an example where 10 binary input variables, X1, X2, ..., X10, are used to classify a binary output variable (Y).
(i) At most how many examples do we need to exhaustively enumerate every possible combination of inputs?
(ii) At most how many leaf nodes can a decision tree have if it is consistent with a training set containing 100 examples?
Please show the detailed process by which you obtain the solutions.
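For part (i), the idea of exhaustively enumerating every combination of binary inputs can be illustrated with a minimal Python sketch (assuming the 10 inputs from the problem statement; the printed count is the quantity the question asks about):

from itertools import product

# Every possible assignment of the 10 binary inputs X1..X10.
combinations = list(product([0, 1], repeat=10))
print(len(combinations))  # 2**10 distinct input combinations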
Problem 2.3: Bayesian Belief Networks [30 points]
A quality control manager has used the C4.5 algorithm to come up with rules that classify items based on several input factors. The output has two classes: Accept and Reject. Test results with the rule set indicate that 5% of the good items are classified as Reject and 3% of the bad items are classified as Accept.
Historical data suggest that 2% of the items are bad. Based on this information, what is the conditional probability that:
(i) An item classified as Reject is actually good?
(ii) An item classified as Accept is actually bad?
Please show the detailed process by which you obtain the solutions.
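A minimal sketch, assuming Python, of the Bayes-rule computation these questions call for; the variable names are illustrative, and the printed values should match what you derive by hand:

# Prior probabilities from the historical data.
p_bad = 0.02
p_good = 1 - p_bad

# Classifier error rates from the test results.
p_reject_given_good = 0.05  # good items classified as Reject
p_accept_given_bad = 0.03   # bad items classified as Accept
p_accept_given_good = 1 - p_reject_given_good
p_reject_given_bad = 1 - p_accept_given_bad

# Total probability of each classification outcome.
p_reject = p_reject_given_good * p_good + p_reject_given_bad * p_bad
p_accept = p_accept_given_good * p_good + p_accept_given_bad * p_bad

# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B).
p_good_given_reject = p_reject_given_good * p_good / p_reject  # part (i)
p_bad_given_accept = p_accept_given_bad * p_bad / p_accept     # part (ii)
print(p_good_given_reject, p_bad_given_accept)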