
Decision tree root

Mathematics Asked by John.L on February 17, 2021

I have a set of examples in this table: https://imgur.com/a/mBc2td4. There are 3 attributes F1, F2, F3 with possible values "a", "b", "c", plus a binary Output column. I am going to build a decision tree on this table and I am trying to find which attribute (F1, F2, or F3) should be the root of the tree. Here is what I have tried using entropy and information gain.

$E(\text{Output}) = E(3,3) = E(0.5,0.5) = -0.5\log_2 0.5 - 0.5\log_2 0.5 = \tfrac12 + \tfrac12 = 1$
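The base-2 entropy used throughout can be sketched as a small helper, e.g. in Python (the function name is my own):

```python
from math import log2

def binary_entropy(p_pos, p_neg):
    """Shannon entropy (base 2) of a two-class distribution.
    A term with probability 0 contributes 0 (the 0*log 0 convention)."""
    return -sum(p * log2(p) for p in (p_pos, p_neg) if p > 0)

print(binary_entropy(0.5, 0.5))  # prints 1.0: a 50/50 split is maximally uncertain
```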

For F1:

$E(\text{Output},F1) = P(a) \cdot E(2,1) + P(b) \cdot E(1,2)
= \tfrac12 \left( -\tfrac23 \log_2 \tfrac23 - \tfrac13 \log_2 \tfrac13 \right) + \tfrac12 \left( -\tfrac13 \log_2 \tfrac13 - \tfrac23 \log_2 \tfrac23 \right)
= -\tfrac13 \log_2 \tfrac13 - \tfrac23 \log_2 \tfrac23 \approx 0.918$

$Gain(\text{Output},F1) = E(\text{Output}) - E(\text{Output},F1) = 1 - 0.918 = 0.082$

For F2:

$E(\text{Output},F2) = P(a) \cdot E(1,1) + P(b) \cdot E(0,2) + P(c) \cdot E(2,0)
= \tfrac13 \left( -\tfrac12 \log_2 \tfrac12 - \tfrac12 \log_2 \tfrac12 \right) + \tfrac13 \cdot 0 + \tfrac13 \cdot 0
= \tfrac13 \cdot \left( \tfrac12 + \tfrac12 \right) = \tfrac13 \approx 0.333$

$Gain(\text{Output},F2) = E(\text{Output}) - E(\text{Output},F2) = 1 - 0.333 = 0.667$

For F3:

$E(\text{Output},F3) = P(a) \cdot E(1,1) + P(c) \cdot E(2,2)
= \tfrac13 \left( -\tfrac12 \log_2 \tfrac12 - \tfrac12 \log_2 \tfrac12 \right) + \tfrac23 \left( -\tfrac12 \log_2 \tfrac12 - \tfrac12 \log_2 \tfrac12 \right)
= \tfrac13 \cdot 1 + \tfrac23 \cdot 1 = 1$

$Gain(\text{Output},F3) = E(\text{Output}) - E(\text{Output},F3) = 1 - 1 = 0$

So the biggest gain comes from attribute F2, so we should put it at the root, but I am not sure whether this is the correct answer. Is my solution correct?
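The three gains can also be checked programmatically. A minimal sketch in Python, where the function names are my own and the per-value class counts are read off the arithmetic above (not the linked table directly):

```python
from math import log2

def entropy(pos, neg):
    """Binary entropy (base 2) from class counts; 0 if one class is empty."""
    total = pos + neg
    if pos == 0 or neg == 0:
        return 0.0
    p, n = pos / total, neg / total
    return -p * log2(p) - n * log2(n)

def gain(splits, total_pos, total_neg):
    """Information gain = parent entropy minus weighted child entropy.
    splits: one (pos, neg) count pair per attribute value."""
    total = total_pos + total_neg
    remainder = sum((p + n) / total * entropy(p, n) for p, n in splits)
    return entropy(total_pos, total_neg) - remainder

# Class counts per attribute value (3 positive, 3 negative examples overall).
f1 = [(2, 1), (1, 2)]          # values a, b
f2 = [(1, 1), (0, 2), (2, 0)]  # values a, b, c
f3 = [(1, 1), (2, 2)]          # values a, c

for name, splits in [("F1", f1), ("F2", f2), ("F3", f3)]:
    print(name, round(gain(splits, 3, 3), 3))  # F1 0.082, F2 0.667, F3 0.0
```

F2 indeed maximizes the gain, which is why it ends up at the root.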
