A Step by Step CART Decision Tree Example


I recently came across an article that walks through the Gini index calculations of a decision tree in great detail. It is written very clearly, so I am sharing it here. It is in English, but the content is fairly simple and should not be hard to follow.

Original article: https://sefiks.com/2018/08/27/a-step-by-step-cart-decision-tree-example/


An algorithm can be transparent only if its decisions can be read and understood clearly by people. Even though deep learning is the superstar of machine learning nowadays, it is an opaque algorithm and we do not know the reasons behind its decisions. Decision tree algorithms, on the other hand, remain popular because they produce transparent decisions. ID3 uses information gain and C4.5 uses the gain ratio for splitting. CART is an alternative decision tree building algorithm that can handle both classification and regression tasks. For classification, it uses a metric called the gini index to create decision points. We will walk through a step by step CART decision tree example by hand, from scratch.

We will work on the same dataset as in the ID3 example. There are 14 instances of golf playing decisions based on outlook, temperature, humidity and wind.


 

| Day | Outlook  | Temp. | Humidity | Wind   | Decision |
|-----|----------|-------|----------|--------|----------|
| 1   | Sunny    | Hot   | High     | Weak   | No       |
| 2   | Sunny    | Hot   | High     | Strong | No       |
| 3   | Overcast | Hot   | High     | Weak   | Yes      |
| 4   | Rain     | Mild  | High     | Weak   | Yes      |
| 5   | Rain     | Cool  | Normal   | Weak   | Yes      |
| 6   | Rain     | Cool  | Normal   | Strong | No       |
| 7   | Overcast | Cool  | Normal   | Strong | Yes      |
| 8   | Sunny    | Mild  | High     | Weak   | No       |
| 9   | Sunny    | Cool  | Normal   | Weak   | Yes      |
| 10  | Rain     | Mild  | Normal   | Weak   | Yes      |
| 11  | Sunny    | Mild  | Normal   | Strong | Yes      |
| 12  | Overcast | Mild  | High     | Strong | Yes      |
| 13  | Overcast | Hot   | Normal   | Weak   | Yes      |
| 14  | Rain     | Mild  | High     | Strong | No       |
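
If you would like to follow the calculations in code, the table above can be written down as plain Python data. This is only a convenience for the snippets that follow; the names `dataset`, `FEATURES` and `LABEL` are chosen here for illustration and are not part of the original article.

```python
# The 14 golf-playing instances from the table above.
# Column order: outlook, temperature, humidity, wind, decision.
dataset = [
    ("Sunny",    "Hot",  "High",   "Weak",   "No"),
    ("Sunny",    "Hot",  "High",   "Strong", "No"),
    ("Overcast", "Hot",  "High",   "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny",    "Mild", "High",   "Weak",   "No"),
    ("Sunny",    "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "Normal", "Weak",   "Yes"),
    ("Sunny",    "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High",   "Strong", "Yes"),
    ("Overcast", "Hot",  "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Strong", "No"),
]

# Map feature names to column positions; the last column is the class label.
FEATURES = {"outlook": 0, "temperature": 1, "humidity": 2, "wind": 3}
LABEL = 4
```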

Gini index

The gini index is the metric CART uses for classification tasks. It measures impurity as one minus the sum of the squared class probabilities, as shown below.

Gini = 1 – Σ (Pᵢ)², for i = 1 to the number of classes
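
As a quick sanity check, the formula is only a few lines of Python. This is a minimal sketch, assuming the class labels arrive as a plain list; the later snippets reuse this `gini` helper.

```python
from collections import Counter

def gini(labels):
    """Gini index of a list of class labels: 1 - sum of squared class probabilities."""
    total = len(labels)
    if total == 0:
        return 0.0
    return 1.0 - sum((count / total) ** 2 for count in Counter(labels).values())

# Example: a node with 2 yes and 3 no decisions, as in the sunny branch below.
print(round(gini(["Yes", "Yes", "No", "No", "No"]), 2))  # 0.48
```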

Outlook

Outlook is a nominal feature. It can be sunny, overcast or rain. I will summarize the decisions for the outlook feature.

| Outlook  | Yes | No | Number of instances |
|----------|-----|----|---------------------|
| Sunny    | 2   | 3  | 5                   |
| Overcast | 4   | 0  | 4                   |
| Rain     | 3   | 2  | 5                   |

Gini(Outlook=Sunny) = 1 – (2/5)² – (3/5)² = 1 – 0.16 – 0.36 = 0.48

Gini(Outlook=Overcast) = 1 – (4/4)² – (0/4)² = 0

Gini(Outlook=Rain) = 1 – (3/5)² – (2/5)² = 1 – 0.36 – 0.16 = 0.48

Then, we calculate the weighted sum of the gini indexes for the outlook feature.

Gini(Outlook) = (5/14) x 0.48 + (4/14) x 0 + (5/14) x 0.48 = 0.171 + 0 + 0.171 = 0.342
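
The same weighted-sum step can be expressed as a small helper. A sketch that reuses the `dataset`, `FEATURES` and `gini` definitions from the earlier snippets; the tiny difference from 0.342 only comes from the text rounding each per-value term before summing.

```python
def gini_for_feature(rows, feature_index, label_index=4):
    """Weighted sum of the gini indexes over the values of one feature."""
    total = len(rows)
    weighted = 0.0
    for value in {row[feature_index] for row in rows}:
        branch_labels = [row[label_index] for row in rows if row[feature_index] == value]
        weighted += (len(branch_labels) / total) * gini(branch_labels)
    return weighted

print(round(gini_for_feature(dataset, FEATURES["outlook"]), 3))  # 0.343
```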

Temperature

Similarly, temperature is a nominal feature and it can take 3 different values: Cool, Hot and Mild. Let’s summarize the decisions for the temperature feature.

| Temperature | Yes | No | Number of instances |
|-------------|-----|----|---------------------|
| Hot         | 2   | 2  | 4                   |
| Cool        | 3   | 1  | 4                   |
| Mild        | 4   | 2  | 6                   |

Gini(Temp=Hot) = 1 – (2/4)² – (2/4)² = 0.5

Gini(Temp=Cool) = 1 – (3/4)² – (1/4)² = 1 – 0.5625 – 0.0625 = 0.375

Gini(Temp=Mild) = 1 – (4/6)² – (2/6)² = 1 – 0.444 – 0.111 = 0.445

Next, we calculate the weighted sum of the gini indexes for the temperature feature.

Gini(Temp) = (4/14) x 0.5 + (4/14) x 0.375 + (6/14) x 0.445 = 0.142 + 0.107 + 0.190 = 0.439

Humidity

Humidity is a binary feature. It can be high or normal.

| Humidity | Yes | No | Number of instances |
|----------|-----|----|---------------------|
| High     | 3   | 4  | 7                   |
| Normal   | 6   | 1  | 7                   |

Gini(Humidity=High) = 1 – (3/7)² – (4/7)² = 1 – 0.184 – 0.327 = 0.489

Gini(Humidity=Normal) = 1 – (6/7)² – (1/7)² = 1 – 0.735 – 0.020 = 0.245

The weighted sum for the humidity feature is calculated next.

Gini(Humidity) = (7/14) x 0.489 + (7/14) x 0.245 = 0.367

Wind

Wind is a binary feature, similar to humidity. It can be weak or strong.

| Wind   | Yes | No | Number of instances |
|--------|-----|----|---------------------|
| Weak   | 6   | 2  | 8                   |
| Strong | 3   | 3  | 6                   |

Gini(Wind=Weak) = 1 – (6/8)² – (2/8)² = 1 – 0.5625 – 0.0625 = 0.375

Gini(Wind=Strong) = 1 – (3/6)² – (3/6)² = 1 – 0.25 – 0.25 = 0.5

Gini(Wind) = (8/14) x 0.375 + (6/14) x 0.5 = 0.428

Time to decide

We’ve calculated the gini index values for each feature. The winner is the outlook feature because it has the lowest gini index.

| Feature     | Gini index |
|-------------|------------|
| Outlook     | 0.342      |
| Temperature | 0.439      |
| Humidity    | 0.367      |
| Wind        | 0.428      |
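
Under the same assumptions as the earlier snippets, this ranking can be reproduced in one loop; the small differences from the table are just rounding.

```python
# Score every feature on the full dataset and pick the smallest gini index.
scores = {name: gini_for_feature(dataset, index) for name, index in FEATURES.items()}
best_feature = min(scores, key=scores.get)

print({name: round(value, 3) for name, value in scores.items()})
# outlook ≈ 0.343, temperature ≈ 0.440, humidity ≈ 0.367, wind ≈ 0.429
print(best_feature)  # 'outlook'
```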

We’ll put the outlook decision at the top of the tree.

[Figure: the first split is on the outlook feature]

You might notice that the sub dataset in the overcast leaf contains only yes decisions. This means that the overcast branch is already a pure leaf and needs no further splitting.

[Figure: the tree is complete for the overcast outlook leaf]

We will apply the same principles to those sub datasets in the following steps.

Let’s focus on the sub dataset for sunny outlook first. We need to find the gini index scores for the temperature, humidity and wind features respectively.

| Day | Outlook | Temp. | Humidity | Wind   | Decision |
|-----|---------|-------|----------|--------|----------|
| 1   | Sunny   | Hot   | High     | Weak   | No       |
| 2   | Sunny   | Hot   | High     | Strong | No       |
| 8   | Sunny   | Mild  | High     | Weak   | No       |
| 9   | Sunny   | Cool  | Normal   | Weak   | Yes      |
| 11  | Sunny   | Mild  | Normal   | Strong | Yes      |

Gini of temperature for sunny outlook

| Temperature | Yes | No | Number of instances |
|-------------|-----|----|---------------------|
| Hot         | 0   | 2  | 2                   |
| Cool        | 1   | 0  | 1                   |
| Mild        | 1   | 1  | 2                   |

Gini(Outlook=Sunny and Temp.=Hot) = 1 – (0/2)² – (2/2)² = 0

Gini(Outlook=Sunny and Temp.=Cool) = 1 – (1/1)² – (0/1)² = 0

Gini(Outlook=Sunny and Temp.=Mild) = 1 – (1/2)² – (1/2)² = 1 – 0.25 – 0.25 = 0.5

Gini(Outlook=Sunny and Temp.) = (2/5) x 0 + (1/5) x 0 + (2/5) x 0.5 = 0.2

Gini of humidity for sunny outlook

| Humidity | Yes | No | Number of instances |
|----------|-----|----|---------------------|
| High     | 0   | 3  | 3                   |
| Normal   | 2   | 0  | 2                   |

Gini(Outlook=Sunny and Humidity=High) = 1 – (0/3)² – (3/3)² = 0

Gini(Outlook=Sunny and Humidity=Normal) = 1 – (2/2)² – (0/2)² = 0

Gini(Outlook=Sunny and Humidity) = (3/5) x 0 + (2/5) x 0 = 0

Gini of wind for sunny outlook

| Wind   | Yes | No | Number of instances |
|--------|-----|----|---------------------|
| Weak   | 1   | 2  | 3                   |
| Strong | 1   | 1  | 2                   |

Gini(Outlook=Sunny and Wind=Weak) = 1 – (1/3)² – (2/3)² = 0.444

Gini(Outlook=Sunny and Wind=Strong) = 1 – (1/2)² – (1/2)² = 0.5

Gini(Outlook=Sunny and Wind) = (3/5) x 0.444 + (2/5) x 0.5 = 0.266 + 0.2 = 0.466

Decision for sunny outlook

We’ve calculated the gini index scores for each feature when the outlook is sunny. The winner is humidity because it has the lowest value.

| Feature     | Gini index |
|-------------|------------|
| Temperature | 0.2        |
| Humidity    | 0          |
| Wind        | 0.466      |
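
The sunny branch is just the same procedure applied to a filtered dataset; here is a sketch reusing the helpers above.

```python
# Keep only the Outlook = Sunny rows and score the remaining features on them.
sunny_rows = [row for row in dataset if row[FEATURES["outlook"]] == "Sunny"]
for name in ("temperature", "humidity", "wind"):
    print(name, round(gini_for_feature(sunny_rows, FEATURES[name]), 3))
# temperature ≈ 0.2, humidity = 0.0, wind ≈ 0.467 -> humidity is the winner
```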

We’ll put a humidity check at the extension of the sunny outlook branch.

[Figure: sub datasets for high and normal humidity]

As seen, the decision is always no for high humidity and sunny outlook, and always yes for normal humidity and sunny outlook. This branch is over.

[Figure: decisions for high and normal humidity]

Now, we need to focus on the rain outlook.

Rain outlook

| Day | Outlook | Temp. | Humidity | Wind   | Decision |
|-----|---------|-------|----------|--------|----------|
| 4   | Rain    | Mild  | High     | Weak   | Yes      |
| 5   | Rain    | Cool  | Normal   | Weak   | Yes      |
| 6   | Rain    | Cool  | Normal   | Strong | No       |
| 10  | Rain    | Mild  | Normal   | Weak   | Yes      |
| 14  | Rain    | Mild  | High     | Strong | No       |

We’ll calculate the gini index scores for the temperature, humidity and wind features when the outlook is rain.

Gini of temperature for rain outlook

| Temperature | Yes | No | Number of instances |
|-------------|-----|----|---------------------|
| Cool        | 1   | 1  | 2                   |
| Mild        | 2   | 1  | 3                   |

Gini(Outlook=Rain and Temp.=Cool) = 1 – (1/2)² – (1/2)² = 0.5

Gini(Outlook=Rain and Temp.=Mild) = 1 – (2/3)² – (1/3)² = 0.444

Gini(Outlook=Rain and Temp.) = (2/5) x 0.5 + (3/5) x 0.444 = 0.466

Gini of humidity for rain outlook

| Humidity | Yes | No | Number of instances |
|----------|-----|----|---------------------|
| High     | 1   | 1  | 2                   |
| Normal   | 2   | 1  | 3                   |

Gini(Outlook=Rain and Humidity=High) = 1 – (1/2)² – (1/2)² = 0.5

Gini(Outlook=Rain and Humidity=Normal) = 1 – (2/3)² – (1/3)² = 0.444

Gini(Outlook=Rain and Humidity) = (2/5) x 0.5 + (3/5) x 0.444 = 0.466

Gini of wind for rain outlook

| Wind   | Yes | No | Number of instances |
|--------|-----|----|---------------------|
| Weak   | 3   | 0  | 3                   |
| Strong | 0   | 2  | 2                   |

Gini(Outlook=Rain and Wind=Weak) = 1 – (3/3)² – (0/3)² = 0

Gini(Outlook=Rain and Wind=Strong) = 1 – (0/2)² – (2/2)² = 0

Gini(Outlook=Rain and Wind) = (3/5) x 0 + (2/5) x 0 = 0

Decision for rain outlook

The winner is the wind feature for the rain outlook because it has the minimum gini index score among the features.

| Feature     | Gini index |
|-------------|------------|
| Temperature | 0.466      |
| Humidity    | 0.466      |
| Wind        | 0          |

Put the wind feature on the rain outlook branch and look at the new sub datasets.

[Figure: sub datasets for weak and strong wind under the rain outlook]

As seen, the decision is always yes when the wind is weak, and always no when the wind is strong. This means that this branch is over too.
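
At this point every branch ends in a pure leaf, so the whole procedure we followed by hand (score the features with the gini index, split on the winner, stop when a node is pure) fits in a short recursive function. This is a minimal sketch built on the helpers above, returning the tree as nested dicts; it is an illustration of the idea, not a reference CART implementation (real CART also handles binary splits on numeric features, pruning and regression).

```python
def build_tree(rows, feature_names):
    """Recursively split `rows` on the feature with the lowest weighted gini index."""
    labels = [row[LABEL] for row in rows]
    # Stop when the node is pure or no features are left; return the majority class.
    if gini(labels) == 0.0 or not feature_names:
        return Counter(labels).most_common(1)[0][0]
    best = min(feature_names, key=lambda name: gini_for_feature(rows, FEATURES[name]))
    remaining = [name for name in feature_names if name != best]
    branches = {}
    for value in {row[FEATURES[best]] for row in rows}:
        subset = [row for row in rows if row[FEATURES[best]] == value]
        branches[value] = build_tree(subset, remaining)
    return {best: branches}

print(build_tree(dataset, list(FEATURES)))
# {'outlook': {'Overcast': 'Yes',
#              'Sunny': {'humidity': {'High': 'No', 'Normal': 'Yes'}},
#              'Rain': {'wind': {'Weak': 'Yes', 'Strong': 'No'}}}}
# (branch order may vary; the structure matches the tree built by hand)
```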

[Figure: final form of the decision tree built by the CART algorithm]

So, the decision tree building is complete. We have built a decision tree by hand. By the way, you might notice that we have created exactly the same tree as in the ID3 example. This does not mean that ID3 and CART always produce the same tree; we just got lucky with this dataset. Finally, I believe CART is easier to follow than ID3 and C4.5, isn’t it?
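
If you would rather double-check the result with a library, scikit-learn’s DecisionTreeClassifier uses the gini criterion by default. A minimal sketch, assuming the `dataset` list from the first snippet and that pandas and scikit-learn are installed; note that scikit-learn builds binary splits on one-hot encoded columns, so the learned tree has a different shape than the multi-way tree above, even though it is also gini-based and also fits these 14 rows perfectly.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# One-hot encode the nominal features, since sklearn trees expect numeric input.
df = pd.DataFrame(dataset, columns=["outlook", "temp", "humidity", "wind", "decision"])
X = pd.get_dummies(df.drop(columns="decision"))
y = df["decision"]

clf = DecisionTreeClassifier(criterion="gini", random_state=0).fit(X, y)
print(clf.score(X, y))                                  # 1.0 on the 14 training rows
print(export_text(clf, feature_names=list(X.columns)))  # inspect the learned splits
```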
