
Hyper-parameter Tuning of a Decision Tree Induction Algorithm

Author(s):
Mantovani, Rafael G. ; Horvath, Tomas ; Cerri, Ricardo ; Vanschoren, Joaquin ; de Carvalho, Andre C. P. L. F. ; IEEE
Total Authors: 6
Document type: Journal article
Source: PROCEEDINGS OF 2016 5TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS 2016); v. N/A, p. 6-pg., 2016-01-01.
Abstract

Supervised classification is the most studied task in Machine Learning. Among the many algorithms used for this task, Decision Tree algorithms are a popular choice, since they are robust and efficient to construct. Moreover, they have the advantage of producing comprehensible models and satisfactory accuracy levels in several application domains. Like most Machine Learning methods, these algorithms have hyper-parameters whose values directly affect the performance of the induced models. Due to the high number of possible hyper-parameter values, several studies use optimization techniques to find a good set of solutions and thus produce classifiers with good predictive performance. This study investigates how sensitive decision trees are to a hyper-parameter optimization process. Four different tuning techniques were explored to adjust the hyper-parameters of the J48 Decision Tree algorithm. In total, experiments on 102 heterogeneous datasets analyzed the effect of tuning on the induced models. The experimental results show that, even though the average improvement over all datasets is low, in most cases the improvement is statistically significant. (AU)
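To illustrate the kind of tuning the abstract describes, the sketch below runs a randomized hyper-parameter search over a decision tree. It uses scikit-learn's DecisionTreeClassifier as a stand-in for Weka's J48 (the paper's algorithm), and the search space, dataset, and budget here are illustrative assumptions, not the paper's actual experimental setup.

```python
# Hedged sketch: randomized hyper-parameter tuning of a decision tree.
# DecisionTreeClassifier stands in for J48; the grid below is an assumption.
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Hyper-parameters roughly analogous to J48's pruning and
# minimum-instances-per-leaf settings.
param_distributions = {
    "max_depth": [2, 3, 5, 8, None],
    "min_samples_leaf": [1, 2, 5, 10],
    "criterion": ["gini", "entropy"],
}

search = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_distributions,
    n_iter=20,
    cv=5,          # cross-validated estimate of predictive performance
    random_state=0,
)
search.fit(X, y)

print(search.best_params_)
```

The paper compares four such tuning techniques; random search is just one simple representative, and on some datasets the gain over default hyper-parameters can be small even when it is statistically significant.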

FAPESP's process: 13/07375-0 - CeMEAI - Center for Mathematical Sciences Applied to Industry
Grantee: Francisco Louzada Neto
Support Opportunities: Research Grants - Research, Innovation and Dissemination Centers - RIDC
FAPESP's process: 15/03986-0 - Use of Meta-learning to improve deep learning algorithms in classification problems
Grantee: Rafael Gomes Mantovani
Support Opportunities: Scholarships abroad - Research Internship - Doctorate
FAPESP's process: 12/23114-9 - Use of meta-learning for parameter tuning for classification problems
Grantee: Rafael Gomes Mantovani
Support Opportunities: Scholarships in Brazil - Doctorate