(Reference retrieved automatically from Web of Science based on the FAPESP grant number cited in the publication by the authors.)

Fine-tuning Deep Belief Networks using Harmony Search

Author(s):
Papa, Joao Paulo [1] ; Scheirer, Walter [2] ; Cox, David Daniel [2]
Total Authors: 3
Affiliation:
[1] UNESP Univ Estadual Paulista, Dept Comp, Bauru - Brazil
[2] Harvard Univ, Ctr Brain Sci, Cambridge, MA 02138 - USA
Total Affiliations: 2
Document type: Journal article
Source: APPLIED SOFT COMPUTING; v. 46, p. 875-885, SEP 2016.
Web of Science Citations: 32
Abstract

In this paper, we address the problem of fine-tuning Deep Belief Network (DBN) parameters by means of a fast meta-heuristic approach named Harmony Search (HS). Although this deep learning technique has been widely used in recent years, detailed studies on how to set its parameters are scarce in the literature. We show that more accurate results can be obtained by comparing HS against several of its variants, a random search, and two variants of the well-known Hyperopt library. The experiments were carried out on two public datasets for the task of binary image reconstruction, considering three DBN learning algorithms and three layers. (C) 2015 Elsevier B.V. All rights reserved. (AU)
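The abstract applies Harmony Search to hyperparameter selection. As context, a minimal sketch of the standard HS loop is given below: a harmony memory of candidate solutions is kept, and each iteration composes a new candidate dimension-by-dimension, either drawing from memory (with probability HMCR, optionally pitch-adjusted with probability PAR) or sampling at random, replacing the worst stored solution when the new one is better. The objective and the two "hyperparameters" in the usage example are hypothetical stand-ins, not the paper's actual DBN setup.

```python
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=200, seed=0):
    """Minimize `objective` over the box `bounds` with basic Harmony Search.

    hms  -- harmony memory size
    hmcr -- harmony memory considering rate
    par  -- pitch adjusting rate
    """
    rng = random.Random(seed)
    # Initialize harmony memory with random solutions and their scores.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                # Memory consideration: reuse a value from a stored harmony.
                x = memory[rng.randrange(hms)][d]
                if rng.random() < par:
                    # Pitch adjustment: small perturbation within the bounds.
                    x += rng.uniform(-1.0, 1.0) * bandwidth * (hi - lo)
                    x = min(max(x, lo), hi)
            else:
                # Random selection from the full range.
                x = rng.uniform(lo, hi)
            new.append(x)
        score = objective(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if score < scores[worst]:
            # Replace the worst harmony in memory.
            memory[worst], scores[worst] = new, score
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Toy usage: tune two hypothetical hyperparameters (learning rate, momentum)
# against a synthetic quadratic loss with its optimum at (0.01, 0.9).
best, loss = harmony_search(
    lambda h: (h[0] - 0.01) ** 2 + (h[1] - 0.9) ** 2,
    bounds=[(0.0001, 0.1), (0.0, 1.0)])
```

In the paper's setting, `objective` would be the DBN's reconstruction error on a validation set and `bounds` the ranges of the DBN's hyperparameters; the variants compared in the paper modify how HMCR, PAR, and the bandwidth are chosen or adapted.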

FAPESP's process: 13/20387-7 - Hyperparameter optimization in deep learning architectures
Grantee:João Paulo Papa
Support Opportunities: Scholarships abroad - Research
FAPESP's process: 14/16250-9 - On the parameter optimization in machine learning techniques: advances and paradigms
Grantee:João Paulo Papa
Support Opportunities: Regular Research Grants