(Reference retrieved automatically from Web of Science through information on FAPESP grant and its corresponding number as mentioned in the publication by the authors.)

Handling dropout probability estimation in convolution neural networks using meta-heuristics

Author(s):
de Rosa, Gustavo H. [1] ; Papa, Joao P. [1] ; Yang, Xin-She [2]
Total Authors: 3
Affiliation:
[1] Sao Paulo State Univ, Dept Comp, BR-17033360 Bauru, SP - Brazil
[2] Middlesex Univ, Sch Sci & Technol, London NW4 4BT - England
Total Affiliations: 2
Document type: Journal article
Source: SOFT COMPUTING; v. 22, n. 18, SI, p. 6147-6156, SEP 2018.
Web of Science Citations: 0
Abstract

Deep learning-based approaches have been paramount in recent years, mainly due to their outstanding results in several application domains, ranging from face and object recognition to handwritten digit identification. Convolutional neural networks (CNNs) have attracted considerable attention since they model the brain's intrinsic and complex working mechanisms. However, one main shortcoming of such models is overfitting, which prevents the network from predicting unseen data effectively. In this paper, we address this problem by properly selecting a regularization parameter known as dropout in the context of CNNs using meta-heuristic-driven techniques. As far as we know, this is the first attempt to tackle this issue using this methodology. Additionally, we also take into account a default dropout parameter and a dropout-less CNN for comparison purposes. The results revealed that optimizing dropout-based CNNs is worthwhile, mainly due to the ease of finding suitable dropout probability values, without needing to set new parameters empirically. (AU)
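The idea described in the abstract can be sketched in a few lines: treat the dropout probability as a single hyperparameter and let a meta-heuristic search for the value that minimizes validation error. The sketch below is a simplified illustration, not the paper's method: `validation_error` is a hypothetical toy surrogate (a real run would train a CNN with the given dropout rate), and plain random search stands in for the swarm-based optimizers the authors employ.

```python
import random

def validation_error(p):
    # Hypothetical surrogate for "train a CNN with dropout probability p
    # and return its validation error"; here a toy curve with its
    # minimum near p = 0.5 keeps the example self-contained.
    return (p - 0.5) ** 2 + 0.01

def optimize_dropout(n_trials=50, seed=0):
    # Minimal random-search meta-heuristic over the dropout probability,
    # a stand-in for the population-based optimizers used in the paper.
    rng = random.Random(seed)
    best_p, best_err = None, float("inf")
    for _ in range(n_trials):
        p = rng.uniform(0.0, 1.0)      # candidate dropout probability
        err = validation_error(p)
        if err < best_err:             # keep the best candidate seen
            best_p, best_err = p, err
    return best_p, best_err

best_p, best_err = optimize_dropout()
```

The same loop structure applies regardless of which meta-heuristic proposes the candidates; only the rule generating the next `p` changes.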

FAPESP's process: 14/12236-1 - AnImaLS: Annotation of Images in Large Scale: what can machines and specialists learn from interaction?
Grantee: Alexandre Xavier Falcão
Support type: Research Projects - Thematic Grants
FAPESP's process: 14/16250-9 - On the parameter optimization in machine learning techniques: advances and paradigms
Grantee: João Paulo Papa
Support type: Regular Research Grants
FAPESP's process: 15/25739-4 - On the study of semantics in deep learning models
Grantee: Gustavo Henrique de Rosa
Support type: Scholarships in Brazil - Master