Meta-Heuristic Optimization-based Regularization for Deep Learning Architectures

Author(s):
Gustavo Henrique de Rosa
Total Authors: 1
Document type: Master's Dissertation
Press: São José do Rio Preto. 2018-08-30.
Institution: Universidade Estadual Paulista (Unesp). Instituto de Biociências Letras e Ciências Exatas. São José do Rio Preto
Advisor: João Paulo Papa
Abstract

Deep learning architectures have been extensively studied in recent years, mainly due to their discriminative power in many crucial problems in computer vision. However, one problem related to these models concerns their number of parameters, which can easily reach hundreds of thousands. Another drawback is the need for large datasets for training purposes, as well as a high probability of overfitting, mainly because of their complex architecture. Recently, a simple idea of disconnecting neurons or connections from a network, known as Dropout or DropConnect, respectively, has shown to be a promising solution to this problem. Nevertheless, it still requires an adequate parameter setting. This project aims to identify possible solutions to the depicted problem by means of meta-heuristic optimization, trying to find the most suitable drop rate. Several machine learning approaches, such as Restricted Boltzmann Machines, Deep Boltzmann Machines, Deep Belief Networks, and Convolutional Neural Networks, and several meta-heuristic techniques, such as Particle Swarm Optimization, Bat Algorithm, Firefly Algorithm, and Cuckoo Search, were employed in this context. The presented results show a possible trend in using meta-heuristic optimization to find suitable parameters in a wide range of applications, helping the learning process and improving the network's architecture. (AU)
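The core idea described in the abstract — using a meta-heuristic such as Particle Swarm Optimization to search for a suitable drop rate — can be sketched as follows. This is a minimal illustration, not the dissertation's actual setup: the `validation_loss` function is a toy quadratic surrogate standing in for the validation loss of a network trained with a given dropout rate, and all hyperparameter values are assumptions chosen for the example.

```python
import random

# Toy "validation loss" as a function of the dropout rate p: a smooth
# bowl with its minimum at p = 0.5. In practice this would be the loss
# of a network trained with dropout rate p, evaluated on a held-out set.
def validation_loss(p):
    return (p - 0.5) ** 2

def pso_dropout_rate(n_particles=10, n_iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Particle Swarm Optimization over the dropout rate p in [0, 1]."""
    rng = random.Random(seed)
    pos = [rng.random() for _ in range(n_particles)]   # candidate drop rates
    vel = [0.0] * n_particles
    pbest = pos[:]                                     # each particle's best position
    pbest_loss = [validation_loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_loss[i])
    gbest, gbest_loss = pbest[g], pbest_loss[g]        # swarm's best position

    for _ in range(n_iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # Standard PSO velocity update: inertia + cognitive + social terms.
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))  # keep p in [0, 1]
            loss = validation_loss(pos[i])
            if loss < pbest_loss[i]:
                pbest[i], pbest_loss[i] = pos[i], loss
                if loss < gbest_loss:
                    gbest, gbest_loss = pos[i], loss
    return gbest

best_p = pso_dropout_rate()
```

With the toy objective above, the swarm converges close to the optimum at p = 0.5; the same loop applies unchanged to any of the other meta-heuristics mentioned by swapping the position/velocity update rule.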

FAPESP's process: 15/25739-4 - On the Study of Semantics in Deep Learning Models
Grantee: Gustavo Henrique de Rosa
Support Opportunities: Scholarships in Brazil - Master