Deep Boltzmann Machines Using Adaptive Temperatures

Author(s):
Passos Junior, Leandro A. ; Costa, Kelton A. P. ; Papa, Joao P. ; Felsberg, M ; Heyden, A ; Kruger, N
Total Authors: 6
Document type: Journal article
Source: COMPUTER ANALYSIS OF IMAGES AND PATTERNS; v. 10424, 12 pp., 2017-01-01.
Abstract

Deep learning has recently been considered a hallmark in a number of applications. Among these techniques, the ones based on Restricted Boltzmann Machines have attracted considerable attention, since they are energy-driven models composed of latent variables that aim at learning the probability distribution of the input data. In a nutshell, training such models amounts to minimizing the energy of each training sample in order to increase its probability. This optimization process therefore needs to be regularized in order to reach the best trade-off between exploitation and exploration. In this work, we propose an adaptive regularization approach based on temperatures, and we show its advantages considering Deep Belief Networks (DBNs) and Deep Boltzmann Machines (DBMs). The proposed approach is evaluated in the context of binary image reconstruction, outperforming temperature-fixed DBNs and DBMs. (AU)
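To make the temperature idea concrete, the sketch below shows a minimal Bernoulli RBM whose hidden- and visible-unit conditionals are scaled by a temperature T, trained with one step of contrastive divergence. The class name `TemperatureRBM`, the hyperparameters, and the cooling schedule at the end are illustrative assumptions for this sketch; they are not the adaptive rule proposed in the paper, which is described in the full text.

```python
# Minimal sketch (not the authors' code) of a temperature-scaled Bernoulli RBM.
# The "adaptive" schedule shown here (start hot for exploration, cool toward
# T = 1 for exploitation) is a placeholder assumption, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

class TemperatureRBM:
    def __init__(self, n_visible, n_hidden, T=1.0):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias
        self.T = T                     # temperature

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def p_h_given_v(self, v):
        # The temperature divides the energy terms, flattening (T > 1) or
        # sharpening (T < 1) the conditional distribution of the hidden units.
        return self._sigmoid((v @ self.W + self.c) / self.T)

    def p_v_given_h(self, h):
        return self._sigmoid((h @ self.W.T + self.b) / self.T)

    def contrastive_divergence(self, v0, lr=0.05):
        # One CD-1 step: positive phase, one Gibbs sample, negative phase.
        ph0 = self.p_h_given_v(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        pv1 = self.p_v_given_h(h0)
        ph1 = self.p_h_given_v(pv1)
        batch = len(v0)
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / batch
        self.b += lr * (v0 - pv1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)

# Usage on random binary "images"; only the structure matters here.
data = (rng.random((256, 64)) < 0.3).astype(float)
rbm = TemperatureRBM(n_visible=64, n_hidden=32, T=2.0)
for epoch in range(20):
    for minibatch in np.array_split(data, 8):
        rbm.contrastive_divergence(minibatch)
    rbm.T = max(1.0, rbm.T * 0.9)  # placeholder cooling rule
```

In a DBN or DBM, each stacked RBM layer would carry its own temperature, which is the setting in which the paper reports its comparison against fixed-temperature models.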

FAPESP's process: 14/16250-9 - On the parameter optimization in machine learning techniques: advances and paradigms
Grantee: João Paulo Papa
Support Opportunities: Regular Research Grants
FAPESP's process: 14/12236-1 - AnImaLS: Annotation of Images in Large Scale: what can machines and specialists learn from interaction?
Grantee: Alexandre Xavier Falcão
Support Opportunities: Research Projects - Thematic Grants