
Hyperparameter fine-tuning in long short-term memory nets using genetic programming

Grant number: 18/10100-6
Support Opportunities: Scholarships in Brazil - Scientific Initiation
Effective date (Start): September 01, 2018
Effective date (End): November 30, 2019
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computing Methodologies and Techniques
Principal Investigator: João Paulo Papa
Grantee: Vicente Coelho Lobo Neto
Host Institution: Faculdade de Ciências (FC). Universidade Estadual Paulista (UNESP). Campus de Bauru. Bauru, SP, Brazil
Associated research grant: 14/12236-1 - AnImaLS: Annotation of Images in Large Scale: what can machines and specialists learn from interaction?, AP.TEM

Abstract

Machine learning techniques, especially those based on deep learning, have been applied to a wide range of problems. However, these techniques expose several hyperparameters that must be tuned individually for each dataset, and this adjustment is essential for good performance. This research project aims to introduce a Genetic Programming (GP) approach for fine-tuning the hyperparameters of recurrent neural networks. More specifically, the word representation, number of layers, number of hidden units, and size of the word batches processed by LSTMs (Long Short-Term Memory networks) will be optimized, and the results will be validated on text datasets. The task under study is the recognition of the grammatical class of words, known as Part-of-Speech (POS) tagging. For comparison purposes, widely known public corpora such as the Brown corpus will be used. In addition, the project includes a research internship abroad through the FAPESP/BEPE program.
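To illustrate the kind of search the abstract describes, the sketch below evolves LSTM hyperparameter configurations (word representation, number of layers, hidden units, batch size) with an evolutionary loop. It is a minimal genetic-algorithm-style sketch rather than the project's tree-based GP method; the search-space ranges, the `train_and_evaluate_tagger` fitness routine, and all population settings are illustrative assumptions, not details taken from the project.

```python
import random

# Hypothetical search space for the LSTM POS tagger; ranges and embedding
# options are illustrative assumptions, not the project's actual choices.
SEARCH_SPACE = {
    "embedding":    ["word2vec", "glove", "random"],   # word representation
    "num_layers":   [1, 2, 3],                         # stacked LSTM layers
    "hidden_units": [64, 128, 256, 512],               # hidden state size
    "batch_size":   [16, 32, 64, 128],                 # words per batch
}

def train_and_evaluate_tagger(config):
    """Placeholder for the real routine that would train an LSTM POS tagger
    (e.g., on the Brown corpus) and return validation accuracy. Here it just
    returns a dummy score so the search skeleton runs end to end."""
    return random.random()

def random_individual():
    """Sample one hyperparameter configuration uniformly from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(parent_a, parent_b):
    """Uniform crossover: each gene is inherited from one of the two parents."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in SEARCH_SPACE}

def mutate(individual, rate=0.2):
    """Resample each gene with probability `rate`."""
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in individual.items()}

def evolve(pop_size=12, generations=10, elite=2):
    """Evolve a population of configurations, keeping the best ones (elitism)."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=train_and_evaluate_tagger, reverse=True)
        parents = scored[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - elite)]
        population = scored[:elite] + children
    return max(population, key=train_and_evaluate_tagger)

if __name__ == "__main__":
    print("Best configuration found:", evolve())
```

In practice the fitness call would dominate the runtime, since each evaluation trains a tagger; the evolutionary loop itself is cheap and only decides which configurations are worth training next.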

