An Autoencoder model for dealing with missing and noisy data

Grant number: 23/13688-2
Support Opportunities: Scholarships abroad - Research Internship - Master's degree
Effective date (Start): March 15, 2024
Effective date (End): September 14, 2024
Field of knowledge: Physical Sciences and Mathematics - Computer Science - Computing Methodologies and Techniques
Principal Investigator: Ana Carolina Lorena
Grantee: Arthur Dantas Mangussi
Supervisor: Pedro Henriques Abreu
Host Institution: Divisão de Ciência da Computação (IEC). Instituto Tecnológico de Aeronáutica (ITA). Ministério da Defesa (Brasil). São José dos Campos, SP, Brazil
Research place: Universidade de Coimbra (UC), Portugal
Associated to the scholarship: 22/10553-6 - An unified approach for dealing with missing and noise data, BP.MS

Abstract

Various issues can degrade data quality in Machine Learning. Among them is the quality of the input features, which can be impaired by the presence of noise and missing data. Noise can be introduced into data at several stages of collection, storage, and transmission; in supervised problems, it can affect both the labels and the predictive input features. Missing values are also a concern, since the absent information may be essential for characterizing a data item and must be accounted for. This project will investigate these two problems and their interplay, including corrective strategies for dealing with them. The objective is to experiment with different Autoencoder algorithms proposed by the Portuguese research group for imputing missing data, which are currently state-of-the-art in this area, and to extend these algorithms to also identify and potentially correct noisy data.
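The imputation idea described above can be illustrated with a minimal NumPy sketch; this is a simplified, hypothetical stand-in, not the research group's actual algorithms, whose architectures and hyperparameters are not specified here. A one-hidden-layer autoencoder is trained to reconstruct only the observed entries, and its reconstructions iteratively refine the imputed values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples of 5 correlated features (rank-2 linear structure plus noise).
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5))
X += 0.05 * rng.normal(size=X.shape)

# Simulate ~20% missing-completely-at-random entries; initialize them with column means.
mask = rng.random(X.shape) < 0.2
col_means = np.nanmean(np.where(mask, np.nan, X), axis=0)
X_obs = X.copy()
X_obs[mask] = col_means[np.where(mask)[1]]

# One-hidden-layer autoencoder: tanh encoder, linear decoder.
n_in, n_hid, lr = X.shape[1], 3, 0.01
W1 = 0.1 * rng.normal(size=(n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = 0.1 * rng.normal(size=(n_hid, n_in)); b2 = np.zeros(n_in)

for _ in range(2000):
    H = np.tanh(X_obs @ W1 + b1)          # encode
    R = H @ W2 + b2                       # decode (reconstruction)
    err = np.where(mask, 0.0, R - X)      # loss gradient only on observed entries
    gW2 = H.T @ err; gb2 = err.sum(0)     # backpropagate
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X_obs.T @ dH; gb1 = dH.sum(0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= lr * g / len(X)
    X_obs[mask] = R[mask]                 # refine imputations with current reconstruction

mae_mean = np.abs(col_means[np.where(mask)[1]] - X[mask]).mean()
mae_ae = np.abs(X_obs[mask] - X[mask]).mean()
print(f"mean-imputation MAE: {mae_mean:.3f}, autoencoder MAE: {mae_ae:.3f}")
```

On data with correlated features like this toy example, the autoencoder exploits the inter-feature structure that mean imputation ignores, so its imputation error is lower; extending the same reconstruction machinery to flag observed entries with large residuals is one natural route to the noise detection mentioned in the project.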

