Training Deep Networks from Zero to Hero: avoiding pitfalls and going beyond

Author(s):
Ponti, Moacir A.; dos Santos, Fernando P.; Ribeiro, Leo S. F.; Cavallari, Gabriel B.
Total Authors: 4
Document type: Conference paper
Source: 2021 34th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI 2021), 8 pp., 2021.
Abstract

Training deep neural networks can be challenging on real-world data. Using models as black boxes, even with transfer learning, can result in poor generalization or inconclusive results when working with small datasets or specific applications. This tutorial covers both the basic steps and more recent options for improving models, in particular, but not restricted to, supervised learning. It can be particularly useful for datasets that are not as well prepared as those used in challenges, and also under scarce annotation and/or small data. We describe basic procedures such as data preparation, optimization, and transfer learning, as well as recent architectural choices such as transformer modules, alternative convolutional layers, activation functions, and width/depth trade-offs, together with training procedures including curriculum, contrastive, and self-supervised learning. (AU)
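To make the transfer-learning procedure mentioned in the abstract concrete, the sketch below shows one common recipe for small datasets: reuse a pretrained backbone, freeze it, and train only a new classification head. This is a minimal illustration, not code from the tutorial; it assumes PyTorch/torchvision, and NUM_CLASSES and train_loader are placeholder names.

# Minimal transfer-learning sketch (illustrative, not from the paper)
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 10  # hypothetical number of target classes

# Load an ImageNet-pretrained backbone and freeze its weights
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False

# Replace the classification head with a new, trainable layer
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Optimize only the new head
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_one_epoch(train_loader):
    # train_loader: any DataLoader yielding (image batch, label batch)
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Unfreezing deeper layers with a smaller learning rate, or swapping the head for contrastive or self-supervised objectives, follows the same pattern and is among the options the tutorial surveys.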

FAPESP's process: 19/07316-0 - Singularity theory and its applications to differential geometry, differential equations and computer vision
Grantee: Farid Tari
Support Opportunities: Research Projects - Thematic Grants
FAPESP's process: 17/22366-8 - Generative networks and feature learning for cross domain visual search
Grantee: Leo Sampaio Ferraz Ribeiro
Support Opportunities: Scholarships in Brazil - Doctorate (Direct)
FAPESP's process: 19/02033-0 - A study of image representations from multiple domains using unsupervised and semi-supervised deep learning
Grantee: Gabriel Biscaro Cavallari
Support Opportunities: Scholarships in Brazil - Master