Memory-efficient DRASiW Models

Author(s): Napoli, Otavio Oliveira; de Almeida, Ana Maria; Borin, Edson; Breternitz Jr, Mauricio
Total authors: 4
Document type: Scientific article
Source: Neurocomputing; v. 610, 12 pp., 2024-09-05.
Abstract

Weightless Neural Networks (WNN) are well suited to Federated Learning due to their robustness and computational efficiency. These scenarios require models with a small memory footprint and the ability to aggregate knowledge from multiple models. In this work, we demonstrate the effectiveness of using Bloom filter variations to implement DRASiW models (an adaptation of WNN that records both the presence and frequency of patterns) with minimized memory usage. Across various datasets, DRASiW models show competitive performance compared to models such as Random Forest, k-Nearest Neighbors, Multi-layer Perceptron, and Support Vector Machines, with an acceptable space trade-off. Furthermore, our findings indicate that Bloom filter variations, such as the Count-Min Sketch, can reduce the memory footprint of DRASiW models by up to 27% while maintaining performance and enabling distributed and federated learning strategies.
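Since the record includes only the abstract, the following is a minimal, hypothetical Python sketch of the idea described above: backing each DRASiW RAM node with a Count-Min Sketch, so that pattern frequencies are recorded in a few shared rows of counters instead of one counter per possible address. All names (CountMinSketch, DrasiwDiscriminator, train, score) and the parameter defaults are illustrative assumptions, not the authors' implementation or API.

```python
import hashlib

class CountMinSketch:
    """Fixed-size frequency sketch: `depth` hash rows of `width` counters."""

    def __init__(self, width: int, depth: int):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row: int, key: bytes) -> int:
        # One independent hash per row, derived by salting the key with the row id.
        digest = hashlib.blake2b(key, salt=row.to_bytes(8, "little")).digest()
        return int.from_bytes(digest[:8], "little") % self.width

    def add(self, key: bytes) -> None:
        for row in range(self.depth):
            self.table[row][self._index(row, key)] += 1

    def count(self, key: bytes) -> int:
        # The minimum over rows is an upper bound on the true frequency.
        return min(self.table[row][self._index(row, key)]
                   for row in range(self.depth))


class DrasiwDiscriminator:
    """One class discriminator: a Count-Min Sketch per RAM node."""

    def __init__(self, input_bits: int, tuple_size: int,
                 width: int = 256, depth: int = 3):
        assert input_bits % tuple_size == 0
        self.tuple_size = tuple_size
        self.rams = [CountMinSketch(width, depth)
                     for _ in range(input_bits // tuple_size)]

    def _addresses(self, bits):
        # `bits` is a sequence of 0/1 ints; each tuple_size chunk addresses one RAM.
        for i, ram in enumerate(self.rams):
            chunk = bits[i * self.tuple_size:(i + 1) * self.tuple_size]
            yield ram, bytes(chunk)

    def train(self, bits) -> None:
        for ram, addr in self._addresses(bits):
            ram.add(addr)

    def score(self, bits, bleaching: int = 1) -> int:
        # DRASiW-style response with bleaching: count the RAM nodes whose
        # stored frequency for this address reaches the threshold.
        return sum(1 for ram, addr in self._addresses(bits)
                   if ram.count(addr) >= bleaching)
```

Because a Count-Min Sketch only overestimates frequencies, a bleaching threshold applied to sketch counts can admit patterns an exact-counter DRASiW would reject, but never the reverse; the memory saving comes from width × depth counters per RAM being much smaller than the 2^tuple_size counters of an exact table. Two sketches built with the same hash configuration can also be merged by element-wise addition of their tables, which is what makes this representation attractive for the distributed and federated aggregation the abstract mentions.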

FAPESP Process: 13/08293-7 - CECC - Centro de Engenharia e Ciências Computacionais
Grantee: Munir Salomao Skaf
Support type: Research Grants - Research, Innovation and Dissemination Centers - CEPIDs