Memory-efficient DRASiW Models

Author(s):
Napoli, Otavio Oliveira ; de Almeida, Ana Maria ; Borin, Edson ; Breternitz Jr, Mauricio
Total Authors: 4
Document type: Journal article
Source: Neurocomputing, v. 610, 12 pp., 2024-09-05.
Abstract

Weightless Neural Networks (WNN) are ideal for Federated Learning due to their robustness and computational efficiency. These scenarios require models with a small memory footprint and the ability to aggregate knowledge from multiple models. In this work, we demonstrate the effectiveness of using Bloom filter variations to implement DRASiW models (an adaptation of WNN that records both the presence and frequency of patterns) with minimized memory usage. Across various datasets, DRASiW models show competitive performance compared to models like Random Forest, k-Nearest Neighbors, Multi-layer Perceptron, and Support Vector Machines, with an acceptable space trade-off. Furthermore, our findings indicate that Bloom filter variations, such as Count Min Sketch, can reduce the memory footprint of DRASiW models by up to 27% while maintaining performance and enabling distributed and federated learning strategies. (AU)
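To illustrate the idea described in the abstract, the sketch below shows how a DRASiW-style RAM discriminator might store pattern frequencies in a Count-Min Sketch rather than a dense memory. This is not the authors' implementation; the class names, tuple mapping, and bleaching logic are hypothetical simplifications assumed for illustration.

```python
# A minimal sketch (not the paper's code) of a DRASiW-style discriminator
# whose RAM nodes record pattern frequencies in a Count-Min Sketch.
# All names and parameters here are illustrative assumptions.

import hashlib
import numpy as np


class CountMinSketch:
    """Approximate frequency counter: a depth x width table, one hash per row."""

    def __init__(self, width=256, depth=4):
        self.width = width
        self.depth = depth
        self.table = np.zeros((depth, width), dtype=np.uint32)

    def _indices(self, key):
        for row in range(self.depth):
            digest = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8).digest()
            yield row, int.from_bytes(digest, "little") % self.width

    def add(self, key):
        for row, col in self._indices(key):
            self.table[row, col] += 1

    def count(self, key):
        # Count-Min returns an overestimate; the minimum across rows is tightest.
        return min(self.table[row, col] for row, col in self._indices(key))


class DRASiWDiscriminator:
    """One discriminator: each RAM node keeps address frequencies in a CMS."""

    def __init__(self, input_bits, tuple_size=4, **cms_kwargs):
        rng = np.random.default_rng(0)
        self.mapping = rng.permutation(input_bits)   # random input-to-tuple mapping
        self.tuple_size = tuple_size
        n_rams = input_bits // tuple_size
        self.rams = [CountMinSketch(**cms_kwargs) for _ in range(n_rams)]

    def _addresses(self, bits):
        shuffled = np.asarray(bits)[self.mapping]
        for i, ram in enumerate(self.rams):
            chunk = shuffled[i * self.tuple_size:(i + 1) * self.tuple_size]
            yield ram, "".join(map(str, chunk))

    def train(self, bits):
        # Record (and count) the address seen by each RAM node.
        for ram, addr in self._addresses(bits):
            ram.add(addr)

    def score(self, bits, bleaching=1):
        # DRASiW-style response: number of RAM nodes whose stored frequency
        # for this address reaches the bleaching threshold.
        return sum(ram.count(addr) >= bleaching
                   for ram, addr in self._addresses(bits))
```

Because each Count-Min Sketch has a fixed depth x width footprint regardless of how many distinct addresses are observed, and counter tables from different clients can be merged by element-wise addition, a structure along these lines is consistent with the memory savings and model-aggregation properties the abstract attributes to Bloom filter variations.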

FAPESP's process: 13/08293-7 - CCES - Center for Computational Engineering and Sciences
Grantee: Munir Salomao Skaf
Support Opportunities: Research Grants - Research, Innovation and Dissemination Centers - RIDC