Unrestricted Sequential Discrete Morphological Neural Networks

Author(s): Marcondes, Diego; Feldman, Mariana; Barrera, Junior
Total Authors: 3
Document type: Journal article
Source: Journal of Mathematical Imaging and Vision; v. 67, n. 4, 22 pp., 2025-08-01.
Abstract

There have been attempts to insert mathematical morphology (MM) operators into convolutional neural networks (CNN), and the most successful endeavor to date has been the morphological neural networks (MNN). Although MNN have outperformed CNN on some problems, they inherit their black-box nature. Furthermore, in the case of binary images, they are approximations that lose the Boolean lattice structure of MM operators and thus cannot represent a specific class of W-operators with desired properties. In a recent work, we proposed the discrete morphological neural networks (DMNN) for binary image transformation to represent specific classes of W-operators and estimate them via machine learning. We also proposed a stochastic lattice descent algorithm (SLDA) to learn the parameters of canonical discrete morphological neural networks (CDMNN), whose architecture is composed only of operators that can be decomposed as the supremum, infimum, and complement of erosions and dilations. In this paper, we propose an algorithm to learn unrestricted sequential DMNN (USDMNN) for image processing and binary classification, whose architecture is given by the composition of general W-operators. With an efficient implementation that leverages GPUs for matrix computations, we illustrate the algorithm on an image transformation example, learning the transition W-operator of Conway's Game of Life, and on classifying the handwritten digits of the MNIST dataset. The performance of the USDMNN on the MNIST dataset was compared with that of a CNN, and the USDMNN performed better when trained with small sample sizes. These examples illustrate the robustness of the method to noise and its advantages over CNN, namely the ability to learn from fewer samples and the interpretability of the results. (AU)
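To make the setting concrete, the sketch below is an illustrative assumption, not the authors' GPU implementation: it treats a binary W-operator as a lookup table over a sliding window and instantiates the table that reproduces the transition rule of Conway's Game of Life, the operator learned in the paper's image transformation example. The names apply_w_operator, WINDOW, and LIFE_TABLE are hypothetical.

```python
# Illustrative sketch: a binary W-operator as a Boolean lookup table on a
# 3x3 window, instantiated with Conway's Game of Life transition rule.
import numpy as np

def apply_w_operator(image, window, table):
    """Each output pixel is the value the lookup `table` assigns to the
    binary pattern seen through `window` (zero-padded at the border)."""
    h, w = image.shape
    padded = np.pad(image, 1, mode="constant")
    out = np.zeros_like(image)
    for i in range(h):
        for j in range(w):
            pattern = tuple(padded[i + 1 + di, j + 1 + dj] for di, dj in window)
            out[i, j] = table.get(pattern, 0)  # unseen patterns default to 0
    return out

# 3x3 window offsets; index 4 is the center pixel.
WINDOW = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)]

def life_rule(pattern):
    """Game of Life: a cell is alive next step iff it has 3 live neighbors,
    or it is alive and has exactly 2 live neighbors."""
    center = pattern[4]
    neighbors = sum(pattern) - center
    return int(neighbors == 3 or (center == 1 and neighbors == 2))

# Enumerate all 2^9 = 512 window patterns to build the explicit lookup table.
LIFE_TABLE = {}
for k in range(512):
    pattern = tuple((k >> b) & 1 for b in range(9))
    LIFE_TABLE[pattern] = life_rule(pattern)

# A glider on a small grid; one application of the operator advances it one step.
grid = np.zeros((6, 6), dtype=int)
grid[1, 2] = grid[2, 3] = grid[3, 1] = grid[3, 2] = grid[3, 3] = 1
print(apply_w_operator(grid, WINDOW, LIFE_TABLE))
```

In the paper's setting such a table would be estimated from input/output image pairs rather than written by hand; the point here is only that a W-operator locally defined on a finite window is fully specified by one Boolean value per window pattern, which is what makes the learned operator interpretable.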

FAPESP's process: 20/06950-4 - Center for Research and Development on Live Knowledge
Grantee: João Eduardo Ferreira
Support Opportunities: Research Grants - Problem-Oriented Research Centers in São Paulo
FAPESP's process: 22/06211-2 - Machine learning via learning spaces, from theory to practice: how the lack of data may be mitigated by high computational power
Grantee: Diego Ribeiro Marcondes
Support Opportunities: Scholarships in Brazil - Post-Doctoral
FAPESP's process: 14/50937-1 - INCT 2014: on the Internet of the Future
Grantee: Fabio Kon
Support Opportunities: Research Projects - Thematic Grants
FAPESP's process: 23/00256-7 - Prior information in neural networks
Grantee: Diego Ribeiro Marcondes
Support Opportunities: Scholarships abroad - Research Internship - Post-doctor