(Reference retrieved automatically from Web of Science based on the FAPESP grant number cited by the authors in the publication.)

Hypercomplex-valued recurrent correlation neural networks

Author(s):
Valle, Marcos Eduardo [1] ; Lobo, Rodolfo Anibal [1]
Total Authors: 2
Affiliation:
[1] Univ Estadual Campinas, Inst Math Stat & Sci Comp, Campinas - Brazil
Total Affiliations: 1
Document type: Journal article
Source: Neurocomputing; v. 432, p. 111-123, APR 7 2021.
Web of Science Citations: 0
Abstract

Recurrent correlation neural networks (RCNNs), introduced by Chiueh and Goodman as an improved version of the bipolar correlation-based Hopfield neural network, can be used to implement high-capacity associative memories. In this paper, we extend the bipolar RCNNs for processing hypercomplex-valued data. Precisely, we present the mathematical background for a broad class of hypercomplex-valued RCNNs. Then, we address the stability of the new hypercomplex-valued RCNNs using synchronous and asynchronous update modes. Examples with bipolar, complex, hyperbolic, quaternion, and octonion-valued RCNNs are given to illustrate the theoretical results. Finally, computational experiments confirm the potential application of hypercomplex-valued RCNNs as associative memories designed for the storage and recall of gray-scale images. (c) 2020 Elsevier B.V. All rights reserved.
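To illustrate the associative-memory recall that the abstract describes, below is a minimal sketch of a bipolar RCNN with an exponential excitation function (the ECAM-style network of Chiueh and Goodman, which the paper takes as its starting point). The function name `rcnn_recall` and the gain parameter `alpha` are hypothetical choices for this sketch, not identifiers from the paper; the update rule applies the excitation function to the correlations between the current state and each stored pattern, then takes the sign of the weighted sum of patterns.

```python
import numpy as np

def rcnn_recall(patterns, probe, alpha=1.0, max_iter=100):
    """Sketch of bipolar RCNN recall with exponential excitation (ECAM-style).

    patterns : sequence of p stored bipolar (+1/-1) vectors of length n
    probe    : possibly corrupted bipolar input vector of length n
    alpha    : excitation gain (hypothetical default, not from the paper)
    """
    U = np.asarray(patterns, dtype=float)   # shape (p, n)
    n = U.shape[1]
    x = np.asarray(probe, dtype=float)
    for _ in range(max_iter):
        # Excitation of each stored pattern: f(z) = exp(alpha * z / n),
        # where z = <u^i, x> is the correlation with the current state.
        w = np.exp(alpha * (U @ x) / n)
        # Synchronous update: sign of the excitation-weighted sum of patterns.
        y = np.sign(U.T @ w)
        y[y == 0] = 1                        # break ties toward +1
        if np.array_equal(y, x):             # fixed point reached
            break
        x = y
    return x
```

A quick usage example: store two bipolar patterns, flip one bit of the first, and the network should settle back to the stored pattern in a step or two.

```python
u1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
u2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
probe = u1.copy(); probe[-1] = 1            # corrupt one component
recovered = rcnn_recall([u1, u2], probe)    # converges to u1
```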

FAPESP's process: 19/02278-2 - Mathematical Morphology and Morphological Neural Networks for Multivalued Data
Grantee: Marcos Eduardo Ribeiro Do Valle Mesquita
Support Opportunities: Regular Research Grants