(Reference automatically retrieved from the Web of Science through the FAPESP funding acknowledgment and the corresponding grant number included in the publication by the authors.)

Hypercomplex-valued recurrent correlation neural networks

Author(s):
Valle, Marcos Eduardo [1] ; Lobo, Rodolfo Anibal [1]
Total number of authors: 2
Author affiliation(s):
[1] Univ Estadual Campinas, Inst Math Stat & Sci Comp, Campinas - Brazil
Total number of affiliations: 1
Document type: Scientific article
Source: Neurocomputing; v. 432, p. 111-123, APR 7, 2021.
Web of Science citations: 0
Abstract

Recurrent correlation neural networks (RCNNs), introduced by Chiueh and Goodman as an improved version of the bipolar correlation-based Hopfield neural network, can be used to implement high-capacity associative memories. In this paper, we extend the bipolar RCNNs for processing hypercomplex-valued data. Precisely, we present the mathematical background for a broad class of hypercomplex-valued RCNNs. Then, we address the stability of the new hypercomplex-valued RCNNs using synchronous and asynchronous update modes. Examples with bipolar, complex, hyperbolic, quaternion, and octonion-valued RCNNs are given to illustrate the theoretical results. Finally, computational experiments confirm the potential application of hypercomplex-valued RCNNs as associative memories designed for the storage and recall of gray-scale images. (c) 2020 Elsevier B.V. All rights reserved. (AU)
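For illustration, the sketch below shows the recall iteration of the classical bipolar RCNN that the paper generalizes, namely the exponential correlation associative memory of Chiueh and Goodman, with synchronous update x(t+1) = sgn(sum_k exp(alpha * <u^k, x(t)> / n) u^k). This is a minimal NumPy sketch for intuition only; the function and parameter names are illustrative and are not taken from the paper or its code.

import numpy as np

def ecam_recall(patterns, x0, alpha=1.0, max_iter=100):
    """Bipolar exponential RCNN recall with synchronous updates.

    patterns: (p, n) array of stored bipolar (+1/-1) patterns.
    x0:       (n,) bipolar probe vector (possibly corrupted).
    Illustrative sketch, not the authors' implementation.
    """
    U = np.asarray(patterns, dtype=float)
    x = np.asarray(x0, dtype=float).copy()
    n = U.shape[1]
    for _ in range(max_iter):
        # Normalized correlation of the current state with each stored pattern.
        w = U @ x / n
        # Exponential excitation function applied to the correlations.
        a = np.exp(alpha * w)
        # Weighted recombination of the stored patterns, then bipolar thresholding.
        x_new = np.sign(U.T @ a)
        x_new[x_new == 0] = 1.0  # break ties toward +1
        if np.array_equal(x_new, x):
            break  # reached a fixed point
        x = x_new
    return x

Starting from a corrupted probe, repeated application of this map typically converges to the stored pattern most correlated with the probe, which is the associative-memory behavior the paper extends to complex, hyperbolic, quaternion, and octonion values.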

FAPESP Process: 19/02278-2 - Mathematical Morphology and Morphological Neural Networks for Multi-valued Data
Grantee: Marcos Eduardo Ribeiro Do Valle Mesquita
Support type: Research Grants - Regular