Perceptive Functions and Memory in a Neural Network Model

Sumit Kumar, Tanya Sharma, Vishal Bhalla

Abstract


This paper describes a new class of distributed memory networks termed R-nets. These networks are sparsely connected and closely related to Hebbian networks. The underlying model of associative memory in a small region of the human brain departs from convention by relying on disinhibition of links between excitatory neurons rather than on long-term potentiation of excitatory projections. Such a model may offer advantages over traditional neural network models in both information storage capacity and biological plausibility. R-nets use simple binary neurons and form links between excitatory and inhibitory neurons. The paper also shows how an associative memory of this kind can be implemented to store sequential patterns and to support higher perceptive (cognitive) functions.
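As an illustration only, the following Python sketch gives a minimal, hypothetical realisation of the idea outlined above: binary neurons, fixed sparse connectivity, storage by disinhibiting links between co-active neurons, and threshold-based recall. It is not the authors' R-net algorithm; the names and parameters (N, P_CONN, THRESHOLD, store, recall) are assumptions introduced for the example.

import numpy as np

rng = np.random.default_rng(0)

N = 200          # number of binary excitatory neurons (assumed)
P_CONN = 0.1     # sparse connection probability (assumed)
THRESHOLD = 0.5  # fraction of a neuron's open links that must be active to fire (assumed)

# Fixed sparse wiring between excitatory neurons; every link starts "inhibited".
connected = rng.random((N, N)) < P_CONN
np.fill_diagonal(connected, False)
gate = np.zeros((N, N), dtype=bool)   # True = link disinhibited (usable for recall)

def store(pattern):
    # Storage: disinhibit existing links between pairs of co-active neurons,
    # instead of strengthening weights by long-term potentiation.
    active = pattern.astype(bool)
    gate[np.outer(active, active) & connected] = True

def recall(cue, steps=3):
    # Recall: repeatedly activate neurons that receive enough input through
    # disinhibited links; the cue neurons are kept clamped on.
    state = cue.astype(bool)
    open_links = (gate & connected).astype(int)
    fan_in = open_links.sum(axis=1)
    for _ in range(steps):
        drive = open_links @ state.astype(int)
        state = (drive >= np.maximum(1, THRESHOLD * fan_in)) | cue.astype(bool)
    return state.astype(int)

# Usage: store one sparse pattern, then recall it from half of its active neurons.
pattern = (rng.random(N) < 0.2).astype(int)
store(pattern)
cue = pattern.copy()
on = np.flatnonzero(cue)
cue[rng.choice(on, size=len(on) // 2, replace=False)] = 0
recalled = recall(cue)
print("recovered", int((recalled & pattern).sum()), "of", int(pattern.sum()), "active neurons")

The binary gate matrix here is a Willshaw-style simplification chosen for brevity; in the published R-net model, excitation is routed through inhibitory interneurons, which this sketch collapses into a single boolean gate per excitatory link.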

Keywords


Cognition, Conditioning, Neural Network, R-nets, Hebbian Networks

