Bidirectional Associative Memory (BAM) Network
A Bidirectional Associative Memory (BAM) Network is a two-layer recurrent neural network that stores hetero-associative memory connections, i.e. pattern pairs that can be recalled in either direction.
- Example(s):
  - an Adaptive BAM Network, in which the connection matrix M itself changes through real-time unsupervised learning (Kosko, 1988).
- Counter-Example(s):
  - a Hopfield Network, which is auto-associative and returns patterns of the same size as the input.
- See: Memory, Recurrent Neural Network, Bart Kosko, Hopfield Network, Association (Psychology), Adaptive Resonance Theory.
References
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Bidirectional_associative_memory Retrieved:2018-3-4.
- Bidirectional associative memory (BAM) is a type of recurrent neural network. BAM was introduced by Bart Kosko in 1988.[1] There are two types of associative memory, auto-associative and hetero-associative. BAM is hetero-associative, meaning given a pattern it can return another pattern which is potentially of a different size. It is similar to the Hopfield network in that they are both forms of associative memory. However, Hopfield nets return patterns of the same size.
- ↑ Kosko, B. (1988). "Bidirectional Associative Memories" (PDF). IEEE Transactions on Systems, Man, and Cybernetics. 18 (1).
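The hetero-associative recall described above can be sketched in a few lines. The following is a minimal illustration, not Kosko's exact formulation: it assumes bipolar (±1) pattern pairs, stores them as a sum of outer products, and recalls by passing activity back and forth between the two layers until a stable pair is reached. One simplification to note: a zero net input is mapped to +1 here, whereas Kosko's update keeps the neuron's previous state on a tie.

```python
import numpy as np

def train_bam(pairs):
    """Build the BAM weight matrix as the sum of outer products
    of bipolar (+1/-1) pattern pairs (A_i, B_i)."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    M = np.zeros((n, m))
    for a, b in pairs:
        M += np.outer(a, b)
    return M

def recall(M, a):
    """Recall the pair associated with `a` by passing activity
    forward through M and backward through M.T until stable."""
    sign = lambda v: np.where(v >= 0, 1, -1)  # tie at 0 -> +1 (simplification)
    b = sign(a @ M)
    while True:
        a_new = sign(b @ M.T)
        b_new = sign(a_new @ M)
        if np.array_equal(a_new, a) and np.array_equal(b_new, b):
            return a_new, b_new
        a, b = a_new, b_new

# Associate two 6-bit patterns with two 4-bit patterns.
pairs = [
    (np.array([1, -1, 1, -1, 1, -1]), np.array([1, 1, -1, -1])),
    (np.array([1, 1, 1, -1, -1, -1]), np.array([1, -1, 1, -1])),
]
M = train_bam(pairs)
print(recall(M, pairs[0][0])[1])  # recovers the associated 4-bit pattern
```

Because the two layers may have different sizes (here 6 and 4 units), the memory is hetero-associative; a Hopfield network, by contrast, returns a pattern of the same size as its input.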
2013
- (Grossberg, 2013) ⇒ Stephen Grossberg (2013). "Adaptive Bidirectional Associative Memory". In: Scholarpedia, 8(2):1888. doi:10.4249/scholarpedia.1888
- QUOTE: Kosko (1987, 1988) adapted the Cohen-Grossberg model and Liapunov function (Cohen and Grossberg, 1983), which proved global convergence of STM, to define a system that combines STM and LTM and which also globally converges to a limit. The main trick was to observe how the symmetric connections in the Cohen-Grossberg equation (32) could be used to define symmetric LTM traces interacting reciprocally between two processing levels. An Additive Model BAM system is, accordingly, defined by:
[math]\displaystyle{ \frac{d}{dt} x_i = -x_i + \sum_k f(y_k) z_{ki} + I_i \quad (37) }[/math]
and
[math]\displaystyle{ \frac{d}{dt} y_j = -y_j + \sum_m f(x_m) z_{mj} + J_j \quad (38) }[/math].
A Shunting Model BAM can also be analogously defined. One type of learning law to which BAM methods apply is the passive decay associative law that was introduced in Grossberg (1967, 1968b, 1968c); see Fig.1 and Fig.3:
[math]\displaystyle{ \frac{d}{dt} z_{ij} = -z_{ij} + f(x_i) f(y_j) \quad (39) }[/math]
Kosko calls the equation in (39) the signal Hebb law, although it does not obey the property of monotonely increasing learned weights that Hebb (1949) ascribed to his law. Kosko (1988) wrote that: "When the BAM neurons are activated, the network quickly evolves to a stable state of two-pattern reverberation, or resonance". Indeed, another inspiration for BAM was Adaptive Resonance Theory, or ART.
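To make the quoted dynamics concrete, the sketch below integrates equations (37)-(39) with a simple forward-Euler step. The step size, the sigmoid choice for the signal function [math]\displaystyle{ f }[/math], and the random inputs are illustrative assumptions that the quoted equations leave open; the code also exploits the symmetry of the LTM traces ([math]\displaystyle{ z_{ki} = z_{ik} }[/math]) noted in the quote, so a single matrix Z serves both directions.

```python
import numpy as np

def f(u):
    # Illustrative sigmoid signal function; equations (37)-(39)
    # do not fix a particular form for f.
    return 1.0 / (1.0 + np.exp(-u))

def additive_bam_step(x, y, Z, I, J, dt=0.01):
    """One forward-Euler step of the Additive Model BAM.
    x, y: STM activities of the two fields; Z: LTM traces,
    with Z used for x->y and Z.T for y->x (symmetric LTM)."""
    dx = -x + f(y) @ Z.T + I            # eq. (37)
    dy = -y + f(x) @ Z + J              # eq. (38)
    dZ = -Z + np.outer(f(x), f(y))      # eq. (39), passive decay associative law
    return x + dt * dx, y + dt * dy, Z + dt * dZ

# Illustrative run: hold inputs I, J fixed and iterate toward a limit.
rng = np.random.default_rng(0)
n, m = 5, 3
x, y = np.zeros(n), np.zeros(m)
Z = rng.normal(scale=0.1, size=(n, m))
I, J = rng.normal(size=n), rng.normal(size=m)
for _ in range(5000):
    x, y, Z = additive_bam_step(x, y, Z, I, J)
print(np.round(x, 3), np.round(y, 3))   # STM activities near equilibrium
```

With a bounded signal function and the passive decay terms, the trajectories stay bounded, consistent with the global convergence result the quote attributes to the Cohen-Grossberg Liapunov function.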
1988
- (Kosko, 1988) ⇒ Bart Kosko (1988). "Bidirectional associative memories". IEEE Transactions on Systems, Man, and Cybernetics, 18(1), 49-60. doi:10.1109/21.87054
- QUOTE: The bidirectional associative memory (BAM) is the minimal two-layer nonlinear feedback network. Information passes forward from one neuron field to the other by passing through the connection matrix [math]\displaystyle{ M }[/math]. Information passes backward through the matrix transpose [math]\displaystyle{ M^T }[/math]. All other two-layer networks require more information in the form of connections [math]\displaystyle{ N }[/math] different from [math]\displaystyle{ M^T }[/math]. The underlying mathematics are closely related to the properties of adjoint operators in function spaces, in particular how quadratic forms are essentially linearized by real matrices and their adjoints (transposes).
Since every matrix [math]\displaystyle{ M }[/math] is bidirectionally stable, we suspect that gradual changes due to learning in [math]\displaystyle{ M }[/math] will result in stability. We show that this is so quite naturally for real-time unsupervised learning. This extends Lyapunov convergence of neural networks for the first time to learning.
The neural network interpretation of a BAM is a two-layer hierarchy of symmetrically connected neurons. When the neurons are activated, the network quickly evolves to a stable state of two-pattern reverberation. The stable reverberation corresponds to a system energy local minimum. In the learning or adaptive BAM, the stable reverberation of a pattern [math]\displaystyle{ (A_i, B_i) }[/math] across the two fields of neurons seeps pattern information into the long-term memory connections M, allowing input associations [math]\displaystyle{ (A_i, B_i) }[/math] to dig their own energy wells in which to reverberate.
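Kosko's energy for a bipolar state pair is [math]\displaystyle{ E(A, B) = -A M B^T }[/math]; each forward or backward threshold update can only lower it, which is the "energy well" picture in the quote. A minimal self-contained sketch (the stored pair and the corruption are illustrative choices):

```python
import numpy as np

# Store one association, then show that bouncing a corrupted input
# between the layers lowers the energy E(a, b) = -a M b^T until the
# pair settles into its energy well (a stable two-pattern reverberation).
a_stored = np.array([1, -1, 1, -1, 1, -1])
b_stored = np.array([1, 1, -1, -1])
M = np.outer(a_stored, b_stored)

def energy(M, a, b):
    return -(a @ M @ b)

sign = lambda v: np.where(v >= 0, 1, -1)

a = a_stored.copy()
a[:2] = -a[:2]                  # corrupt two bits of the stored pattern
b = sign(a @ M)                 # forward pass through M
print("E after forward pass: ", energy(M, a, b))   # -8
a = sign(b @ M.T)               # backward pass through M^T
print("E after backward pass:", energy(M, a, b))   # -24, strictly lower
```

The recovered pair equals the stored [math]\displaystyle{ (A_i, B_i) }[/math], sitting at the local energy minimum that learning "dug" for it.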