Hierarchical Recurrent Encoder-Decoder (HRED) Neural Network Training Algorithm

** a [[Pointer Network (Ptr-Net) Training Algorithm]],
** a [[Pointer-Generator Network Training Algorithm]].
* <B>See:</B> [[Long Short-Term Memory]], [[Recurrent Neural Network]], [[Convolutional Neural Network]], [[Gating Mechanism]], [[Encoder-Decoder Neural Network]].


----

A Hierarchical Recurrent Encoder-Decoder (HRED) Neural Network Training Algorithm is a recurrent neural network training algorithm that trains a hierarchical recurrent encoder-decoder neural network.
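The following is a minimal sketch of the HRED architecture in PyTorch. The class name, layer dimensions, and vocabulary size are illustrative assumptions rather than the published reference implementation; what it does follow from the architecture is the three-level stack of a query-level encoder, a session-level encoder, and a decoder (GRU-style recurrences are assumed here, as in Sordoni et al., 2015).

<pre>
import torch
import torch.nn as nn

class HRED(nn.Module):
    """Illustrative HRED: query-level encoder, session-level encoder, decoder."""
    def __init__(self, vocab_size=10000, emb_dim=64, qry_dim=128, ses_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Query-level encoder: folds one query into a fixed-size vector.
        self.query_enc = nn.GRU(emb_dim, qry_dim, batch_first=True)
        # Session-level encoder: one recurrent step per submitted query.
        self.session_enc = nn.GRU(qry_dim, ses_dim, batch_first=True)
        # Decoder: generates the next query from the session state.
        self.dec_init = nn.Linear(ses_dim, qry_dim)
        self.decoder = nn.GRU(emb_dim, qry_dim, batch_first=True)
        self.out = nn.Linear(qry_dim, vocab_size)

    def forward(self, session):
        """session: list of (batch, query_len) LongTensors, one per query."""
        ses_state, logits = None, []
        for prev_q, next_q in zip(session, session[1:]):
            # Encode the previous query; keep only the final hidden state.
            _, q_vec = self.query_enc(self.embed(prev_q))        # (1, B, qry_dim)
            # Advance the session-level state by one step.
            _, ses_state = self.session_enc(q_vec.transpose(0, 1), ses_state)
            # Decode the following query with teacher forcing.
            h0 = torch.tanh(self.dec_init(ses_state))            # (1, B, qry_dim)
            dec_out, _ = self.decoder(self.embed(next_q[:, :-1]), h0)
            logits.append(self.out(dec_out))  # scores for tokens 1..T of next_q
        return logits
</pre>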



References

2015

(Sordoni et al., 2015) ⇒ Alessandro Sordoni, Yoshua Bengio, Hossein Vahabi, Christina Lioma, Jakob Grue Simonsen, and Jian-Yun Nie. (2015). "A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion." In: Proceedings of the 24th ACM International Conference on Information and Knowledge Management (CIKM 2015).
Figure 3: The hierarchical recurrent encoder-decoder (HRED) for query suggestion. Each arrow is a non-linear transformation. The user types cleveland gallery → lake erie art. During training, the model encodes cleveland gallery, updates the session-level recurrent state and maximizes the probability of seeing the following query lake erie art. The process is repeated for all queries in the session. During testing, a contextual suggestion is generated by encoding the previous queries, by updating the session-level recurrent states accordingly and by sampling a new query from the last obtained session-level recurrent state. In the example, the generated contextual suggestion is cleveland indian art.
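To make the training and testing procedure described in this caption concrete, the sketch below reuses the illustrative HRED class from the earlier code block: training maximizes the log-probability of each following query given the session state, and testing decodes a new query from the last session-level state. The token ids, BOS id, and length cap are toy assumptions, and greedy argmax decoding stands in for the sampling step described above.

<pre>
import torch
import torch.nn.functional as F

model = HRED()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# A toy session of two queries (batch size 1, four token ids each); real
# input would be the tokenized queries "cleveland gallery", "lake erie art".
session = [torch.randint(1, 10000, (1, 4)) for _ in range(2)]

# Training: maximize the probability of each following query in the session.
logits = model(session)
loss = sum(F.cross_entropy(lg.reshape(-1, lg.size(-1)), nq[:, 1:].reshape(-1))
           for lg, nq in zip(logits, session[1:]))
opt.zero_grad(); loss.backward(); opt.step()

# Testing: encode the observed queries, update the session-level state per
# query, then decode a contextual suggestion from the last obtained state.
with torch.no_grad():
    ses_state = None
    for q in session:
        _, q_vec = model.query_enc(model.embed(q))
        _, ses_state = model.session_enc(q_vec.transpose(0, 1), ses_state)
    h = torch.tanh(model.dec_init(ses_state))
    tok, suggestion = torch.tensor([[1]]), []     # 1 = assumed BOS id
    for _ in range(10):                           # cap the suggestion length
        dec_out, h = model.decoder(model.embed(tok), h)
        tok = model.out(dec_out).argmax(-1)       # (1, 1) next token id
        suggestion.append(tok.item())
</pre>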