huggingface.co/transformers.EncoderDecoderModel Library
A huggingface.co/transformers.EncoderDecoderModel Library is a huggingface.co/transformers Library that provides an encoder-decoder (sequence-to-sequence) model class, which can be initialized from pretrained encoder and decoder checkpoints.
- See: Neural NLU, Neural NLG.
References
2021
- https://huggingface.co/transformers/model_doc/encoderdecoder.html
- QUOTE: The EncoderDecoderModel can be used to initialize a sequence-to-sequence model with any pretrained autoencoding model as the encoder and any pretrained autoregressive model as the decoder.
The effectiveness of initializing sequence-to-sequence models with pretrained checkpoints for sequence generation tasks was shown in Leveraging Pre-trained Checkpoints for Sequence Generation Tasks by Sascha Rothe, Shashi Narayan, Aliaksei Severyn.
After such an EncoderDecoderModel has been trained/fine-tuned, it can be saved/loaded just like any other models (see the examples for more information).
An application of this architecture could be to leverage two pretrained BertModel as the encoder and decoder for a summarization model as was shown in: Text Summarization with Pretrained Encoders by Yang Liu and Mirella Lapata.
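The quoted documentation describes warm-starting a sequence-to-sequence model from two pretrained checkpoints (e.g., BERT-to-BERT for summarization) and then saving/loading it like any other transformers model. The sketch below illustrates that workflow with the documented EncoderDecoderModel.from_encoder_decoder_pretrained API; the checkpoint names, special-token settings, output directory, and sample input are illustrative assumptions, and a usable summarizer would still require fine-tuning.
```python
# Minimal sketch: warm-start a BERT-to-BERT seq2seq model with EncoderDecoderModel.
# Checkpoint names, token settings, paths, and the sample text are illustrative.
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Initialize a seq2seq model from two pretrained checkpoints: the first serves
# as the (autoencoding) encoder, the second as the (autoregressive) decoder,
# for which cross-attention layers are added and self-attention is made causal.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# BERT has no dedicated BOS/EOS tokens, so reuse CLS/SEP for generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# After training/fine-tuning, the model can be saved and reloaded
# just like any other transformers model.
model.save_pretrained("bert2bert-summarization")
model = EncoderDecoderModel.from_pretrained("bert2bert-summarization")

# Generate a (here untrained, hence low-quality) summary for one input document.
inputs = tokenizer("A long news article to summarize ...", return_tensors="pt")
summary_ids = model.generate(inputs.input_ids, max_length=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```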