Neural-based Character-Level Text Error Correction (TEC) System


A Neural-based Character-Level Text Error Correction (TEC) System is a neural-based TEC system that operates at the character level by implementing a neural character-level TEC algorithm.

  • Context:
    • ...
  • Example(s):
    • a Neural Language Model-based Character-Level TEC System.
    • a Neural Encoder-Decoder RNN-based Character-Level TEC System, such as:
    • a Fairseq-based Character-Level Text Error Correction System.
  • Counter-Example(s):
    • an MLE-based Character-Level TEC Algorithm.
    • a Neural-based Word/Token-level TEC Algorithm.
  • See: Character-level Text Generation Algorithm.
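
The character-level framing in the definition above (which separates this concept from the word/token-level counter-examples) can be illustrated with a small sketch; the example sentence and variable names below are hypothetical and not taken from the cited papers:

  # Character-level TEC models operate on sequences of characters,
  # whereas word/token-level TEC models operate on sequences of tokens.
  noisy = "She go too school ."          # hypothetical erroneous input
  corrected = "She goes to school ."     # hypothetical corrected output
  char_units = list(noisy)               # ['S', 'h', 'e', ' ', 'g', 'o', ...]
  word_units = noisy.split()             # ['She', 'go', 'too', 'school', '.']
  # A character-level system maps list(noisy) to list(corrected), which lets it
  # handle orthographic and sub-word errors that a fixed word vocabulary cannot.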


References

2017

  • (Schmaltz et al., 2017) ⇒ Allen Schmaltz, Yoon Kim, Alexander Rush, and Stuart Shieber. (2017). “Adapting Sequence Models for Sentence Correction.” In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing.
    • QUOTE: … In a controlled experiment of sequence-to-sequence approaches for the task of sentence correction, we find that character-based models are generally more effective than word-based models and models that encode subword information via convolutions, and that modeling the output data as a series of diffs improves effectiveness over standard approaches. ...
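
The "series of diffs" output representation mentioned in the quote can be sketched with Python's standard difflib; the <del>/<ins> tags and the to_diff_target helper below are illustrative assumptions, not Schmaltz et al.'s implementation:

  import difflib

  def to_diff_target(source: str, corrected: str) -> str:
      """Encode the corrected string as copy/delete/insert operations over the source."""
      out = []
      matcher = difflib.SequenceMatcher(a=source, b=corrected)
      for op, i1, i2, j1, j2 in matcher.get_opcodes():
          if op == "equal":
              out.append(source[i1:i2])                          # copied unchanged
          if op in ("replace", "delete"):
              out.append("<del>" + source[i1:i2] + "</del>")     # span removed from the source
          if op in ("replace", "insert"):
              out.append("<ins>" + corrected[j1:j2] + "</ins>")  # span inserted into the output
      return "".join(out)

  print(to_diff_target("He go to school .", "He goes to school ."))
  # -> "He go<ins>es</ins> to school ."

Training a character-level sequence-to-sequence model to emit such diff sequences, rather than the full corrected sentence, is the design choice the quote reports as improving effectiveness over standard approaches.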

2016

  • (Xie et al., 2016) ⇒ Ziang Xie, Anand Avati, Naveen Arivazhagan, Dan Jurafsky, and Andrew Y. Ng. (2016). “Neural Language Correction with Character-Based Attention.” In: CoRR, abs/1603.09727.
    • QUOTE: Natural language correction has the potential to help language learners improve their writing skills. While approaches with separate classifiers for different error types have high precision, they do not flexibly handle errors such as redundancy or non-idiomatic phrasing. On the other hand, word and phrase-based machine translation methods are not designed to cope with orthographic errors, and have recently been outpaced by neural models. Motivated by these issues, we present a neural network-based approach to language correction. The core component of our method is an encoder-decoder recurrent neural network with an attention mechanism. ...
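
As a rough illustration of the architecture the quote describes (a character-level encoder-decoder RNN with attention), the following is a minimal PyTorch sketch; the CharCorrector class name, hyperparameters, and dot-product attention variant are assumptions for illustration and do not reproduce Xie et al.'s model:

  import torch
  import torch.nn as nn

  class CharCorrector(nn.Module):
      """Sketch of a character-level encoder-decoder GRU with attention (teacher forcing)."""
      def __init__(self, vocab_size: int, emb: int = 64, hidden: int = 128):
          super().__init__()
          self.embed = nn.Embedding(vocab_size, emb)
          self.encoder = nn.GRU(emb, hidden, batch_first=True, bidirectional=True)
          self.decoder = nn.GRU(emb + 2 * hidden, hidden, batch_first=True)
          self.attn_proj = nn.Linear(hidden, 2 * hidden)
          self.out = nn.Linear(hidden, vocab_size)

      def forward(self, src_chars, tgt_in):
          # Encode the noisy character sequence into contextual states.
          enc_states, _ = self.encoder(self.embed(src_chars))            # (B, S, 2H)
          dec_hidden = torch.zeros(1, src_chars.size(0), self.decoder.hidden_size)
          logits = []
          for t in range(tgt_in.size(1)):
              # Dot-product attention of the decoder state over the encoder states.
              query = self.attn_proj(dec_hidden[-1]).unsqueeze(1)        # (B, 1, 2H)
              scores = torch.bmm(query, enc_states.transpose(1, 2))      # (B, 1, S)
              context = torch.bmm(scores.softmax(dim=-1), enc_states)    # (B, 1, 2H)
              # Decode one corrected character, conditioned on the previous gold character.
              step_in = torch.cat([self.embed(tgt_in[:, t:t + 1]), context], dim=-1)
              dec_out, dec_hidden = self.decoder(step_in, dec_hidden)
              logits.append(self.out(dec_out))
          return torch.cat(logits, dim=1)                                # (B, T, vocab)

  # Usage sketch: tgt begins with a BOS character id; training uses per-character
  # cross-entropy against the corrected sequence shifted by one position.
  model = CharCorrector(vocab_size=100)
  src = torch.randint(0, 100, (2, 20))   # noisy character ids
  tgt = torch.randint(0, 100, (2, 22))   # BOS + corrected character ids
  logits = model(src, tgt[:, :-1])
  loss = nn.CrossEntropyLoss()(logits.reshape(-1, 100), tgt[:, 1:].reshape(-1))

At inference time the gold characters would be replaced by the model's own previous predictions, typically with greedy or beam-search decoding.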

Category:Concept