Deep Neural Network-based Text Segmentation Algorithm

Revision as of 01:47, 28 January 2024

A Deep Neural Network-based Text Segmentation Algorithm is a text segmentation algorithm that employs deep neural networks to perform a text segmentation task.



References

2017

  • (Zhai et al., 2017) ⇒ Feifei Zhai, Saloni Potdar, Bing Xiang, and Bowen Zhou. (2017). “Neural Models for Sequence Chunking.” In: Proceedings of the AAAI Conference on Artificial Intelligence, 31(1).
    • ABSTRACT: Many natural language understanding (NLU) tasks, such as shallow parsing (i.e., text chunking) and semantic slot filling, require the assignment of representative labels to the meaningful chunks in a sentence. Most of the current deep neural network (DNN) based methods consider these tasks as a sequence labeling problem, in which a word, rather than a chunk, is treated as the basic unit for labeling. These chunks are then inferred by the standard IOB (Inside-Outside-Beginning) labels. In this paper, we propose an alternative approach by investigating the use of DNN for sequence chunking, and propose three neural models so that each chunk can be treated as a complete unit for labeling. Experimental results show that the proposed neural sequence chunking models can achieve state-of-the-art performance on both the text chunking and slot filling tasks.
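The IOB (Inside-Outside-Beginning) scheme that the abstract contrasts with chunk-level labeling can be illustrated with a short decoding routine. This is a hypothetical sketch for illustration only, not code from the paper; the function name, the example tokens, and the slot labels (`fromloc`, `toloc`) are assumptions in the style of slot-filling data.

```python
def iob_chunks(tokens, tags):
    """Recover labeled chunks from per-token IOB tags.

    Each chunk starts at a "B-" tag and extends over following "I-" tags
    with the same label; "O" marks tokens outside any chunk. Returns a
    list of (label, token_list) pairs.
    """
    chunks, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):                        # a new chunk begins here
            current = (tag[2:], [token])
            chunks.append(current)
        elif tag.startswith("I-") and current is not None and current[0] == tag[2:]:
            current[1].append(token)                    # continue the open chunk
        else:                                           # "O", or an inconsistent I- tag
            current = None
    return chunks


# Hypothetical slot-filling example: word-level IOB tags are decoded
# back into the two chunks they encode.
tokens = ["show", "flights", "from", "Boston", "to", "New", "York"]
tags = ["O", "O", "O", "B-fromloc", "O", "B-toloc", "I-toloc"]
print(iob_chunks(tokens, tags))
# → [('fromloc', ['Boston']), ('toloc', ['New', 'York'])]
```

The paper's point is that this two-step scheme (tag words, then decode chunks) can be replaced by models that treat each chunk as a single labeling unit.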