2016 Globally Normalized Transition-based Neural Networks
- (Andor et al., 2016) ⇒ Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, and Michael Collins. (2016). “Globally Normalized Transition-based Neural Networks.” In: arXiv preprint arXiv:1603.06042.
Subject Headings: SyntaxNet, Parsey McParseface Model.
Notes
Cited By
2016
- https://github.com/tensorflow/models/tree/master/syntaxnet
- QUOTE: A TensorFlow implementation of the models described in Andor et al. (2016). … SyntaxNet, an open-source neural network framework for TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems. Our release includes all the code needed to train new SyntaxNet models on your own data, as well as Parsey McParseface, an English parser that we have trained for you, and that you can use to analyze English text. We see that Parsey McParseface is state-of-the-art; more importantly, with SyntaxNet you can train larger networks with more hidden units and bigger beam sizes if you want to push the accuracy even further: Andor et al. (2016)* is simply a SyntaxNet model with a larger beam and network. …
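The README's demo pipeline is driven by a shell script. As a minimal sketch of invoking it from Python (an illustrative wrapper, not part of the release: it assumes a built SyntaxNet checkout whose `syntaxnet/demo.sh` script, per the repository README, reads sentences from stdin and prints a CoNLL-style dependency parse):

```python
import subprocess

# Hypothetical wrapper around the repository's demo script. Assumes
# SyntaxNet has been built as described in the README and that this
# runs from the repository root, where syntaxnet/demo.sh lives.
sentence = "Bob brought the pizza to Alice."

result = subprocess.run(
    ["syntaxnet/demo.sh"],   # reads one sentence per line from stdin
    input=sentence,
    capture_output=True,
    text=True,
    check=True,
)

# Parsey McParseface emits its analysis in CoNLL format on stdout.
print(result.stdout)
```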
Quotes
Abstract
We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. The key insight is based on a novel proof illustrating the label bias problem and showing that globally normalized models can be strictly more expressive than locally normalized models.
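The abstract's contrast between local and global normalization can be made concrete with a toy two-step transition system. The sketch below is purely illustrative (the scores are made up and this is not the paper's model): a locally normalized model applies a softmax over actions at every step and multiplies the per-step probabilities, while a globally normalized model exponentiates the summed raw score of the whole action sequence and normalizes by a partition function over all complete sequences.

```python
import numpy as np

# Toy transition system: two steps. The first action is 'a' or 'b';
# the second action's raw scores depend on the first action (history-
# conditioned scores, as in a transition-based parser). All numbers
# are invented for illustration only.
step1 = {"a": 1.0, "b": 1.1}
step2 = {"a": {"c": 4.0, "d": 0.0},   # after 'a': strong evidence for 'c'
         "b": {"c": 0.5, "d": 0.5}}   # after 'b': no preference

def softmax(scores):
    z = sum(np.exp(v) for v in scores.values())
    return {k: np.exp(v) / z for k, v in scores.items()}

def local_prob(seq):
    """Locally normalized: softmax over actions at each step,
    per-step probabilities multiplied along the derivation."""
    first, second = seq
    return softmax(step1)[first] * softmax(step2[first])[second]

def global_prob(seq):
    """Globally normalized (CRF-style): exponentiate the summed raw
    score of the full sequence, normalize over all complete sequences."""
    def total(s):
        return step1[s[0]] + step2[s[0]][s[1]]
    all_seqs = [(f, s) for f in step1 for s in step2[f]]
    Z = sum(np.exp(total(s)) for s in all_seqs)  # partition function
    return np.exp(total(seq)) / Z

for seq in [("a", "c"), ("a", "d"), ("b", "c"), ("b", "d")]:
    print(seq, f"local={local_prob(seq):.3f}", f"global={global_prob(seq):.3f}")
```

With these numbers the local model can give ("a", "c") at most the step-1 softmax mass of "a" (about 0.48) no matter how strong the later evidence is, because each step's probabilities must sum to one before the future is seen; the global model lets the later evidence dominate (about 0.92). This inability of locally normalized models to revise early decisions in light of later evidence is the label bias problem the abstract refers to.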
References
| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|
| Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, Michael Collins | | 2016 | Globally Normalized Transition-based Neural Networks | | arXiv preprint arXiv:1603.06042 | | | | 2016 |