A Neural Probabilistic Structured-Prediction Model for Transition-Based Dependency Parsing
http://anthology.aclweb.org/P/P15/P15-1117.pdf
Neural probabilistic parsers are attractive for their ability to combine features automatically and for their small data sizes.
A transition-based greedy neural parser has achieved better accuracies than its linear counterpart. We propose a neural
probabilistic structured-prediction model for transition-based dependency parsing, which integrates search and learning.
Beam search is used for decoding, and contrastive learning is performed to maximize the sentence-level log-likelihood.
In standard Penn Treebank experiments, the structured neural parser achieves a 1.8% accuracy improvement over a competitive greedy neural parser baseline, giving performance comparable to the best linear parser.
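The two components named in the abstract can be sketched generically. The snippet below is a minimal illustration, not the paper's implementation: `toy_score` is a hypothetical scorer standing in for the neural network, the actions `SHIFT`/`REDUCE` are a simplified transition inventory, and the contrastive objective normalizes the gold sequence's score over the gold sequence plus the beam candidates, approximating the full sentence-level log-likelihood.

```python
import math

def beam_search(n_steps, actions, score, beam_size=4):
    """Keep the beam_size highest-scoring action sequences at each step,
    scoring a sequence by its summed log-probabilities."""
    beam = [((), 0.0)]  # (action sequence, cumulative log score)
    for _ in range(n_steps):
        candidates = []
        for seq, logp in beam:
            for a in actions:
                candidates.append((seq + (a,), logp + math.log(score(seq, a))))
        # prune to the beam_size best partial sequences
        candidates.sort(key=lambda c: c[1], reverse=True)
        beam = candidates[:beam_size]
    return beam

def log_sum_exp(xs):
    """Numerically stable log of a sum of exponentials."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def contrastive_nll(gold_logp, beam_logps):
    """Negative sentence-level log-likelihood of the gold sequence,
    normalized over the gold sequence plus the beam candidates
    (a contrastive approximation to the full partition function)."""
    return -(gold_logp - log_sum_exp([gold_logp] + beam_logps))

# toy scorer (hypothetical): prefers SHIFT for the first two steps,
# REDUCE afterwards, so the best 4-step sequence is S, S, R, R
def toy_score(seq, action):
    return 0.8 if (action == "SHIFT") == (len(seq) < 2) else 0.2

best_seq, best_logp = beam_search(4, ["SHIFT", "REDUCE"], toy_score)[0]
```

In training, the loss would be `contrastive_nll` of the gold sequence's model score against the scores of the sequences surviving in the beam, so that search and learning share the same beam.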