NeurIPS 2020

Strongly Incremental Constituency Parsing with Graph Neural Networks

Meta Review

This is a borderline paper. The technical contribution is interesting and appreciated by the reviewers. The results match the state of the art on PTB and are better on CTB. There are, however, some concerns with the paper. One of the reviewers summarized them very well: "In its present form, the scope of the paper seems too narrow. It is also somewhat unclear whom the intended audience ought to be. If the work aims to say something about psycholinguistics, the experiment should reflect that. If the work's goal is to support NLP applications, further justifications and motivations should be provided as to how a strongly incremental constituency parser might be useful in a current NLP pipeline. If the work aims to shed lights on our understanding of GNN, the paper would need to be refocused accordingly."

It is indeed unclear what the impact of the paper in its current form is. The authors claim that the main novelty with respect to other incremental parsers is the action space, but the ablation experiments show that the attach-juxtapose transitions have a much smaller impact on performance compared to the GNN, at least on PTB (maybe the impact on CTB is larger?). The authors acknowledge that incremental parsing would indeed be more important in speech than in text, but then consider evaluation on speech to be future work. The authors claim that incremental parsing would be useful in NLP applications, but for the example application given, a human-like conversational agent, speech would again be more appropriate than text.