Stanford NLP (Coursera) Notes (15) - Dependency Parsing

1. Introduction

Dependency syntax postulates that syntactic structure consists of lexical items linked by binary, asymmetric relations called dependencies, drawn as arcs from a head to its dependent: head \(\rightarrow\) dependent. The dependencies usually form a tree (connected, acyclic, with a single root).
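
To make this concrete, here is a small hypothetical example (sentence and analysis are my own, not from the lecture): a dependency parse can be stored simply as the index of each word’s head, with an artificial ROOT governing the main verb.

```python
# A dependency parse stored as a head index for every word; index 0 is an
# artificial ROOT node. Sentence and analysis are made up for illustration.
sentence = ["ROOT", "I", "ate", "fish", "with", "chopsticks"]
head = {1: 2, 2: 0, 3: 2, 4: 2, 5: 4}   # head[i] = index of the head of word i

# Each word has exactly one head; here the arcs form a tree rooted at ROOT.
for dep in sorted(head):
    print(f"{sentence[head[dep]]} -> {sentence[dep]}")
# ate -> I, ROOT -> ate, ate -> fish, ate -> with, with -> chopsticks
```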

Compared with phrase structure grammar (the previous couple of sections), dependency grammar makes the notion of a “head” explicit. If the heads are marked in a phrase structure tree, a dependency representation can also be read off from that tree.
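
A rough sketch of that conversion (my own illustration, not code from the course): recursively find the lexical head word of every node by following marked head children, then add an arc from each node’s head word to the head word of every non-head child.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical headed constituency node: leaves carry a word as their label,
# internal nodes mark which child is the head. The example tree is made up.
@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)
    head_child: Optional[int] = None   # index of the head child, None for leaves

def extract_dependencies(node: Node, arcs: list) -> str:
    """Return the head word of this subtree, collecting (head -> dependent) arcs."""
    if not node.children:
        return node.label
    child_heads = [extract_dependencies(c, arcs) for c in node.children]
    head_word = child_heads[node.head_child]
    for i, dep_word in enumerate(child_heads):
        if i != node.head_child:
            arcs.append((head_word, dep_word))
    return head_word

# (S (NP the cat) (VP sat)): "cat" heads the NP, the VP heads the S.
tree = Node("S", [Node("NP", [Node("the"), Node("cat")], head_child=1),
                  Node("VP", [Node("sat")], head_child=0)], head_child=1)
arcs = []
root = extract_dependencies(tree, arcs)
print(root, arcs)   # sat [('cat', 'the'), ('sat', 'cat')]
```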

Dependency parsing methods include:

  • Dynamic programming (like the CKY algorithm)
  • Graph algorithms (create a minimum spanning tree for the sentence; see the sketch after this list)
  • Constraint satisfaction (remove edges that don’t satisfy hard constraints)
  • “Deterministic parsing” (greedy choice of attachments guided by a machine learning classifier, e.g. MaltParser)
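
To illustrate the graph-based view (a deliberately simplified sketch with made-up scores, not the actual algorithm): assign a score to every candidate (head, dependent) arc and decode the best tree over them. The code below just picks the highest-scoring head per word greedily; a real MST parser would run Chu-Liu/Edmonds over the same scores to guarantee a tree.

```python
# Hypothetical arc scores: score[h][d] = how plausible "word h heads word d" is.
# Index 0 is an artificial ROOT; the sentence and numbers are made up.
words = ["ROOT", "she", "reads", "books"]
score = [
    [0.0, 0.1, 0.9, 0.2],   # ROOT as head
    [0.0, 0.0, 0.1, 0.1],   # "she" as head
    [0.0, 0.8, 0.0, 0.7],   # "reads" as head
    [0.0, 0.1, 0.2, 0.0],   # "books" as head
]

# Greedy simplification: pick the best head for each word independently.
# This can produce cycles in general; a spanning tree algorithm avoids that.
for d in range(1, len(words)):
    best = max((h for h in range(len(words)) if h != d), key=lambda h: score[h][d])
    print(f"{words[best]} -> {words[d]}")
# reads -> she, ROOT -> reads, reads -> books
```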

Sources of information (a toy feature sketch follows this list):

  • Bilexical affinities: [issues \(\rightarrow\) the] is plausible
  • Dependency distance: most dependencies hold between nearby words
  • Intervening material: dependencies rarely span intervening verbs or punctuation
  • Valency of heads: how many dependents on which side are usual for a head?
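
As a toy feature sketch (feature names, sentence, and analysis are my own, not from the lecture), the first three sources can be encoded as features for scoring a single candidate arc; valency is left out because it depends on the whole partial tree rather than on one arc.

```python
# Toy feature extractor for a candidate arc head -> dependent.
# All feature names are made up for illustration.
def arc_features(words, head, dep):
    feats = {}
    # Bilexical affinity: which word pair the arc would connect.
    feats[f"pair={words[head]}_{words[dep]}"] = 1.0
    # Dependency distance: most arcs connect nearby words.
    feats["distance"] = abs(head - dep)
    # Intervening material: punctuation between the two words makes the arc less likely.
    lo, hi = sorted((head, dep))
    feats["intervening_punct"] = sum(w in {",", ";", ":"} for w in words[lo + 1:hi])
    return feats

words = ["ROOT", "I", "saw", "the", "movie", "yesterday"]
print(arc_features(words, head=2, dep=4))
# {'pair=saw_movie': 1.0, 'distance': 2, 'intervening_punct': 0}
```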

2. Greedy Transition-Based Parsing