Lexicalized Parsing.
1. Lexicalization of PCFG
In the previous section, the rules made no reference to actual words, yet the actual words are helpful in some cases. For example, the likelihood of a \(PP\) following “walked” is very high, but it is very low after “saw”. Putting the properties of words back into a PCFG captures more of the probabilistic conditioning information needed to make parsing decisions, as in the sketch below.
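A minimal sketch of the conditioning idea, assuming a toy table of lexicalized rule probabilities; the rules, head words, and numbers are made up purely to illustrate how the same rule gets a different probability depending on the head word (they are not estimates from any real treebank or from Charniak’s model):

```python
# Toy table of P(rule | head word of the VP). The probabilities are
# illustrative assumptions only: a PP under the VP is likely when the
# head verb is "walked", unlikely when it is "saw".
lexicalized_rule_probs = {
    ("VP -> VBD PP", "walked"): 0.40,
    ("VP -> VBD NP", "walked"): 0.10,
    ("VP -> VBD PP", "saw"):    0.02,
    ("VP -> VBD NP", "saw"):    0.55,
}

def rule_prob(rule: str, head: str) -> float:
    """Look up P(rule | head). Unseen pairs get probability 0 here;
    a real model would smooth or back off to the unlexicalized rule."""
    return lexicalized_rule_probs.get((rule, head), 0.0)

print(rule_prob("VP -> VBD PP", "walked"))  # high: 0.40
print(rule_prob("VP -> VBD PP", "saw"))     # low:  0.02
```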
2. Charniak’s Model
The actual parsing is done bottom-up, as in the CKY algorithm.
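A minimal sketch of bottom-up CKY parsing, assuming an ordinary (unlexicalized) PCFG in Chomsky normal form rather than Charniak’s full lexicalized model; the grammar, lexicon, and probabilities below are toy assumptions chosen only to show the bottom-up control flow over spans of increasing length:

```python
from collections import defaultdict

# Toy binary rules: (left child, right child) -> list of (parent, probability).
binary_rules = {
    ("NP", "VP"): [("S", 1.0)],
    ("V", "NP"):  [("VP", 0.7)],
    ("DT", "N"):  [("NP", 0.6)],
}
# Toy lexicon: word -> list of (tag, probability).
lexicon = {
    "the": [("DT", 1.0)],
    "dog": [("N", 0.5), ("NP", 0.2)],
    "saw": [("V", 0.8)],
    "cat": [("N", 0.5), ("NP", 0.2)],
}

def cky(words):
    n = len(words)
    # chart[i][j] maps a nonterminal to the best probability of deriving words[i:j].
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    # Base case: spans of length 1, filled from the lexicon.
    for i, w in enumerate(words):
        for tag, p in lexicon.get(w, []):
            chart[i][i + 1][tag] = p
    # Bottom-up: combine adjacent smaller spans into longer ones.
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            for k in range(i + 1, j):
                for left, pl in chart[i][k].items():
                    for right, pr in chart[k][j].items():
                        for parent, p_rule in binary_rules.get((left, right), []):
                            p = p_rule * pl * pr
                            if p > chart[i][j][parent]:
                                chart[i][j][parent] = p
    return chart[0][n].get("S", 0.0)

print(cky("the dog saw the cat".split()))  # probability of the best S parse
```

A lexicalized parser such as Charniak’s fills the chart in the same bottom-up order; the difference is that chart entries also carry head words and the rule probabilities are conditioned on them, as in the previous sketch.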