Inducing Probabilistic Grammars by Bayesian Model Merging

1994-09-13
We describe a framework for inducing probabilistic grammars from corpora of positive samples. First, samples are *incorporated* by adding ad hoc rules to a working grammar; subsequently, elements of the model (such as states or nonterminals) are *merged* to achieve generalization and a more compact representation. The choice of what to merge and when to stop is governed by the Bayesian posterior probability of the grammar given the data, which formalizes a trade-off between a close fit to the data and a default preference for simpler models ("Occam's Razor"). The general scheme is illustrated using three types of probabilistic grammars: hidden Markov models, class-based n-grams, and stochastic context-free grammars.
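To make the incorporate-then-merge loop concrete, here is a toy Python sketch of the HMM case. It is not the paper's implementation; it assumes simplifications for brevity: each state deterministically emits a single symbol, the structural prior is a plain description-length penalty on the number of states (the weight `PRIOR_WEIGHT` is a hypothetical knob), and search is greedy best-first merging that stops when no merge improves the log posterior. All names (`incorporate`, `model_merging`, etc.) are illustrative.

```python
import math
from itertools import combinations

START, END = "<s>", "</s>"
PRIOR_WEIGHT = 1.0  # hypothetical knob: strength of the simplicity prior

def incorporate(samples):
    """Incorporation step: add one ad hoc state chain per sample."""
    emit, trans, sid = {}, {}, 0          # state -> symbol; (a, b) -> count
    for seq in samples:
        prev = START
        for sym in seq:
            state = "q%d" % sid
            sid += 1
            emit[state] = sym
            trans[(prev, state)] = trans.get((prev, state), 0) + 1
            prev = state
        trans[(prev, END)] = trans.get((prev, END), 0) + 1
    return emit, trans

def log_likelihood(emit, trans, samples):
    """Forward algorithm; each state emits its own symbol with probability 1."""
    out, succ = {}, {}
    for (a, b), c in trans.items():
        out[a] = out.get(a, 0) + c
        succ.setdefault(a, []).append(b)
    prob = lambda a, b: trans[(a, b)] / out[a]   # ML transition estimate
    total = 0.0
    for seq in samples:
        alpha = {START: 1.0}
        for sym in seq:
            nxt = {}
            for a, pa in alpha.items():
                for b in succ.get(a, []):
                    if emit.get(b) == sym:
                        nxt[b] = nxt.get(b, 0.0) + pa * prob(a, b)
            alpha = nxt
        p_end = sum(pa * prob(a, END)
                    for a, pa in alpha.items() if (a, END) in trans)
        if p_end <= 0.0:
            return float("-inf")
        total += math.log(p_end)
    return total

def log_posterior(emit, trans, samples):
    """Simplicity prior (fewer states => higher prior) plus data likelihood."""
    return -PRIOR_WEIGHT * len(emit) + log_likelihood(emit, trans, samples)

def merge_states(emit, trans, keep, gone):
    """Merge state `gone` into `keep`, summing transition counts."""
    new_emit = {s: y for s, y in emit.items() if s != gone}
    new_trans = {}
    for (a, b), c in trans.items():
        a = keep if a == gone else a
        b = keep if b == gone else b
        new_trans[(a, b)] = new_trans.get((a, b), 0) + c
    return new_emit, new_trans

def model_merging(samples):
    """Greedy best-first merging while the Bayesian posterior improves."""
    emit, trans = incorporate(samples)
    score = log_posterior(emit, trans, samples)
    while True:
        best = None
        for s1, s2 in combinations(sorted(emit), 2):
            if emit[s1] != emit[s2]:
                continue  # only states emitting the same symbol may merge
            e, t = merge_states(emit, trans, s1, s2)
            sc = log_posterior(e, t, samples)
            if sc > score and (best is None or sc > best[0]):
                best = (sc, e, t)
        if best is None:
            return emit, trans, score
        score, emit, trans = best

if __name__ == "__main__":
    samples = [list("ab"), list("ab"), list("abab")]
    emit, trans, score = model_merging(samples)
    print(len(emit), "states after merging; log posterior =", round(score, 3))
```

On this tiny corpus the sketch should collapse the three per-sample chains into a compact model with one "a" state and one "b" state that also generates longer repetitions such as "ababab", illustrating the generalization-versus-fit trade-off the abstract describes: each merge costs likelihood but is rewarded by the simplicity prior.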