Lexicon-Grammar is a method and a practice of formalized description of human languages. It was developed by Maurice Gross from the end of the 1960s onward.
Its theoretical basis is Zellig S. Harris's[1][2] distributionalism, notably the notion of transformational grammar. Its notation conventions are intended to be as clear and comprehensible as possible.
The method of Lexicon-Grammar is inspired by the hard sciences. It focuses on the collection of data, and hence on the real use of language, from both a quantitative and a qualitative point of view.
Lexicon-Grammar also requires formalization. The results of the description must be formal enough to be applied to parsing, in particular through the construction of syntactic analyzers. The formal model is such that the results of the description take the form of double-entry tables, also called matrices. Lexicon-Grammar tables code lexical entries together with their syntactico-semantic properties; as a result, they formalize syntactico-semantic information.
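As an illustrative sketch only (the entries and property names below are hypothetical, not taken from any actual Lexicon-Grammar table), such a double-entry matrix can be modeled as rows of lexical entries and columns of syntactico-semantic properties, each cell recording whether the entry accepts the property:

```python
# Hypothetical toy example of a lexicon-grammar-style table:
# rows are lexical entries (verbs), columns are syntactico-semantic
# properties; True/False corresponds to the "+"/"-" cells of a table.
TABLE = {
    "admire":  {"N0_human": True,  "N1_human": True,  "passive": True},
    "rain":    {"N0_human": False, "N1_human": False, "passive": False},
}

def accepts(entry: str, prop: str) -> bool:
    """Return whether the lexical entry is marked '+' for the property."""
    return TABLE[entry][prop]

print(accepts("admire", "passive"))  # True
print(accepts("rain", "N0_human"))   # False
```

A parser could consult such a table to reject constructions that a given entry does not accept, which is how the formalized description supports syntactic analysis.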