Corpus Pattern Analysis: mapping meaning onto use

Patrick Hanks

Brandeis University

March 8, 2006, at 2:00 p.m.

Room B-4220, Pavillon Jean-Brillant


How do people know what a word means in any given text? How can this knowledge be made machine-tractable? The problem of "word sense disambiguation" has proved intractable despite decades of effort. Now, two of the leaders in the field (Wilks and Ide 2005) offer a counsel of despair: NLP should restrict itself to coarse-grained disambiguations, distinguishing only mutually exclusive senses ("homographs") and abandoning any attempt at "sense determination". Since the senses of most words consist of fuzzy sets of overlapping semantic features, this implies giving up altogether on any serious attempt to process the meaning of texts computationally.

In this talk, I suggest that sense determination does not have to be abandoned, but can be approached in a new way. Instead of being attached to words in isolation, meanings can be attached to words in context, specifically normal context. Instead of manipulating existing dictionaries, which were created for human use and which contain little information about syntagmatics, NLP needs lexicographic expertise to create a new kind of resource: a Pattern Dictionary, listing all normal syntagmatic patterns of predicators (verbs and predicative adjectives) and providing links to an ontology stating the relevant semantics of nouns in each argument role. The aim is to offer fine-grained semantic distinctions for words in different contexts, so that meanings can be processed by pattern matching on a "best-match" basis.

Words in isolation have massive entropy. The entropy of a word in isolation is reduced by putting it in context; add more context and lexical entropy is reduced still further. The implications for NLP of this aspect of lexical semantics remain to be worked through. Up to now, American linguists and NLP researchers have not taken pattern analysis seriously, perhaps because they assume that the number of patterns will be unmanageably large. In fact, serious corpus analysis shows that the number of normal syntagmatic patterns for each predicator is manageably small. Regular usage is very regular and can be captured in a lexicon; irregular usage must be the subject of second-order computation in relation to regular usage.
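To make the "best-match" idea concrete, here is a minimal sketch in Python of matching a clause against pattern-dictionary-style entries. The pattern names, semantic types, and overlap score are invented for illustration; a real Pattern Dictionary entry is far richer than a role-to-type map.

```python
# A toy sketch of "best-match" pattern matching. The patterns, semantic
# types, and scoring scheme are hypothetical, not the actual Pattern
# Dictionary format.

PATTERNS = {
    # pattern id: expected semantic type for each argument role
    "fire 1 ([[Human]] fires [[Human]]) = dismiss": {
        "subject": "Human", "object": "Human"},
    "fire 2 ([[Human]] fires [[Firearm]]) = shoot": {
        "subject": "Human", "object": "Firearm"},
}

def best_match(clause):
    """Pick the pattern whose argument types best overlap the clause's."""
    def score(expected):
        return sum(clause.get(role) == sem_type
                   for role, sem_type in expected.items())
    return max(PATTERNS, key=lambda name: score(PATTERNS[name]))

# "The board fired the manager": both arguments are [[Human]].
print(best_match({"subject": "Human", "object": "Human"}))
```

On this view, disambiguation becomes a matter of scoring the fit between an observed clause and a small inventory of normal patterns, rather than choosing among abstract sense definitions.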
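The entropy claim can also be illustrated numerically. The sketch below uses an invented sense distribution for a single verb to show how conditioning on one piece of contextual information (the semantic type of an argument) lowers the Shannon entropy of the sense distribution; all figures are hypothetical.

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical P(sense) for the verb "fire" in isolation.
p_sense = {"dismiss": 0.40, "shoot": 0.35, "bake (pottery)": 0.25}

# Hypothetical P(sense | direct object is [[Human]]): one argument role
# filled in, and the distribution sharpens considerably.
p_given_human_object = {"dismiss": 0.90, "shoot": 0.10, "bake (pottery)": 0.0}

print(f"{entropy(p_sense):.2f} bits")              # ~1.56 bits in isolation
print(f"{entropy(p_given_human_object):.2f} bits") # ~0.47 bits in context
```

In expectation over contexts, H(sense | context) never exceeds H(sense), which is one way of stating why normal context is such a powerful cue to meaning.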


To receive the weekly announcements by email, visit http://rali.iro.umontreal.ca/rali/?q=fr/node/1631
