Lots of good ideas for computational linguists
This review is from: Foundations of Language: Brain, Meaning, Grammar, Evolution (Paperback)
I have a rather specialist interest.
I am writing software to do semantic searching of the Web, with the aid of WordNet, a home-brewed ruleset that maps non-consecutive words directly onto semantic relations, and, eventually, an analysis of Wikipedia definitions to override the shortcomings of WordNet.
These shortcomings are illustrated by WordNet's classification of "cow" as a bovine, which is a bovid, which is a ruminant, which is an ungulate, which is a mammal.
That a cow is a domesticated animal, is involved with dairy, and is a component of the countryside are facts totally unknown to WordNet, although probably well known to any teenager.
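The contrast between a bare hypernym chain and the everyday associations a teenager knows can be sketched in a few lines. This is a toy illustration, not WordNet's actual API; both tables are hand-entered:

```python
# Toy knowledge base contrasting a WordNet-style hypernym chain
# with the associative facts the taxonomy lacks.
# All data here is hand-entered for illustration.

hypernyms = {
    "cow": "bovine",
    "bovine": "bovid",
    "bovid": "ruminant",
    "ruminant": "ungulate",
    "ungulate": "mammal",
}

# Everyday associations absent from the is-a taxonomy.
associations = {
    "cow": {"domesticated animal", "dairy", "countryside"},
}

def hypernym_chain(word):
    """Follow is-a links upward until the chain runs out."""
    chain = [word]
    while chain[-1] in hypernyms:
        chain.append(hypernyms[chain[-1]])
    return chain

print(hypernym_chain("cow"))
# A taxonomy-only lookup answers "is a cow a mammal?" but not
# "is a cow associated with dairy?" -- that needs the second table.
print("dairy" in associations.get("cow", set()))
```

The point of keeping the two tables separate is that a semantic search over the first alone can never surface "dairy" from "cow", however far up the chain it walks.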
So my viewpoint is not at all Chomskyan, in contrast to currently fashionable systems like Head-Driven Phrase Structure Grammar.
Jackendoff provides me with a theoretical justification, and possibly a framework, for this work. His parallel model explains how psychological "priming" can influence how we understand language, and how reference to an ontology can cross-check syntactic parses and the semantic networks they generate. At the lower end of his multi-level model, word morphology can identify the POS of words not in the hearer's lexicon, and stemming can locate them in the ontology. So the model accounts for language acquisition as well as expert performance.
Word morphology applied to the phoneme stream routinely produces competing analyses, as in "he's got a shoe" as opposed to "he's going to shoot", as is recognized by the work on Hidden Markov Models in speech recognition. Prime the spectator with a video of a chase down the street as the sentence is spoken, and one analysis is preferred over the other.
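The priming effect described above can be sketched as a reweighting of competing analyses. This is a toy model, not how a real speech recognizer works; the analyses, scores, and prime table are invented for illustration:

```python
# Toy model of priming: two competing analyses of the same phoneme
# stream, each with an equal baseline acoustic score, reweighted by
# a visual prime. All scores and associations are invented.

analyses = {
    "he's got a shoe": 0.5,
    "he's going to shoot": 0.5,
}

# A prime boosts analyses containing words it is associated with.
prime_boost = {
    "chase": {"shoot": 2.0},   # a chase scene favors "shoot"
}

def reweight(analyses, prime):
    """Multiply in the prime's boosts, then renormalize."""
    boosts = prime_boost.get(prime, {})
    scored = {}
    for sentence, score in analyses.items():
        factor = 1.0
        for word, mult in boosts.items():
            if word in sentence:
                factor *= mult
        scored[sentence] = score * factor
    total = sum(scored.values())
    return {s: v / total for s, v in scored.items()}

posterior = reweight(analyses, "chase")
best = max(posterior, key=posterior.get)
print(best)  # the "shoot" analysis wins once primed
```

The design mirrors the parallel model's claim: the acoustic level proposes candidates, and an independent level (here, the visual context) adjudicates between them rather than the decision being made bottom-up alone.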
The following sentences illustrate how an analysis can be checked at many different levels in the multi-level model.
"The horse raced past the barn fell"
"The horse entered in the first race fell at the second fence"
"The horse raced past the barn door"
"The hippo raced past the barn door"
"The mouse raced past the barn door"
"The horse raced passed the bedroom door"
Actually, Jackendoff does not give many example sentences of this sort, which is a shame. AI researchers who have worked with "constraint programming" will recognize that this is where Jackendoff is heading, though he probably lacks the mathematical background to see it himself.
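For readers who know constraint programming, the cross-level checking can be sketched on the garden-path sentences above. This is a toy, hand-built lexicon and a single invented constraint, not a real parser: "raced" is ambiguous between a past-tense main verb and a reduced-relative participle, and the constraint "a clause has exactly one main verb" prunes readings.

```python
# Toy constraint check over the garden-path sentences:
# "raced" can be a main verb ("The horse raced past the barn door")
# or a reduced-relative participle ("The horse raced past the barn
# fell"). Lexicon and constraint are invented for illustration.

AMBIGUOUS = {"raced": ("main_verb", "participle")}
MAIN_VERBS = {"fell"}  # unambiguous main verbs in these examples

def readings(sentence):
    """Return the taggings of ambiguous words that survive the
    constraint: the clause must have exactly one main verb."""
    words = sentence.lower().strip('"').split()
    fixed_mains = [w for w in words if w in MAIN_VERBS]
    survivors = []
    for w in words:
        for tag in AMBIGUOUS.get(w, ()):
            mains = len(fixed_mains) + (1 if tag == "main_verb" else 0)
            if mains == 1:
                survivors.append((w, tag))
    return survivors

print(readings("The horse raced past the barn fell"))
# "fell" is already the main verb, so "raced" must be a participle
print(readings("The horse raced past the barn door"))
# no other main verb, so "raced" must itself be the main verb
```

Propagating constraints like this across levels (syntax, morphology, ontology) is, as far as I can tell, exactly the machinery the parallel model calls for.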
Reading the odd cognitive psychology book and one on psycholinguistics will prime the reader to recognize the issues that Jackendoff is addressing. Reading up on Machine Learning and Data Mining will also alter the way you read his book.
I do think Jackendoff's book differs a lot from earlier linguistic approaches, so it is well worth reading. Another book that helps, particularly with getting a model of language acquisition, is Adele Goldberg's "Constructions at Work", which was recommended to me by Guy Deutscher. It is full of good examples, but lacks the model that Jackendoff is building.