Natural Language Understanding Paperback – 3 Aug 1994
From the Back Cover
This long-awaited revision offers a comprehensive introduction to natural language understanding, incorporating current developments and research in the field. Building on the effective framework of the first edition, the new edition gives the same balanced coverage of syntax, semantics, and discourse, and offers a uniform framework based on feature-based context-free grammars and chart parsers for both syntactic and semantic processing. Thorough treatment of issues in discourse and context-dependent interpretation is also provided.
In addition, this title offers coverage of two entirely new subject areas. First, the text features a new chapter on statistically based methods using large corpora. Second, it includes an appendix on speech recognition and spoken language understanding. Also, the coverage of semantics from the first edition has been substantially expanded in this edition, with a new emphasis on compositional interpretation.
About the Author
James Allen is the John H. Dessaurer Professor of Computer Science at the University of Rochester. He has taught natural language processing to undergraduate and graduate students for 14 years. He is a fellow of the AAAI and was the recipient of the Presidential Young Investigator Award (1985-1989). In addition, Professor Allen was the Editor-in-Chief of Computational Linguistics from 1983-1993.
Most Helpful Customer Reviews on Amazon.com (beta)
Well, I was looking at a list of abbreviations for the categories (parts of speech) which the book used, and I noticed, for the first time after owning this book for over 10 years, that there was no abbreviation for "conjunction" listed. And indeed, after consulting the index and looking through the book, it is plain to see that this book doesn't treat conjunction at all!
I have many fond memories of this book--it is the book from which my beloved professor at grad school taught me NLP, and indeed, it contains far more information about NLP than most of its successors. For example, this book gives perhaps the best discussion of quantifier scope ambiguities of all the major NLP textbooks. (Compare Jurafsky and Martin's book, which devotes about half a sentence to quantifier scope ambiguities.)
But it has odd omissions, one of which is the lack of treatment of conjunction/disjunction. After devoting so much time to quantifier scope, why does Allen leave me in the dark about whether "every woman" can take scope over "a man" in the sentence "A man and every woman hug each other"? Does that scope differently from "Every woman and a man hug each other"? Or what about "Every woman and her mother fight"? Can that mean "Every woman fights with her mother," or are we to look for another antecedent for "her"?
Or again, consider Allen's treatment of Prolog-esque definite clause grammars. Allen deserves major kudos here for including them. It's obvious that he comes from the LISP side of the tracks, and most LISPy books on NLP ignore DCGs altogether (Norvig's "Paradigms of AI Programming" being a notable exception). But it seems almost like Allen goes out of his way to present DCGs in the most unattractive light possible. Prolog has a nice syntactic sugar which makes a DCG look almost exactly like a context-free grammar specification, but you'd never know that if you only read this book--Allen chooses a weird way to translate strings into clauses, which implies a bizarre-looking Prolog grammar for them. The student naturally recoils in horror, but unless she reads a Prolog-oriented book on NLP, she would never know how much easier DCGs are to program than ATNs or the bottom-up parsing methods which Allen goes on to expound.
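To illustrate the reviewer's point about Prolog's syntactic sugar, here is a minimal sketch (not from the book) of a toy grammar in standard DCG notation. Each rule reads almost exactly like a context-free production; the Prolog system translates the `-->` rules into ordinary clauses behind the scenes.

```prolog
% A toy grammar in Prolog DCG notation.
% Each rule mirrors a context-free production: S -> NP VP, etc.
s   --> np, vp.
np  --> det, n.
vp  --> v, np.
det --> [the].
n   --> [dog].
n   --> [cat].
v   --> [chases].

% Usage: the built-in phrase/2 runs the grammar over a token list.
% ?- phrase(s, [the, dog, chases, the, cat]).
% true.
```

The contrast with a hand-written ATN or bottom-up chart parser is exactly the reviewer's complaint: the grammar writer never sees the clause-level encoding at all.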
Since this book was published, the field of NLP has taken a bit of a side-track through statistical learning of grammars--the thought being that, well, we really don't know how to do knowledge representation or pronoun resolution very well, so let's all spend a decade or so on how to induce grammars from corpora. This book doesn't cover any of this research, but frankly, I really don't consider that a critique of the book. Because now that grammar induction has been done to death, we're right back where this book leaves off--computers can parse sentences all right: heck, these days, computers can even assign numbers between 0 and 1 to parse trees--but can computers UNDERSTAND sentences?
I would love to see a 3rd edition of this book, and I'm sure I'm not alone. What I'd like to see it cover is (surprise surprise) conjunction/disjunction, discourse representation theory, underspecification, and a more meaty discussion of knowledge representation and inference. Also, a few chapters on natural language generation would be nice, as well as discussions on dialogue. Skip the sections on ATNs and other parsing methods which are only of historical interest now.
Flaws and all, this book is beloved of generations of NLP researchers and is still indispensable, after all these years.
However, some developments in the past few years have outpaced his treatment. In particular, the stochastic viewpoint has become more common in natural-language processing, and Allen does not consider related innovations in great depth. A revised version would also benefit from at least a passing mention of the interaction between NLP and speech processing and information retrieval.
I'd love to tell you more, but I only stopped by to catch the author's name and look for whitepapers on his (hopefully existent) website.
The TRAINS project at Rochester (the author's institution) was based on many of the concepts outlined in this book... proof that they work and can be made to handle real-world situations.
I continue to use it, now in its second edition, as a reference for myself and to train those who need to work with our project.
I cannot recommend a book more highly. If you want to learn computational linguistics, or need to push the state of the art, this is the book you need.
I have been working on a knowledge base ("KB") for NLP/NLU for many years, and one shortcoming that I see in all books like this is that the authors do not seem to have any direct experience with creating a KB. As a result, they spend a lot of pages, and even chapters, trying to solve problems which are already solved by a good KB design. Many of the middle chapters, as well as the appendix on logic, fall into this category.
Nevertheless, there is a lot of worthwhile material in this book. Even things which seem off-base to me still raise some issues which will require further thought and analysis. In comparison, I felt like I got nothing at all of value out of the 1000-page Speech and Language Processing book.