Context Inducing Nouns


Price, King and de Paiva
COLING 2008 workshop

Charlotte Price, Valeria de Paiva, Tracy Holloway King
Palo Alto Research Center, 3333 Coyote Hill Rd., Palo Alto, CA 94304 USA

Abstract

It is important to identify complement-taking nouns in order to properly analyze the grammatical and implicative structure of the sentence. This paper examines the ways in which these nouns were identified and classified for addition to the BRIDGE natural language understanding system.

1 Introduction

One of the goals of computational linguistics is to draw inferences from a text: that is, for the system to be able to process a text, and then to conclude, based on the text, whether some other statement is true.[1] Clausal complements confound the process because, despite their surface similarity to adjuncts, they generate very different inferences.

In this paper we examine complement-taking nouns: how to identify them and how to incorporate them into an inferencing system. We first discuss what we mean by complement-taking nouns (section 2) and how to identify a list of such nouns (section 3). We then describe the question-answering system that uses the complement-taking nouns as part of its inferencing (section 4), how the nouns are added to the system (section 5), and how the coverage is tested (section 6). Finally, we discuss several avenues for future work (section 7), including automating the search process, identifying other context-inducing forms, and taking advantage of cross-linguistic data.

2 What is a complement-taking noun?

Identifying complement-taking nouns is somewhat involved. It is important to identify the clause, to ensure that the clause is indeed a complement and not an adjunct (e.g. a relative clause or a purpose infinitive), and to figure out what is licensing the complement, as it is not only nouns that license complements.

2.1 Verbal vs. nominal complements

A clause is a portion of a sentence that includes a predicate and its arguments. Clauses come in a variety of forms, a subset of which is shown in (1) for verbs taking complements. The italicized part is the complement, and the part in bold is what licenses it. The surface form of the clause can vary significantly depending on the licensing verb.

(1) a. Mary knows that Bob is happy.
    b. John wants (Mary) to leave right now.
    c. John likes fixing his bike.
    d. John let Mary fix his bike.

For this paper, we touch briefly on nouns taking to clauses, as in (2b), but the main focus is on that clauses, as in (2a).

(2) a. the fact that Mary hopped
    b. the courage to hop

Both types of complements pose problems in mining corpora for lexicon development. The that clauses can superficially resemble relative clauses, as in (3), and the to clauses can resemble purpose infinitives, as in (4).

(3) a. COMPLEMENT-TAKING NOUN: John liked the idea that Mary sang last evening.

© 2008. Licensed under the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported license. Some rights reserved.
[1] We would like to thank the Natural Language Theory and Technology group at PARC, Dick Crouch, and the three reviewers for their input.
    b. RELATIVE CLAUSE: John liked the song that Mary sang last evening.

(4) a. COMPLEMENT-TAKING NOUN: John had a chance to sing that song.
    b. PURPOSE INFINITIVE: John had a song book (in order) to sing that song.

As discussed in section 3, this superficial resemblance makes the automatic identification of complement-taking nouns very difficult: simple string-based searches would return large numbers of incorrect candidates which would have to be vetted before incorporating the new nouns into the system.

2.2 Contexts introduced by nominals

Complements and relative clause adjuncts allow very different inferences. Whereas the speaker's beliefs about adjuncts take on the truth value of the clause they are embedded in, the truth value of clausal complements is also affected by the licensing noun. Compare the sentences below. The italicized clause in (5) is a complement, while in (6) it is an adjunct.

(5) The lie that Mary was ill paralyzed Bob.
    → Mary was not ill.
(6) The situation that she had gotten herself into paralyzed Bob.
    → She had gotten herself into a situation.

To explain how this is possible, we introduce the notion of implicative contexts (Nairn et al., 2006), and claim that complement-taking nouns introduce a context for the complement, whereas no such context is created for the adjuncts. Perhaps the easiest way to think of a context is to imagine embedding the complement in an extra layer, with the layer adding information about how to adjust the truth-value of its contents.[2] This allows us to conclude in (5) that the speaker believes that Mary and Bob exist, as does the event of Bob's paralysis, but the event Mary was ill does not. These are referred to as the (un)instantiability of the components in the sentence. Contexts can be embedded within each other recursively, as in (7). Note that these semantic contexts often, but not always, correspond to syntactic embedding.

(7) Paul believes [that John's lie [that Mary worries [that fish can fly]] surprised us].

Contexts may have an implication signature (Nairn et al., 2006) attached to them, specifying, for example, that the clause is something that the speaker presupposes to be true or that the speaker believes the truth value of the clause should be reversed. The default for a context is to allow no implications to be drawn, as in (1b), where the speaker has not committed to whether or not Mary is leaving.

Below is a more detailed example showing how the context introduced by a noun changes the implications of the sentence, and how it would behave differently from a relative clause adjunct to a noun. Consider the pair of sentences in (8).

(8) a. The lie that Mary had won surprised John.
       → Mary did not win.
    b. The bonus that Mary had won surprised John.
       → Mary won a bonus.

In (8), that John was surprised is in the speaker's top context, which is what the author commits to as truth. In (8a), lie is within the context of surprised. Surprised does not change the implications of elements within its context.[3] Therefore, lie gets a true value: that a lie was told is considered true. That Mary won, however, is within the context of lie, which reverses the polarity of implications within its scope or context. If that Mary won were only within the context of surprised instead of within the lie, which would be the case if lie did not create a context, then that Mary won would fall within the context of surprised. The implication signature of surprised would determine the veridicality of the embedded clause instead of the signature of lie: this would incorrectly allow the conclusion that Mary won.

The content of the relative clause in (8b) is in the same context as surprise since no additional context is introduced by bonus. As such, we can conclude that Mary did win a bonus.

2.3 Complements introduced by to

The previous subsection focused on finite complements introduced by that.

[2] In the semantic representations, the contexts are flattened, or projected, onto the leaf nodes of the parse tree, so that every leaf has access to information locally.
[3] We say surprise has the implication signature ++/--: elements within its context have a positive implication in a positive context and negative in a negative context. See (Nairn et al., 2006) for detailed discussion of possible implication signatures and how to propagate them through contexts.
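The polarity computation just described can be illustrated with a small amount of code. This is a minimal sketch of signature composition in the spirit of (Nairn et al., 2006): the signature names follow the paper (impl_pp_nn for ++/-- nouns such as fact, impl_pn_np for polarity-reversing nouns such as lie), but the three-valued logic and the walk over a flat context chain are our own simplification, not the BRIDGE implementation.

```python
# Sketch of implication-signature composition through nested contexts.
# The signature names mirror the paper; the simple mapping tables and
# the three-valued result are our own illustrative simplification.

SIGNATURES = {
    "impl_pp_nn": {"+": "+", "-": "-"},   # e.g. fact: preserves polarity
    "impl_pn_np": {"+": "-", "-": "+"},   # e.g. lie: reverses polarity
}

def veridicality(context_chain, polarity="+"):
    """Walk from the top context down to the embedded clause,
    composing each context's signature; a context with no
    signature (None) blocks any implication."""
    for signature in context_chain:
        mapping = SIGNATURES.get(signature)
        if mapping is None:          # default context: no implications
            return "unknown"
        polarity = mapping[polarity]
    return "true" if polarity == "+" else "false"

# (8a) "The lie that Mary had won surprised John.":
# surprised (++/--) embeds lie (reversing), which embeds "Mary won".
print(veridicality(["impl_pp_nn", "impl_pn_np"]))  # prints false
# A noun with no signature (e.g. possibility) blocks the implication:
print(veridicality([None]))                        # prints unknown
```

On this toy model, adding a further reversing context flips the answer back, matching the recursive embedding shown in (7).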
From the perspective of aiding inferencing in the BRIDGE system, the nouns that take to complements that are not deverbal nouns (see section 2.4 for discussion of deverbals) seem to fall into three main classes:[4] ability, bravery, and chance. Examples are shown in (9).

(9) a. John has the ability to sing.
    b. John has the guts to sing out loud.
    c. John's chance to sing came quickly.

These all have an implication signature that gives a (negative) implication only in a negative context, as in (10); in a positive context, as in (9), no implication can be drawn.

(10) John didn't have the opportunity to sing.
     → John didn't sing.

Note also that the implication only applies when the verb is have. Other light verbs, such as take in (11), change the implications.

(11) John took the opportunity to sing.
     → John sang.

For this reason, these nouns are treated differently than those with that complements. They are marked in the grammar as taking a complement in the same way that that complements are (section 5), but the mechanism which attaches an implication signature takes the governing verb into account.

2.4 Deverbal nouns

A large number of complement-taking nouns are related to verbs that take complements. These nouns are analyzed differently than non-deverbal nouns. They are linked to their related verb and classified according to how the arguments of the noun and the sentence relate to the arguments of the verb (e.g. -ee, -er).[5] The BRIDGE system uses this linking to map these nouns to their verbal counterparts and to draw conclusions of implicativity as if they were verbs, as explained in (Gurevich et al., 2006). Consider (12), where the paraphrases using fear as a verb or a noun are clearly related.

(12) a. The fear that Mary was ill paralyzed Bob.
     b. Bob feared that Mary was ill; this fear paralyzed Bob.

Deverbal nouns can take that complements or, as in (13), to complements. Most often, the context introduced by a deverbal noun does not add an implication signature, as in (12), which results in the answer UNKNOWN to the question Was Mary ill?.

(13) a. John's promise to go swimming surprised us.
     b. John's persuasion of Mary to sing at the party surprised us.

Gerunds, being even more verb-like, are treated as verbs in our system and hence inherit the implicative properties from the corresponding verb.

(14) Knowing that Mary had sung upset John.
     → Mary sang.

Gerunds and deverbal nouns are discussed in detail in (Gurevich et al., 2006) and are outside of the scope of this paper.

3 Finding complement-taking nouns

In order for the system to draw the inferences discussed above, the complement-taking nouns must first be identified and then classified and incorporated into the BRIDGE system (section 4). First, the gerunds are removed since these are mapped by the syntax into their verbal counterparts. Then the non-gerund deverbal nouns (section 2.4) are linked to their verbal counterpart so that they can be analyzed by the system as events. These two classes represent a significant number of the nouns that take that complements.

3.1 Syntactic classification

However, there are many complement-taking nouns that are not deverbal. To expand our lexicon of these nouns, we started with a seed set garnered from the Penn Treebank (Marcus et al., 1994), which uses distinctive tree structures for complement-taking nouns, and a small list of linguistically prominent nouns. For each of these lexical items, we extracted words in the same semantic class from WordNet. Classes include words like fact, which direct attention to the clausal complement, as in (15), and nouns expressing emotion, as in (16).

(15) It's a fact that Mary came.

(16) Bob's joy that Mary had returned reduced him to tears.

[4] The work described in this section was done by Lauri Karttunen and Karl Pichotta (Pichotta, 2008).
[5] NOMLEX (Macleod et al., 1998) is an excellent source of these deverbal nouns.
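The seed-expansion step can be sketched as follows. The actual system queried WordNet for words in the same semantic class as each seed; the toy HYPONYMS and SEED_CLASSES tables below merely stand in for WordNet, so the particular words and class names are illustrative only.

```python
# Sketch of seed expansion via semantic classes. The tables below are
# a hand-made stand-in for WordNet lookups, not real WordNet data.

HYPONYMS = {
    "emotion": ["joy", "fear", "anger", "warmheartedness"],
    "information": ["fact", "news", "rumor"],
}

SEED_CLASSES = {"joy": "emotion", "fact": "information"}

def expand_seeds(seeds):
    """Map each seed noun to its semantic class and gather every
    member of that class as a candidate complement-taking noun."""
    candidates = set()
    for seed in seeds:
        cls = SEED_CLASSES.get(seed)
        if cls:
            candidates.update(HYPONYMS[cls])
    return sorted(candidates)

print(expand_seeds(["joy", "fact"]))
```

Note that the output is only a candidate list: as example (17) below shows for warmheartedness, not every class member actually licenses a complement, so each candidate still needs vetting.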
These semantic classes provided a starting point for discovering more of these nouns: the class of emotion nouns, for example, has more than a hundred hyponyms.

Identifying the class is not enough, as not all members take clausal complements. Compare joy in (16) and warmheartedness in (17) from the emotion class. The sentence containing joy is much more natural than that in (17).

(17) #Bob's warmheartedness that Mary had returned reduced him to tears.

From the candidate list, the deverbal nouns are added to the lexicon of deverbal noun mappings. The remaining list is checked word-by-word. To ease the process, test sentences that take a range of meanings are created for each class of nouns, as in (18).

(18) Bob's ___ that Mary visited her mother reduced him to tears.

If the noun does not fit the test sentences, a web search is done on "X that" to extract potential complement-bearing sentences. These are checked to eliminate sentences with adjuncts, or where some other feature licenses the clause, such as in (19) where the bold faced structure is licensing the italicized clause.

(19) a. John is so warmhearted that he took her in without question.
     b. They had such a good friendship that she could tell him anything.

Using these methods, from a seed set of 13 nouns, 170 non-deverbal complement-taking nouns were identified, most in the emotion and feeling classes. The same techniques were then applied to the state and information classes. Once the Penn Treebank seeds were incorporated, the same process was applied to the complement-taking nouns from NOMLEX (Macleod et al., 1998).

3.2 Determining implications

As examples (8a) and (8b) showed, whether a word takes a complement is lexically determined; so is the type of implication signature introduced by the word. Compare the implications in (20).

(20) a. The fact that Mary had returned surprised John.
        → Mary had returned.
     b. The falsehood that Mary had returned surprised John.
        → Mary had not returned.
     c. The possibility that Mary had returned surprised John.
        → ? Mary had returned.

These nouns have different implication signatures: facts imply truth; lies imply falsehood; and possibilities do not allow truth or falsehood to be established. The default for complements is that no implications can be drawn, as in (20c), which in the BRIDGE system is expressed as the noun having no implication signature.[6]

Once identified and its implication signature determined, adding the complement-taking noun to the BRIDGE system and deriving the correct inferences is straightforward. This process is described in section 5.

4 The BRIDGE system

The BRIDGE system (Bobrow et al., 2007) includes a syntactic grammar, a semantics rule set (Crouch and King, 2006), an abstract knowledge representation (AKR) rule set, and an entailment and contradiction detection (ECD) system. The syntax, semantics, and AKR all depend on lexicons.

The BRIDGE grammar defines syntactic properties of words, such as predicate-argument structure, tense, number, and nominal specifiers. The grammar produces a packed representation of the sentence which allows ambiguity to be dealt with efficiently (Maxwell and Kaplan, 1991).

The parses are passed to the semantic rules which also work on packed structures (Crouch, 2005). The semantic layer looks up words in a Unified Lexicon (UL), connects surface arguments of verbs to their roles, and determines the context within which a word occurs in the sentence. Negation introduces a context, as do the complement-taking nouns discussed here (Bobrow et al., 2005).

The UL combines several sources of information (Crouch and King, 2005). Much of the information comes from the syntactic lexicon, VerbNet (Kipper et al., 2000), and WordNet (Fellbaum, 1998), but there are also handcoded entries that add semantically relevant information such as the implication signature. A sample UL entry is given in Figure 1.

The current number of complement-taking nouns in the system is shown in (21). Only a fifth of the nouns have implication signatures. However, all of the nouns introduce contexts; the default implication for contexts is to allow neither true nor false to be concluded, as in (20c).

(21) Complement-taking Nouns
     that complements: 411
     to complements: 173
     with implication signatures: 107

[6] A context is still generated for these. Adjuncts, having no context of their own, inherit the implication signature of the clause containing them (section 2.2).
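The kind of information a UL entry carries can be sketched as a simple record. The field names below mirror the entry for fact shown in Figure 1; the Python representation and the lookup function are our own illustration, not the system's actual format.

```python
# Illustrative record for a UL-style lexical entry. Field names mirror
# the `fact` entry in Figure 1; the dict layout is ours, not BRIDGE's.

UL = {
    ("fact", "N"): {
        "subcat": "NOUN-EXTRA",        # licenses "it is a fact that ..."
        "lex_class": "impl_pp_nn",     # the ++/-- implication signature
        "sources": ["hand annotated data", "xle"],
    },
    ("possibility", "N"): {
        "subcat": "NOUN-EXTRA",        # takes a complement ...
        "lex_class": None,             # ... but carries no signature
        "sources": ["hand annotated data"],
    },
}

def implication_signature(word, cat="N"):
    """Return the implication signature for a word, or None when the
    entry has none (the default: no implications can be drawn)."""
    entry = UL.get((word, cat))
    return entry.get("lex_class") if entry else None

print(implication_signature("fact"))         # prints impl_pp_nn
print(implication_signature("possibility"))  # prints None
```

The point of the sketch is only the division of labor: the subcat field licenses the complement syntactically, while the optional lex_class field supplies the semantics with a signature.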
(cat(N), word(fact), subcat(NOUN-EXTRA),
concept(%1),
source(hand annotated data), source(xle),
xfr:concept_for(%1,fact),
xfr:lex_class(%1,impl_pp_nn),
xfr:wordnet_classes(%1,[])).

Figure 1: One entry for the word fact in the Unified Lexicon. NOUN-EXTRA states that this use of fact fits in structures such as it is a fact that. The WordNet meaning is found by looking up the concept for fact in the WordNet database. The implication signature of the word is impl_pp_nn, or ++/--, as seen in (22). Lastly, the sources for this information are noted.

The output of the semantics level is fed into the AKR. At this level, contexts are used to determine (un)instantiability based on the relationship between contexts.[7] An entity's (un)instantiability encodes whether it exists in some context. In (8a), for example, we can conclude that the speaker believes that Mary exists, but that the event Mary won is uninstantiated: the speaker believes it did not happen.

The final layer is the ECD, which uses the structures built by the AKR to reason about a given passage-query pair to determine whether or not the query is inferred by the passage, answering with YES, NO, UNKNOWN, or AMBIGUOUS. For more details, see (Bobrow et al., 2005).

5 Adding complement-taking nouns to the system

Adding complement-taking nouns to the BRIDGE system is straightforward. A syntactic entry is added indicating that the noun takes a complement. The syntactic classes are defined by templates, and the relevant template is called in the lexical entry for that word. For example, the template call @(NOUN-EXTRA %stem) is added to the entry for fact.

If there is an implication signature for the complement, this is added to the noun's entry in the file for hand-annotated data used to build the UL. The fifth line in Figure 1 is an example. The AKR and ECD rules that calculate the context and implications on verbs and deverbal nouns generalize to handle implications on complement-taking nouns and so do not need to be altered as new complement-taking nouns are found.

As described in section 3, deciding which nouns take complements is currently hand curated, as it is quite difficult to distinguish them entirely automatically.

6 Testing

To ensure that complement-taking nouns are working properly in the system, for each noun a passage-query-correct answer triplet such as (22) is added to a testsuite.

(22) PASSAGE: The fact that Mary had returned surprised John.
     QUERY: Had Mary returned?
     ANSWER: YES

The testsuites are run and the results reported as part of the daily regression testing (Chatzichrisafis et al., 2007). Both naturally occurring and hand-crafted examples are used to ensure that the correct implications are being drawn. Natural examples test interactions between phenomena such as noun complementation and copular constructions, while hand-crafted examples allow isolation of the phenomenon and show that all cases are being tested (Cohen et al., 2008), e.g., that the correct entailments emerge under negation as well as in the positive case.

Our current testsuites contain about 180 hand-crafted examples. The number of natural examples is harder to count as they occur somewhat rarely in the mixed-phenomena testsuites. One of our natural example files, which is based on newswire extracts from the PASCAL Recognizing Textual Entailment Challenge (Dagan et al., 2005), shows an approximate breakdown of the uses of the word that as shown in (23). This sample, which is somewhat biased towards verbal complements since it contains many examples that can be paraphrased as said that, nonetheless shows the relative scarcity of noun complements in the wild and underscores the importance of hand-crafted examples for testing purposes.

[7] See (Bobrow et al., 2007; Bobrow et al., 2005) for other information contained in the AKR.
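The passage-query-answer regression format in (22) can be sketched as a small harness. The ask_system callable here is a stand-in for the full BRIDGE pipeline, which we obviously cannot reproduce; the harness only illustrates how such triplets would be checked in a daily regression run.

```python
# Sketch of the regression format in (22): passage-query-answer
# triplets checked against a system's answers. `ask_system` stands in
# for the full pipeline; any callable taking a passage and a query
# and returning YES / NO / UNKNOWN / AMBIGUOUS will do.

TESTSUITE = [
    ("The fact that Mary had returned surprised John.",
     "Had Mary returned?", "YES"),
    ("The lie that Mary had won surprised John.",
     "Did Mary win?", "NO"),
]

def run_testsuite(testsuite, ask_system):
    """Return the triplets whose system answer differs from the
    expected one, for inclusion in a regression report."""
    failures = []
    for passage, query, expected in testsuite:
        got = ask_system(passage, query)
        if got != expected:
            failures.append((passage, query, expected, got))
    return failures

# A baseline that always answers UNKNOWN (the default when no
# implication can be drawn) fails both triplets above:
print(len(run_testsuite(TESTSUITE, lambda p, q: "UNKNOWN")))  # prints 2
```

Pairing each positive triplet with its negated counterpart, as the text recommends, would simply mean adding more rows to the same table.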
It is clear that these noun complements were being analyzed incorrectly before; what is unclear is how much of an impact the misanalysis would have caused. Perhaps some other domain would demonstrate a significantly higher presence of non-deverbal nouns that take complements and would be more significantly impacted by their misanalysis.

(23) Uses of the word that in RTE 2007
     verbal complements: 68
     adjuncts: 50
     deverbal complements: 14
     noun complements: 3
     other[8]: 19

7 Future work

The detection and incorporation of noun complements for use in the BRIDGE system can be expanded in several directions, such as automating the search process, identifying and classifying other parts of speech that take complements, and exploring transferability to other languages.

7.1 Automating the search

Testing whether a clause is an adjunct or a noun complement or is licensed by something else is currently done by hand. Automating the testing would allow many more nouns to be tested. However, this is non-trivial. As (8a) and (8b) demonstrated, the surface structure can appear very similar; it is only when we try to figure out the implications of the examples that the differences emerge.

The Penn Treebank (Marcus et al., 1994) was initially used to extract complement-taking nouns. As more tree and dependency banks, as well as lexical resources (Macleod et al., 1998), are available, further lexical items can be extracted in this way. However, such resources are costly to build and so are only slowly added to the available NLP resources.

Rather than trying to identify all potential noun complement clauses, a simpler approach would be to reduce the search space for the human judge. For example, some adjuncts (perhaps three quarters of them) could be eliminated from natural examples by using a part-of-speech tagger to identify occurrences where a conjugated verb immediately follows the word that, as in (24). These commonly identify adjuncts.

(24) The shark that bit the swimmer appears to have left.

By eliminating these adjuncts and by removing those sentences where it is known that the clause is a complement of the verb based on the syntactic classification of that verb (the syntactic lexicon contains 2500 verbs with various clausal complements), as in (25), the search space could be significantly reduced.

(25) The judge announced that the defendant was guilty.

7.2 Other parts of speech that introduce contexts

Verbs, adjectives, and adverbs can also license complements and hence contexts with implication signatures. Examples in (26) show different parts of speech that introduce contexts.[9]

(26) a. Verb: John said that Paul had arrived.
     b. Adjective: It is possible that someone ate the last piece of cake.
     c. Adjective: John was available to see Mary.
     d. Adverb: John falsely reported that Mary saw Bill.

Many classes of verbs have already been identified and are incorporated into the system (Nairn et al., 2006): verbs relating to speech (e.g., say, report, etc.), implicative verbs such as manage and fail (Karttunen, 2007), and factive verbs (e.g. agree, realize, consider) (Vendler, 1967; Kiparsky and Kiparsky, 1971), to name a few. Many adjectives have also been added to the system, including ones taking to and that complements.[10] As with the complement-taking nouns, a significant part of the effort in incorporating the complement-taking adjectives into the system was identifying which adjectives license complements. The adverbs have not been explored in as much detail.

[8] This includes demonstrative uses, uses licensed by other parts of speech such as so, and clauses which are the subject of a sentence or the object of a prepositional phrase.
[9] From a syntactic perspective, the adverb falsely does not take a complement. However, it does introduce a context in the semantics and hence requires a lexical entry similar to those discussed for the complement-taking nouns.
[10] This work was largely done by Hannah Copperman during her internship at PARC.
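The proposed search-space reduction (dropping candidates where a conjugated verb immediately follows that, as in (24)) can be sketched as below. A real implementation would use a part-of-speech tagger over the full sentence; the regex over a small hand-listed verb set is illustrative only.

```python
import re

# Sketch of the adjunct filter proposed in section 7.1: flag sentences
# where a finite verb immediately follows "that", a typical relative
# clause configuration. The toy FINITE_VERBS set stands in for what a
# POS tagger would decide.

FINITE_VERBS = {"bit", "sang", "won", "announced"}

def looks_like_relative_clause(sentence):
    """True when the word right after 'that' is a (toy-listed)
    finite verb, suggesting a relative clause rather than a
    noun complement."""
    m = re.search(r"\bthat (\w+)", sentence.lower())
    return bool(m) and m.group(1) in FINITE_VERBS

print(looks_like_relative_clause(
    "The shark that bit the swimmer appears to have left."))  # prints True
print(looks_like_relative_clause(
    "The fact that Mary had returned surprised John."))       # prints False
```

The complementary filter mentioned in the text, removing sentences whose clause is licensed by one of the roughly 2500 clause-taking verbs in the syntactic lexicon, would be a second pass over the survivors.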
7.3 Other languages

The fact that it has been productive to search for complement-taking nouns through synonyms and WordNet classes suggests that other languages could benefit from the work done in English. It would be interesting to see to what extent the implicative signatures from one language carry over into another, and to what extent they differ. Strong similarities could, for example, suggest some common mechanism at work in these nouns that we have been unable to identify by studying only one language. Searching in other languages could also potentially turn up classes or candidates that were missed in English.[11]

[11] Thanks to Martin Forst (p.c.) for suggesting this direction.

8 Conclusions

It is important to identify complement-taking nouns in order to properly analyze the grammatical and implicative structure of the sentence. Here we described a bootstrapping approach whereby annotated corpora and existing lexical resources were used to identify complement-taking nouns. WordNet was used to find semantically similar nouns. These were then tested in closed examples and in Web searches in order to determine whether they licensed complements and what the implicative signature of the complement was. Although identifying the complete set of these nouns is non-trivial, the context mechanism for dealing with implicatives makes adding them to the BRIDGE system to derive the correct implications straightforward.

9 Appendix: Complement-taking nouns

This appendix contains sample complement-taking nouns and their classification in the BRIDGE system.

9.1 Nouns that take to clauses

Ability nouns (impl_nn with verb have): ability, choice, energy, flexibility, freedom, heart, means, way, wherewithal

Asset nouns (impl_nn with verb have): money, option, time

Bravery nouns (impl_nn with verb have): audacity, ball, cajones, cheek, chutzpah, cojones, courage, decency, foresight, gall, gumption, gut, impudence, nerve, strength, temerity

Chance nouns (impl_nn with verb have): chance, occasion, opportunity

Effort nouns (impl_nn with verb have): initiative, liberty, trouble

Other nouns (no implicativity or not yet classified): accord, action, agreement, aim, ambition, appetite, application, appointment, approval, attempt, attitude, audition, authority, authorization, battle, bid, blessing, campaign, capacity, clearance, commission, commitment, concession, confidence, consent, consideration, conspiracy, contract, cost, decision, demand, desire, determination, directive, drive, duty, eagerness, effort, evidence, expectation, failure, fear, fight, figure, franchise, help, honor, hunger, hurry, idea, impertinence, inability, incentive, inclination, indication, information, intent, intention, invitation, itch, job, journey, justification, keenness, legislation, license, luck, mandate, moment, motion, motive, move, movement, need, note, notice, notification, notion, obligation, offer, order, pact, pattern, permission, plan, pledge, ploy, police, position, potential, power, pressure, principle, process, program, promise, propensity, proposal, proposition, provision, push, readiness, reason, recommendation, refusal, reluctance, reminder, removal, request, requirement, responsibility, right, rush, scheme, scramble, sense, sentiment, shame, sign, signal, stake, stampede, strategy, study, support, task, temptation, tendency, threat, understanding, undertaking, unwillingness, urge, venture, vote, willingness, wish, word, work

9.2 Nouns that take that clauses

Nouns with impl_pp_nn: abomination, angriness, angst, animosity, anxiousness, apprehensiveness, ardor, awe, bereavement, bitterness, case, choler, consequence, consternation, covetousness, disconcertion, disconcertment, disquiet, disquietude, ecstasy, edginess, enmity, enviousness, event, fact, fearfulness, felicity, fright, frustration, fury, gall, gloom, gloominess, grudge, happiness, hesitancy, hostility, huffiness, huffishness, inquietude, insecurity, ire, jealousy, jitteriness, joy, joyousness, jubilance, jumpiness, lovingness, poignance, poignancy, premonition, presentiment, problem, qualm, rancor, rapture, sadness, shyness, situation, somberness, sorrow, sorrowfulness, suspense, terror, trepidation, truth, uneasiness, unhappiness, wrath

Nouns with fact_p: absurdity, accident, hypocrisy, idiocy, irony, miracle

Nouns with impl_pn_np: falsehood, lie

Other nouns (no implicativity or not yet classified): avowal, axiom, conjecture, conviction, critique, effort, fear, feeling, hunch, hysteria, idea, impudence, inability, incentive, likelihood, news, notion, opinion, optimism, option, outrage, pact, ploy, point, police, possibility, potential, power, precedent, premise, principle, problem, prospect, proviso, reluctance, responsibility, right, rumor, scramble, sentiment, showing, sign, skepticism, stake, stand, story, strategy, tendency, unwillingness, viewpoint, vision, willingness, word

References

Bobrow, Daniel G., Cleo Condoravdi, Richard Crouch, Ron Kaplan, Lauri Karttunen, Tracy Holloway King, Valeria de Paiva, and Annie Zaenen. 2005. A basic logic for textual inference. In Proceedings of the AAAI Workshop on Inference for Textual Question Answering.

Bobrow, Daniel G., Bob Cheslow, Cleo Condoravdi, Lauri Karttunen, Tracy Holloway King, Rowan Nairn, Valeria de Paiva, Charlotte Price, and Annie Zaenen. 2007. PARC's Bridge and question answering system. In Grammar Engineering Across Frameworks, pages 46-66. CSLI Publications.

Chatzichrisafis, Nikos, Dick Crouch, Tracy Holloway King, Rowan Nairn, Manny Rayner, and Marianne Santaholma. 2007. Regression testing for grammar-based systems. In Grammar Engineering Across Frameworks, pages 128-143. CSLI Publications.

Cohen, K. Bretonnel, William A. Baumgartner Jr., and Lawrence Hunter. 2008. Software testing and the naturally occurring data assumption in natural language processing. In Software Engineering, Testing, and Quality Assurance for Natural Language Processing, pages 23-30. Association for Computational Linguistics.

Crouch, Dick and Tracy Holloway King. 2005. Unifying lexical resources. In Proceedings of the Interdisciplinary Workshop on the Identification and Representation of Verb Features and Verb Classes.

Crouch, Dick and Tracy Holloway King. 2006. Semantics via f-structure rewriting. In LFG06 Proceedings. CSLI Publications.

Crouch, Dick. 2005. Packed rewriting for mapping semantics to KR. In Proceedings of the International Workshop on Computational Semantics.

Dagan, Ido, Oren Glickman, and Bernardo Magnini. 2005. The PASCAL recognizing textual entailment challenge. In Proceedings of the PASCAL Challenges Workshop on Recognizing Textual Entailment, Southampton, U.K.

Fellbaum, Christiane, editor. 1998. WordNet: An Electronic Lexical Database. The MIT Press.

Gurevich, Olga, Richard Crouch, Tracy Holloway King, and Valeria de Paiva. 2006. Deverbal nouns in knowledge representation. In Proceedings of the 19th International Florida AI Research Society Conference (FLAIRS '06), pages 670-675.

Karttunen, Lauri. 2007. Word play. Computational Linguistics, 33:443-467.

Kiparsky, Paul and Carol Kiparsky. 1971. Fact. In Steinberg, D. and L. Jakobovits, editors, Semantics: An Interdisciplinary Reader, pages 345-369. Cambridge University Press.

Kipper, Karin, Hoa Trang Dang, and Martha Palmer. 2000. Class-based construction of a verb lexicon. In AAAI-2000 17th National Conference on Artificial Intelligence.

Macleod, Catherine, Ralph Grishman, Adam Meyers, Leslie Barrett, and Ruth Reeves. 1998. NOMLEX: A lexicon of nominalizations. In EURALEX'98.

Marcus, Mitchell, Grace Kim, Mary Ann Marcinkiewicz, Robert MacIntyre, Ann Bies, Mark Ferguson, Karen Katz, and Britta Schasberger. 1994. The Penn treebank: Annotating predicate argument structure. In ARPA Human Language Technology Workshop.

Maxwell, John and Ron Kaplan. 1991. A method for disjunctive constraint satisfaction. In Current Issues in Parsing Technologies.

Nairn, Rowan, Cleo Condoravdi, and Lauri Karttunen. 2006. Computing relative polarity for textual inference. In Inference in Computational Semantics (ICoS-5).

Pichotta, Karl. 2008. Processing paraphrases and phrasal implicatives in the Bridge question-answering system. Stanford University, Symbolic Systems undergraduate honors thesis.

Vendler, Zeno. 1967. Linguistics and Philosophy. Cornell University Press.