NLP introduced in 47 slides: Lecture 1.ppt
1. CMSC 723 / LING 645: Intro to
Computational Linguistics
September 1, 2004: Dorr
Overview, History, Goals, Problems,
Techniques; Intro to MT (J&M 1, 21)
Prof. Bonnie J. Dorr
Dr. Christof Monz
TA: Adam Lee
3. Other Important Stuff
• This course is interdisciplinary: it cuts across different areas of expertise. Expect that a subset of the class will be learning new material at any time, while others will have to be patient! (The subsets will swap frequently!)
• Project 1 and Project 2 are designed differently. Be prepared for this distinction!
– P1 will focus on the fundamentals, getting your feet wet with software. By the end, you should feel comfortable using and testing certain types of NLP software.
– P2 will require a significantly deeper level of understanding, critique, and analysis. You'll be expected to think deeply and write a lot in the second project. What you write will be a major portion of the grade!
• No solutions will be handed out. Written comments will be sent to you by the TA.
• All email correspondence MUST HAVE "CMSC 723" in the Subject line!
• Submission format for assignments and projects: plain ASCII or PDF.
• Assignment 1 will be posted next week.
4. CL vs. NLP
Why "Computational Linguistics (CL)" rather than "Natural Language Processing (NLP)"?
• Computational Linguistics
– Computers dealing with language
– Modeling what people do
• Natural Language Processing
– Applications on the computer side
5. Relation of CL to Other Disciplines
CL draws on and connects to:
• Artificial Intelligence (AI) (notions of representation, search, etc.)
• Machine Learning (particularly probabilistic or statistical ML techniques)
• Linguistics (syntax, semantics, etc.)
• Psychology
• Electrical Engineering (EE) (optical character recognition)
• Philosophy of Language, Formal Logic
• Information Retrieval
• Theory of Computation
• Human-Computer Interaction (HCI)
6. A Sampling of "Other Disciplines"
• Linguistics: formal grammars, abstract characterization of what is to be learned.
• Computer Science: algorithms for efficient learning or online deployment of these systems in automata.
• Engineering: stochastic techniques for characterizing regular patterns for learning and ambiguity resolution.
• Psychology: insights into what linguistic constructions are easy or difficult for people to learn or to use.
7. History: 1940s-1950s
• Development of formal language theory (Chomsky, Kleene, Backus).
– Formal characterization of classes of grammars (context-free, regular)
– Association with the relevant automata
• Probability theory: language understanding as decoding through a noisy channel (Shannon)
– Use of information-theoretic concepts like entropy to measure the success of language models.
8. 1957-1983: Symbolic vs. Stochastic
• Symbolic
– Use of formal grammars as the basis for natural language processing and learning systems (Chomsky, Harris)
– Use of logic and logic-based programming for characterizing syntactic or semantic inference (Kaplan, Kay, Pereira)
– First toy natural language understanding and generation systems (Woods, Minsky, Schank, Winograd, Colmerauer)
– Discourse processing: role of intention, focus (Grosz, Sidner, Hobbs)
• Stochastic Modeling
– Probabilistic methods for early speech recognition, OCR (Bledsoe and Browning, Jelinek, Black, Mercer)
9. 1983-1993: Return of Empiricism
• Use of stochastic techniques for part-of-speech tagging, parsing, word sense disambiguation, etc.
• Comparison of stochastic, symbolic, and more or less powerful models on language understanding and learning tasks.
10. 1993-Present
• Advances in software and hardware create NLP needs for information retrieval (web), machine translation, spelling and grammar checking, and speech recognition and synthesis.
• Stochastic and symbolic methods combine for real-world applications.
11. Language and Intelligence: Turing Test
• Turing test: machine, human, and human judge
• The judge asks questions of the computer and the human.
– The machine's job is to act like a human; the human's job is to convince the judge that he's not the machine.
– The machine is judged "intelligent" if it can fool the judge.
• Judgment of "intelligence" is linked to appropriate answers to questions from the system.
13. What's Involved in an "Intelligent" Answer?
Analysis: decomposition of the signal (spoken or written) eventually into meaningful units. This involves …
14. Speech/Character Recognition
• Decomposition into words, segmentation of words into appropriate phones or letters
• Requires knowledge of phonological patterns:
– "I'm enormously proud."
– "I mean to make you proud."
16. Syntactic Analysis
• Associate constituent structure with a string
• Prepare for semantic interpretation
Constituent tree for "I watched the terrapin":
[S [NP I] [VP [V watched] [NP [Det the] [N terrapin]]]]
OR a predicate-centered analysis:
watch
– Subject: I
– Object: terrapin (Det: the)
17. Semantics
• A way of representing meaning
• Abstracts away from syntactic structure
• Example:
– First-order logic: watch(I, terrapin)
– Can be "I watched the terrapin" or "The terrapin was watched by me"
• Real language is complex:
– Who did I watch?
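A tiny sketch (invented for illustration, not the lecture's system) of how an active and a passive analysis can map to the same logical form watch(I, terrapin); the dict-based "parses" are stand-ins for real syntactic output:

```python
# Map active and passive analyses to one first-order-logic-style form.
# The dict "parses" below are invented stand-ins for real syntax trees.
def logical_form(parse):
    if parse["voice"] == "active":
        agent, patient = parse["subject"], parse["object"]
    else:  # passive: the surface subject is the patient
        agent, patient = parse["by_phrase"], parse["subject"]
    return f'{parse["verb"]}({agent}, {patient})'

active = {"voice": "active", "verb": "watch",
          "subject": "I", "object": "terrapin"}
passive = {"voice": "passive", "verb": "watch",
           "subject": "terrapin", "by_phrase": "I"}

print(logical_form(active))   # watch(I, terrapin)
print(logical_form(passive))  # watch(I, terrapin)
```

Both surface forms normalize to one representation, which is the point of abstracting away from syntactic structure.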
18. Lexical Semantics
The Terrapin is who I watched.
Watch the Terrapin is what I do best.
*Terrapin is what I watched the. (ungrammatical)
I = experiencer
watch the Terrapin = predicate
the Terrapin = patient
19. Compositional Semantics
• Association of parts of a proposition with semantic roles
• Scoping
Proposition structure for "I saw the Terrapin":
– Experiencer: "I" (1st person, singular)
– Predicate: "saw" — Be (perc)
– Patient: "the Terrapin"
20. Word-Governed Semantics
• Any verb can add "-able" to form an adjective.
– I taught the class. → The class is teachable.
– I rejected the idea. → The idea is rejectable.
• Association of particular words with specific semantic forms.
– John (masculine)
– The boys (masculine, plural, human)
21. Pragmatics
• Real-world knowledge, speaker intention, goal of utterance.
• Related to sociology.
• Example 1:
– Could you turn in your assignments now? (command)
– Could you finish the homework? (question, command)
• Example 2:
– I couldn't decide how to catch the crook. Then I decided to spy on the crook with binoculars.
– To my surprise, I found out he had them too. Then I knew to just follow the crook with binoculars.
[the crook [with binoculars]]
[the crook] [with binoculars]
22. Discourse Analysis
• Discourse: how propositions fit together in a conversation (multi-sentence processing).
– Pronoun reference: "The professor told the student to finish the assignment. He was pretty aggravated at how long it was taking to pass it in."
– Multiple references to the same entity: "George W. Bush," "president of the U.S."
– Relation between sentences: "John hit the man. He had stolen his bicycle."
25. Ambiguity
"I made her duck" can mean:
– I made duckling for her.
– I made the duckling belonging to her.
– I created the duck she owns.
– I forced her to lower her head.
– By magic, I changed her into a duck.
26. Syntactic Disambiguation
• Structural ambiguity of "I made her duck":
[S [NP I] [VP [V made] [NP her] [VP [V duck]]]]
[S [NP I] [VP [V made] [NP [Det her] [N duck]]]]
27. Part-of-Speech Tagging and Word Sense Disambiguation
• [verb Duck]! vs. [noun Duck] is delicious for dinner.
• I went to the bank to deposit my check.
I went to the bank to look out at the river.
I went to the bank of windows and chose the one dealing with last names beginning with "d".
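A toy illustration (not from the slides) of resolving the "bank" ambiguity by context-word overlap, in the spirit of the Lesk algorithm; the mini sense inventory is invented for the example:

```python
# Toy word-sense disambiguation by context overlap (Lesk-style).
# The sense "signatures" below are invented for illustration.
SENSES = {
    "bank": {
        "financial": {"deposit", "check", "money", "teller"},
        "river": {"river", "water", "shore", "look"},
    }
}

def disambiguate(word, context_words):
    """Pick the sense whose signature overlaps the context most."""
    context = set(context_words)
    scores = {sense: len(signature & context)
              for sense, signature in SENSES[word].items()}
    return max(scores, key=scores.get)

print(disambiguate("bank", ["i", "went", "to", "deposit", "my", "check"]))
# financial
print(disambiguate("bank", ["look", "out", "at", "the", "river"]))
# river
```

Real systems use far richer evidence (glosses, corpora, statistics), but the overlap idea is the same.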
28. Resources for NLP Systems
• Dictionary
• Morphology and Spelling Rules
• Grammar Rules
• Semantic Interpretation Rules
• Discourse Interpretation
Natural language processing involves (1) learning or fashioning the rules for each component, (2) embedding the rules in the relevant automaton, and (3) using the automaton to efficiently process the input.
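The three steps above can be sketched as a minimal staged pipeline; every rule and component name here is a toy placeholder, not part of any system mentioned in the lecture:

```python
# Minimal rule-based pipeline sketch: each stage embeds its own "rules"
# and processes the previous stage's output. All rules are toy placeholders.
def tokenize(text):
    # Stand-in for morphology/spelling rules: lowercase, strip final period.
    return text.lower().rstrip(".").split()

def tag(tokens):
    # Stand-in for a dictionary component: tiny invented POS lexicon.
    lexicon = {"i": "PRON", "watched": "VERB",
               "the": "DET", "terrapin": "NOUN"}
    return [(t, lexicon.get(t, "UNK")) for t in tokens]

PIPELINE = [tokenize, tag]   # step (2): rules embedded in components

def process(text):
    data = text
    for stage in PIPELINE:   # step (3): run each component on the input
        data = stage(data)
    return data

print(process("I watched the terrapin."))
# [('i', 'PRON'), ('watched', 'VERB'), ('the', 'DET'), ('terrapin', 'NOUN')]
```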
29. Some NLP Applications
• Machine Translation: Babelfish (AltaVista), http://babelfish.altavista.com/translate.dyn
• Question Answering: Ask Jeeves, http://www.ask.com/
• Language Summarization: MEAD (U. Michigan), http://www.summarization.com/mead
• Spoken Language Recognition: EduSpeak (SRI), http://www.eduspeak.com/
• Automatic Essay Evaluation: E-Rater (ETS), http://www.ets.org/research/erater.html
• Information Retrieval and Extraction: NetOwl (SRA), http://www.netowl.com/extractor_summary.html
30. What is MT?
• Definition: translation from one natural language to another by means of a computerized system
• Early failures
• Later: varying degrees of success
31. An Old Example
"The spirit is willing but the flesh is weak" →
"The vodka is good but the meat is rotten"
32. Machine Translation History
• 1950s: Intensive research activity in MT
• 1960s: Direct word-for-word replacement
• 1966 (ALPAC): NRC report on MT
– Conclusion: MT no longer worthy of serious scientific investigation.
• 1966-1975: "Recovery period"
• 1975-1985: Resurgence (Europe, Japan)
• 1985-present: Resurgence (US)
http://ourworld.compuserve.com/homepages/WJHutchins/MTS-93.htm
33. What Happened Between ALPAC and Now?
• Need for MT and other NLP applications confirmed
• Change in expectations
• Computers have become faster and more powerful
• WWW
• Political state of the world
• Maturation of linguistics
• Development of hybrid statistical/symbolic approaches
34. Three MT Approaches: Direct, Transfer, Interlingual
Source text and target text are each represented at three levels (word structure, syntactic structure, semantic structure), with the interlingua above them all. The approaches differ in where they cross from source to target:
• Direct: word structure to word structure (morphological analysis on the source side, morphological generation on the target side)
• Syntactic transfer: syntactic structure to syntactic structure (via syntactic analysis and syntactic generation)
• Semantic transfer: semantic structure to semantic structure (via semantic analysis and semantic generation)
• Interlingual: semantic composition into the interlingua on the source side, semantic decomposition from it on the target side
36. MT Systems: 1964-1990
• Direct: GAT [Georgetown, 1964], TAUM-METEO [Colmerauer et al., 1971]
• Transfer: GETA/ARIANE [Boitet, 1978], LMT [McCord, 1989], METAL [Thurmair, 1990], MiMo [Arnold & Sadler, 1990], …
• Interlingual: MOPTRANS [Schank, 1974], KBMT [Nirenburg et al., 1992], UNITRAN [Dorr, 1990]
37. Statistical MT and Hybrid Symbolic/Statistical MT: 1990-Present
• Candide [Brown, 1990, 1992]
• Halo/Nitrogen [Langkilde and Knight, 1998], [Yamada and Knight, 2002]
• GHMT [Dorr and Habash, 2002]
• DUSTer [Dorr et al., 2002]
38. Direct MT: Pros and Cons
• Pros
– Fast
– Simple
– Inexpensive
– No translation rules hidden in the lexicon
• Cons
– Unreliable
– Not powerful
– Rule proliferation
– Requires too much context
– Major restructuring after lexical substitution
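A toy sketch (invented, not any system from the lecture) of direct word-for-word MT; the tiny Spanish-English dictionary is made up, and the second output shows why pure lexical substitution leaves restructuring undone:

```python
# Toy direct ("word-for-word") MT: substitute each source word via a
# bilingual dictionary, with no reordering. Dictionary entries invented.
TRALEX = {
    "la": "the", "casa": "house", "blanca": "white",
    "mary": "Mary", "vio": "saw", "a": "", "john": "John",
}

def direct_translate(sentence):
    words = [TRALEX.get(w.lower(), w) for w in sentence.split()]
    return " ".join(w for w in words if w)   # drop empty translations

print(direct_translate("Mary vio a John"))
# Mary saw John
print(direct_translate("la casa blanca"))
# the house white   <- adjective order is wrong: restructuring needed
```

The first sentence happens to come out right; the second keeps Spanish noun-adjective order, which is exactly the "major restructuring after lexical substitution" problem.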
39. Transfer MT: Pros and Cons
• Pros
– Don't need to find a language-neutral representation
– Relatively fast
• Cons
– N² sets of transfer rules: difficult to extend
– Proliferation of language-specific rules in lexicon and syntax
– Cross-language generalizations lost
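The N² point can be made concrete with a quick count: among N languages there are N(N-1) ordered language pairs, each needing its own transfer rule set, while an interlingual design needs only one analyzer and one generator per language:

```python
# Compare component counts for transfer vs. interlingual MT over N languages.
def transfer_rule_sets(n):
    # One rule set per ordered language pair (source, target).
    return n * (n - 1)

def interlingual_modules(n):
    # One analyzer plus one generator per language.
    return 2 * n

for n in (3, 10, 20):
    print(n, transfer_rule_sets(n), interlingual_modules(n))
# 3 6 6
# 10 90 20
# 20 380 40
```

The gap widens quadratically, which is why the next slide lists "avoids the N² problem" as an interlingual pro.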
40. Interlingual MT: Pros and Cons
• Pros
– Portable (avoids the N² problem)
– Lexical rules and structural transformations stated more simply on a normalized representation
– Explanatory adequacy
• Cons
– Difficult to deal with terms on a primitive level: universals?
– Must decompose and reassemble concepts
– Useful information lost (paraphrase)
41. Approximate IL Approach
• Tap into the richness of TL (target-language) resources
• Use some, but not all, components of the IL representation
• Generate multiple sentences that are statistically pared down
43. Interlingual vs. Approximate IL
• Interlingual MT:
– primitives & relations
– bidirectional lexicons
– analysis: compose IL
– generation: decompose IL
• Approximate IL:
– hybrid symbolic/statistical design
– overgeneration with statistical ranking
– uses dependency-representation input and structural expansion for "deeper" overgeneration
44. Mapping from Input Dependency to English Dependency Tree
Knowledge resources in English only (LVD; Dorr, 2001).
"Mary le dio patadas a John" → "Mary kicked John"
Input dependency: GIVE-V [CAUSE GO], with Agent MARY, Theme KICK-N, Goal JOHN
English dependency: KICK-V [CAUSE GO], with Agent MARY, Goal JOHN
45. Statistical Extraction
Overgenerated candidates, ranked by statistical (language-model) log score:
Mary kicked John . [-0.670270]
Mary gave a kick at John . [-2.175831]
Mary gave the kick at John . [-3.969686]
Mary gave an kick at John . [-4.489933]
Mary gave a kick by John . [-4.803054]
Mary gave a kick to John . [-5.045810]
Mary gave a kick into John . [-5.810673]
Mary gave a kick through John . [-5.836419]
Mary gave a foot wound by John . [-6.041891]
Mary gave John a foot wound . [-6.212851]
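The extraction step can be sketched as simply picking the candidate with the highest (least negative) log score; the strings and scores below are a subset of the ones listed above:

```python
# Pick the best overgenerated candidate by language-model log score.
# Candidate strings and scores are taken from the slide's list.
candidates = [
    ("Mary kicked John .", -0.670270),
    ("Mary gave a kick at John .", -2.175831),
    ("Mary gave a kick to John .", -5.045810),
    ("Mary gave John a foot wound .", -6.212851),
]

best_sentence, best_score = max(candidates, key=lambda pair: pair[1])
print(best_sentence)
# Mary kicked John .
```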
46. Benefits of Approximate IL Approach
• Explaining behaviors that appear to be statistical in nature
• "Re-sourceability": reuse of already-existing components for MT from new languages
• Application to monolingual alternations
47. What Resources Are Required?
• Deep TL resources
• Requires an SL (source-language) parser and a tralex (translation lexicon)
• TL resources are richer: LVD representations, CatVar database
• Constrained overgeneration