Applied Artificial Intelligence
Unit – 5
Topics to Cover…!
• Advanced Knowledge Representation Techniques: Conceptual
dependency theory, script structure, CYC theory, case grammars,
semantic web.
• Natural Language Processing: Sentence Analysis phases, grammars
and parsers, types of parsers, semantic analysis, universal networking
language, dictionary
Script Structures
• A script is a structured representation
describing a stereotyped sequence of events in
a particular context.
• Scripts are used in natural language
understanding systems to organize a knowledge
base in terms of the situations that the system
should understand. Scripts use a frame-like
structure to represent the commonly occurring
experience like going to the movies eating in a
restaurant, shopping in a supermarket, or
visiting an ophthalmologist.
• Thus, a script is a structure that prescribes a set
of circumstances that could be expected to
follow on from one another.
• Scripts are beneficial because:
• Events tend to occur in known runs or patterns.
• A causal relationship between events exists.
• An entry condition exists which allows an event to take place.
• Prerequisites exist for events taking place.
Components of a script:
• The components of a script include:
• Entry conditions: These are basic conditions which must be fulfilled before events in the script can occur.
• Results: Conditions that will be true after the events in the script have occurred.
• Props: Slots representing objects involved in the events.
• Roles: These are the actions that the individual participants perform.
• Track: Variations on the script. Different tracks may share components of the same script.
• Scenes: The sequence of events that occur.
For describing a script, special action symbols are used. These are the primitive acts of conceptual dependency theory, such as ATRANS (transfer of an abstract relationship, e.g. possession), PTRANS (transfer of the physical location of an object), MTRANS (transfer of mental information), MBUILD (construction of new information), ATTEND (focusing a sense organ towards a stimulus), SPEAK, GRASP, PROPEL, MOVE, INGEST and EXPEL.
Example: Script for going to the bank to withdraw money.
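The script itself appears as a figure in the original slides. As an illustrative stand-in, the following minimal Python sketch shows how such a bank-withdrawal script could be encoded as a data structure; the slot names mirror the script components listed earlier, and the values are invented for illustration rather than a standard notation.

```python
# A minimal, illustrative encoding of a "going to the bank to withdraw money"
# script as a plain Python data structure. Slot names mirror the script
# components above; the values are made up for illustration.
bank_withdrawal_script = {
    "track": "withdraw money at the counter",
    "entry_conditions": ["customer has an account", "bank is open"],
    "props": ["withdrawal slip", "counter", "cash", "token"],
    "roles": ["customer", "teller", "security guard"],
    "scenes": [
        "enter the bank",
        "fill in the withdrawal slip",
        "hand the slip to the teller",
        "teller verifies the account and counts the cash",
        "customer receives the cash",
        "leave the bank",
    ],
    "results": ["customer has the cash", "account balance is reduced"],
}

# A system can use the script to predict unstated events: if we are told the
# customer received cash, the script licenses the inference that a slip was filled in.
print(bank_withdrawal_script["scenes"][1])
```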
Advantages of Scripts
• Ability to predict events.
• A single coherent interpretation may be built up from a collection of observations.
Disadvantages of Scripts
• Less general than frames.
• May not be suitable to represent all kinds of knowledge.
CYC Theory
• Cyc has a huge knowledge base which it uses for reasoning.
• Contains
• 15,000 predicates
• 300,000 concepts
• 3,200,000 assertions
• All these predicates, concepts and assertions are arranged in
numerous ontologies.
Cyc: Features
Uncertain Results
• Query: “who had the motive for the assassination of
Rafik Hariri?”
• Since the case is still an unsolved political mystery, there
is no way we can ever get the answer.
• In cases like these, Cyc returns the various viewpoints, quoting the sources from which it built its inferences.
• For the above query, it gives two viewpoints:
• “USA and Israel”, as quoted from an editorial in Al Jazeera
• “Syria”, as quoted from a news report from CNN
• It uses Google as the search engine in the background.
• It filters results according to the context of the query.
• For example, if we search for the assassination of Rafik Hariri, it omits results with a timestamp earlier than the assassination date.
Qualitative Queries
• Query: “Was Bill Clinton a good President of the United
States?”
• In cases like these, Cyc returns the results as pros and cons and leaves it to the user to draw a conclusion.
Queries With No Answer
• Query: “At this instant of time, is Alice inhaling or exhaling?”
• The Cyc system is intelligent enough to figure out queries
which can never be answered correctly.
• The ultimate goal is to build enough common sense into the Cyc system
such that it can understand Natural Language.
• Once it understands Natural Language, all the system has to do is crawl
through all the online material and learn new common sense rules and
evolve.
• This two-step process of building common sense and using machine learning techniques to learn new things will make the Cyc system an infinite source of knowledge.
Drawbacks
• There is no single Ontology that works in all
cases.
• Although Cyc is able to simulate common sense, it cannot distinguish between fact and fiction.
• In Natural Language Processing there is no way the Cyc system can figure out whether a particular word is used in its normal sense or in a sarcastic sense.
• Adding knowledge is a very tedious process.
Semantic Web
• The development of the Semantic Web is well underway, with the goal of making it possible for machines to understand the information on the web rather than simply display it.
• The major obstacle to this goal is the fact that most information on the web is
designed solely for human consumption. This information should be structured
in a way that machines can understand and process that information.
• The concept of machine-understandable documents does not imply “Artificial
Intelligence”. It only indicates a machine’s ability to solve well-defined problems
by performing well-defined operations on well-defined data.
• The key technological threads that are currently employed in the development
of Semantic Web are: eXtensible Markup Language (XML), Resource Description
Framework (RDF), DAML (DARPA Agent Markup Language).
• Most of the web’s content today is designed for humans to read, not for computer programs to process meaningfully.
• Computers can
- parse the web pages.
- perform routine processing (here a header, there a link, etc.)
• In general, they have no reliable way to understand and process the semantics.
• The Semantic Web will bring structure to the meaningful content of web pages, creating an environment where software agents roaming from page to page can carry out sophisticated tasks for users.
• The Semantic Web is not a separate web but an extension of the current one.
Knowledge Representation
• For the Semantic Web to function, computers should have access to:
• Structured collections of information
• The meaning of this information
• Sets of inference rules/logic
These sets of inference rules can be used to conduct automated reasoning.
• Technological threads for developing the Semantic Web:
- XML
- RDF
- Ontologies
XML
• XML lets everyone create their own tags.
• These tags can be used by script programs in sophisticated ways to perform various tasks, but the script writer has to know what the page writer uses each tag for.
• In short, XML allows you to add arbitrary structure to documents but says nothing about what the structures mean.
• It has no built-in mechanism to convey the meaning of a user’s new tags to other users.
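As a small illustration of the last point, the following Python sketch (using the standard xml.etree.ElementTree module; the documents and tag names are invented) shows that two page authors can mark up the same fact with different tags, and nothing in XML itself tells a program that the tags mean the same thing.

```python
import xml.etree.ElementTree as ET

# Two hypothetical page authors mark up the same fact with different tags.
doc_a = ET.fromstring("<book><author>Jane Doe</author></book>")
doc_b = ET.fromstring("<book><writer>Jane Doe</writer></book>")

# Both documents parse fine, but a program must already know which tag to read.
print(doc_a.findtext("author"))   # Jane Doe
print(doc_b.findtext("author"))   # None - the structure alone carries no meaning
```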
RDF
• RDF (Resource Description Framework) is a scheme for defining information on the web. It provides the technology for expressing the meaning of terms and concepts in a form that computers can readily process.
• RDF encodes this information in sets of triples; each triple is a piece of information on the web about related things.
• Each triple is a combination of Subject, Verb and Object, similar to an elementary sentence.
• Subjects, Verbs and Objects are each identified by a URI, which enables anyone to define a new concept or new verb just by defining a URI for it somewhere on the web.
RDF (contd.)
• These triples can be written using XML tags, as shown in the figure.
• An RDF document can make assertions that particular things (people, web pages or whatever) have properties (“is a sister of”, “is the author of”) with values (another person, another web page, etc.).
• RDF uses a different URI for each specific concept, which solves the problem of the same term being used for different concepts, e.g. address tags in an XML page.
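As a hedged sketch, the Python rdflib library can be used to build a few such triples and serialize them as RDF/XML, i.e. "written using XML tags"; the URIs and property names below are invented for the example.

```python
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/terms/")          # hypothetical vocabulary

g = Graph()
alice = URIRef("http://example.org/people/alice")
bob = URIRef("http://example.org/people/bob")
page = URIRef("http://example.org/docs/intro.html")

# Each triple is (subject, predicate/verb, object).
g.add((alice, EX.isSisterOf, bob))
g.add((alice, EX.isAuthorOf, page))
g.add((page, EX.title, Literal("Introduction to the Semantic Web")))

# Serialize the same triples as RDF/XML (returns a string in recent rdflib versions).
print(g.serialize(format="xml"))
```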
Ontologies
• Ontologies are collections of statements, written in a language such as RDF, that define relations between concepts and specify logical rules for reasoning about them.
• Computers/agents/services will understand the meaning of semantic data on a web page by following links to the specified ontologies.
• Ontologies can express a large number of relationships among entities (objects) by assigning properties to classes and allowing subclasses to inherit such properties.
• An ontology may express a rule such as: if a city code is associated with a state code, and an address uses that city code, then the address has that state code.
• Ontologies enhance the functioning of the Semantic Web: they improve the accuracy of web searches and ease the development of programs that can tackle complicated queries.
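A minimal sketch of that kind of inference rule in plain Python (no ontology language involved; the codes and field names are made up) might look like this:

```python
# Rule: if a city code is associated with a state code, and an address uses
# that city code, then the address can be inferred to carry that state code.
city_to_state = {"400001": "MH", "110001": "DL"}     # city code -> state code (illustrative)

def infer_state_code(address):
    """Apply the rule above to a single address record."""
    city = address.get("city_code")
    if city in city_to_state:
        address["state_code"] = city_to_state[city]
    return address

print(infer_state_code({"street": "12 Main Road", "city_code": "400001"}))
# -> {'street': '12 Main Road', 'city_code': '400001', 'state_code': 'MH'}
```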
Case Grammars
• Case grammars use the functional relationships between noun phrases and verbs to uncover the deeper case structure of a sentence.
• Generally, in English, the difference in meaning between different surface forms of the same sentence (for example, active and passive) is quite negligible.
• In the early 1970s Fillmore proposed the idea of the different cases of an English sentence.
• He extended Chomsky’s transformational grammars by focusing more on the semantic aspects of a sentence.
• In case grammars a sentence is defined as being composed of a proposition P and a modality constituent M, where M is composed of mood, tense, aspect, negation and so on. Thus we can represent a sentence as
S = M + P
where P is the set of relationships among the verb and the noun phrases, i.e. P = (C1, C2, …, Cn) with each Ci a case, and
M is the modality constituent.
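As an illustration, a case-frame for the classic example sentence “John opened the door with the key” might be sketched as follows; the role names (agentive, objective, instrumental) are the usual textbook case labels, and the structure itself is only a sketch, not Fillmore’s notation.

```python
sentence = {
    "modality": {"tense": "past", "mood": "declarative", "negation": False},
    "proposition": {
        "verb": "open",
        "cases": {
            "agentive":     "John",       # the doer of the action
            "objective":    "the door",   # the thing acted upon
            "instrumental": "the key",    # the means used
        },
    },
}
```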
Natural Language Processing
Components of NLP
• There are two components of NLP, as given below −
Natural Language Understanding (NLU)
• Understanding involves the following tasks −
• Mapping the given input in natural language into useful representations.
• Analysing different aspects of the language.
Natural Language Generation (NLG)
• It is the process of producing meaningful phrases and sentences in the form of natural language from some internal representation. It involves:
• Text planning − retrieving the relevant content from the knowledge base.
• Sentence planning − choosing the required words, forming meaningful phrases, and setting the tone of the sentence.
• Text realization − mapping the sentence plan into sentence structure.
NLP Terminology
• Phonology − It is the study of organizing sounds systematically.
• Morphology − It is the study of the construction of words from primitive meaningful units.
• Syntax − It refers to arranging words to make a sentence. It also involves
determining the structural role of words in the sentence and in phrases.
• Semantics − It is concerned with the meaning of words and how to
combine words into meaningful phrases and sentences.
• Pragmatics − It deals with using and understanding sentences in different
situations and how the interpretation of the sentence is affected.
• Discourse − It deals with how the immediately preceding sentence can
affect the interpretation of the next sentence.
• World Knowledge − It includes the general knowledge about the world.
Sentence Analysis Phases
• Lexical Analysis − It involves identifying and analyzing the structure of words. The lexicon of a language is the collection of words and phrases in that language. Lexical analysis divides the whole chunk of text into paragraphs, sentences, and words.
• Syntactic Analysis (Parsing) − It involves analysis of the words in the sentence for grammar, and arranging the words in a manner that shows the relationships among them. A sentence such as “The school goes to boy” is rejected by an English syntactic analyzer.
• Semantic Analysis − It draws the exact meaning or the dictionary meaning from the text. The text is checked for meaningfulness. This is done by mapping syntactic structures onto objects in the task domain. The semantic analyzer disregards sentences such as “hot ice-cream”.
• Discourse Integration − The meaning of any sentence depends upon the meaning of the sentence just before it. In addition, it also influences the meaning of the immediately succeeding sentence.
• Pragmatic Analysis − During this phase, what was said is re-interpreted to determine what was actually meant. It involves deriving those aspects of language which require real-world knowledge.
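A small sketch of the first two phases using the NLTK toolkit is shown below; it assumes the required NLTK resources (the tokenizer and POS-tagger models) have already been downloaded with nltk.download().

```python
import nltk

text = "The bird pecks the grains. The school goes to the boy."

sentences = nltk.sent_tokenize(text)        # lexical analysis: split into sentences
for sent in sentences:
    words = nltk.word_tokenize(sent)        # ...and into words
    print(nltk.pos_tag(words))              # part-of-speech tags, a first step towards parsing
```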
Grammars And Parsers
• Context-Free Grammar
• It is a grammar that consists of rules with a single symbol on the left-hand side of the rewrite rules. Let us create a grammar to parse the sentence −
“The bird pecks the grains”
Articles (DETERMINER (DET)) − a | an | the
Nouns (N) − bird | birds | grain | grains
Noun Phrase (NP) − Article + Noun | Article + Adjective + Noun
= DET N | DET ADJ N
Verbs (V) − pecks | pecking | pecked
Verb Phrase (VP) − NP V | V NP
Adjectives (ADJ) − beautiful | small | chirping
• The parse tree breaks down the sentence into structured parts so that the computer can easily understand and process it. In order for the parsing algorithm to construct this parse tree, a set of rewrite rules, which describe what tree structures are legal, needs to be constructed.
• These rules say that a certain symbol may be expanded in the tree into a sequence of other symbols. According to the first rewrite rule, if there are two strings, a Noun Phrase (NP) and a Verb Phrase (VP), then the string formed by NP followed by VP is a sentence. The rewrite rules for the sentence are as follows −
• S → NP VP
• NP → DET N | DET ADJ N
• VP → V NP
• Lexicon −
• DET → a | the
• ADJ → beautiful | perching
• N → bird | birds | grain | grains
• V → peck | pecks | pecking
• The parse tree can be created as
shown −
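The grammar and lexicon above can be tried out directly with NLTK's chart parser; the following is a minimal sketch using the same rules.

```python
import nltk

grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> DET N | DET ADJ N
VP  -> V NP
DET -> 'a' | 'the'
ADJ -> 'beautiful' | 'perching'
N   -> 'bird' | 'birds' | 'grain' | 'grains'
V   -> 'peck' | 'pecks' | 'pecking'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the bird pecks the grains".split()):
    tree.pretty_print()    # prints the parse tree for the sentence
```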
PARSING PROCESS
• Parsing is the term used to describe the process of automatically building a syntactic analysis of a sentence in terms of a given grammar and lexicon.
• The resulting syntactic analysis may be used as input to a process of semantic interpretation.
• Occasionally, parsing is also used to include both syntactic and semantic analysis.
• The parsing process is carried out by the parser.
• Parsing groups and labels the parts of a sentence in a way that clearly displays their relationships to each other.
• The parser is a computer program which accepts a natural language sentence as input and generates an output structure suitable for analysis.
Types of Parsing
• Parsing techniques can be categorized into two types:
- Top-down Parsing
- Bottom-up Parsing
Top-down Parsing
Top-down parsing starts with the start symbol and works towards the input sentence. We can say it is the process of constructing the parse tree starting at the root and proceeding towards the leaves.
It is a strategy of analyzing unknown data relationships by hypothesizing general parse tree structures and then considering whether the known fundamental structures are compatible with the hypothesis.
In top-down parsing, categories such as verb phrase (VP), noun phrase (NP) and prepositional phrase (PP) are expanded until they yield the words of the sentence.
Let us consider some examples to illustrate top-down parsing, using both the symbolic and the graphical representation: we start from the sentence symbol and expand it until we reach the words of the sentence, working with the usual symbols such as PP, NP, VP, ART, N, V and so on. Examples of top-down parsing are LL parsing (Left-to-right scan, Leftmost derivation) and recursive descent parsing.
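The following minimal hand-written recursive-descent recognizer illustrates the top-down strategy on a toy version of the grammar above; it starts from the start symbol S and expands downwards towards the words. (NLTK also provides nltk.RecursiveDescentParser for full CFGs.)

```python
# Toy grammar: non-terminals map to lists of alternative right-hand sides.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["DET", "N"]],
    "VP":  [["V", "NP"]],
    "DET": [["the"]],
    "N":   [["bird"], ["grains"]],
    "V":   [["pecks"]],
}

def parse(symbol, words, pos):
    """Try to derive `symbol` starting at words[pos]; return the new position or None."""
    for production in GRAMMAR.get(symbol, []):
        current, ok = pos, True
        for part in production:
            if part in GRAMMAR:                      # non-terminal: expand recursively (top-down)
                nxt = parse(part, words, current)
                if nxt is None:
                    ok = False
                    break
                current = nxt
            elif current < len(words) and words[current] == part:
                current += 1                         # terminal: matches the next input word
            else:
                ok = False
                break
        if ok:
            return current
    return None

words = "the bird pecks the grains".split()
print(parse("S", words, 0) == len(words))            # True: the sentence is accepted
```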
PPT BY: MADHAV MISHRA 59
PPT BY: MADHAV MISHRA 60
Bottom-up Parsing
• In this parsing technique the process begins with the sentence, and the words of the sentence are replaced by the relevant symbols.
• It is also called shift-reduce parsing.
• In bottom-up parsing the construction of the parse tree starts at the leaves and proceeds towards the root.
• Bottom-up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first and then to infer higher-order structures from them.
• This process occurs in the analysis of both natural languages and computer languages.
• It is common for bottom-up parsers to take the form of general parsing engines that can either parse a specific programming language or generate a parser for it, given a specification of its grammar.
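As a brief sketch, NLTK's ShiftReduceParser applies exactly this bottom-up, shift-reduce strategy: words are shifted onto a stack and reduced to higher-level symbols until S is built at the root. Note that this simple parser does no backtracking and can fail on grammars that need it.

```python
import nltk

grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> DET N
VP  -> V NP
DET -> 'the'
N   -> 'bird' | 'grains'
V   -> 'pecks'
""")

sr_parser = nltk.ShiftReduceParser(grammar)
for tree in sr_parser.parse("the bird pecks the grains".split()):
    print(tree)   # (S (NP (DET the) (N bird)) (VP (V pecks) (NP (DET the) (N grains))))
```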
Semantic Analysis
• Semantic Analysis is the process of drawing meaning from text.
• It allows computers to understand and interpret sentences, paragraphs, or
whole documents, by analysing their grammatical structure, and identifying
relationships between individual words in a particular context.
• It’s an essential sub-task of Natural Language Processing (NLP) and the
driving force behind machine learning tools like chatbots, search engines,
and text analysis.
• Semantic analysis-driven tools can help companies automatically extract
meaningful information from unstructured data, such as emails, support
tickets, and customer feedback.
How Semantic Analysis Works
• Lexical semantics plays an important role in semantic analysis, allowing machines to understand relationships between lexical items (words, phrasal verbs, etc.); a small WordNet sketch follows the list below:
• Hyponyms: specific lexical items of a generic lexical item (the hypernym), e.g. orange is a hyponym of fruit (the hypernym).
• Meronymy: a logical arrangement of text and words that denotes a constituent part of or member of something, e.g. a segment of an orange.
• Polysemy: a relationship between the meanings of words or phrases that, although slightly different, share a common core meaning, e.g. I read a paper, and I wrote a paper.
• Synonyms: words that have the same sense or nearly the same meaning as another, e.g. happy, content, ecstatic, overjoyed.
• Antonyms: words that have close to opposite meanings, e.g. happy, sad.
• Homonyms: two words that sound the same and are spelled alike but have different meanings, e.g. orange (color), orange (fruit).
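The sketch below looks up a few of these relations in WordNet through NLTK; it assumes the WordNet corpus has been downloaded, and the exact synset names returned depend on the WordNet version.

```python
from nltk.corpus import wordnet as wn

orange = wn.synset("orange.n.01")                 # the fruit sense of "orange"
fruit = wn.synsets("fruit", pos=wn.NOUN)[0]

print(orange.hypernyms())                         # more general concepts (hypernyms)
print(fruit.hyponyms()[:5])                       # more specific concepts (hyponyms)
print([l.name() for l in wn.synset("happy.a.01").lemmas()])   # synonyms of "happy"
print(wn.synset("happy.a.01").lemmas()[0].antonyms())         # antonyms of "happy"
```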
• Semantic analysis also takes into account signs and symbols (semiotics)
and collocations (words that often go together).
• Automated semantic analysis works with the help of machine learning
algorithms.
• By feeding semantically enhanced machine learning algorithms with
samples of text, you can train machines to make accurate predictions
based on past observations.
• There are various sub-tasks involved in a semantic-based approach for
machine learning, including word sense disambiguation and relationship
extraction:
Word Sense Disambiguation & Relationship Extraction
Word Sense Disambiguation:
• The automated process of identifying in which sense a word is used, according to its context.
• Natural language is ambiguous and polysemic; sometimes the same word can have different meanings depending on how it is used.
• The word “orange,” for example, can refer to a color, a fruit, or even a city in Florida!
• The same happens with the word “date,” which can mean either a particular day of the month, a fruit, or a meeting.
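A small sketch of word sense disambiguation with the simplified Lesk algorithm shipped in NLTK is shown below; which synset it picks depends on the surrounding context words.

```python
from nltk import word_tokenize
from nltk.wsd import lesk

print(lesk(word_tokenize("I squeezed an orange for breakfast"), "orange"))
print(lesk(word_tokenize("We set a date for the next meeting"), "date"))
```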
• Relationship Extraction
• This task consists of detecting the semantic relationships present in a
text. Relationships usually involve two or more entities (which can be
names of people, places, company names, etc.). These entities are
connected through a semantic category, such as “works at,” “lives in,”
“is the CEO of,” “headquartered at.”
• For example, the phrase “Steve Jobs is one of the founders of Apple,
which is headquartered in California” contains two different
relationships:
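A rough, rule-based sketch of extracting those relations with spaCy's named entity recognizer is shown below; it assumes the small English model en_core_web_sm is installed, and the trigger phrases are hand-written for this one example (real systems use far richer patterns or learned models).

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Steve Jobs is one of the founders of Apple, "
          "which is headquartered in California")

entities = list(doc.ents)
print([(ent.text, ent.label_) for ent in entities])   # e.g. PERSON, ORG, GPE

# Pair up consecutive entities and look for a trigger phrase between them.
triggers = {"founders of": "founder_of", "headquartered in": "headquartered_in"}
for left, right in zip(entities, entities[1:]):
    between = doc.text[left.end_char:right.start_char].lower()
    for phrase, relation in triggers.items():
        if phrase in between:
            print((left.text, relation, right.text))
```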
Dictionary
• Also known as the UNL Dictionary.
• It stores concepts, represented by the words of a language.
• It stores universal words for identifying concepts, the word headings that can express those concepts, and information on their syntactic behaviour.
• Each entry consists of a correspondence between a concept and a word, along with information concerning its syntactic properties.
• A grammar defines how words of the language are written as entries in the dictionary.
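As a purely illustrative sketch (this is not the actual UNL dictionary entry syntax), the kind of information each entry pairs up can be pictured as follows; the universal-word notation and attribute names are assumptions for the example.

```python
entry = {
    "universal_word": "book(icl>publication)",   # concept identifier (illustrative notation)
    "headword": "book",                          # natural-language word that expresses the concept
    "attributes": ["N", "SG"],                   # syntactic behaviour: noun, singular (illustrative)
}
```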
PPT BY: MADHAV MISHRA 77
PPT BY: MADHAV MISHRA 78

More Related Content

What's hot

And then there were ... Large Language Models
And then there were ... Large Language ModelsAnd then there were ... Large Language Models
And then there were ... Large Language Models
Leon Dohmen
 
Transformers, LLMs, and the Possibility of AGI
Transformers, LLMs, and the Possibility of AGITransformers, LLMs, and the Possibility of AGI
Transformers, LLMs, and the Possibility of AGI
SynaptonIncorporated
 
Large Language Models - Chat AI.pdf
Large Language Models - Chat AI.pdfLarge Language Models - Chat AI.pdf
Large Language Models - Chat AI.pdf
David Rostcheck
 
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
David Talby
 
Bert.pptx
Bert.pptxBert.pptx
Bert.pptx
Divya Gera
 
1909 BERT: why-and-how (CODE SEMINAR)
1909 BERT: why-and-how (CODE SEMINAR)1909 BERT: why-and-how (CODE SEMINAR)
1909 BERT: why-and-how (CODE SEMINAR)
WarNik Chow
 
Machine Learning Unit 2 Semester 3 MSc IT Part 2 Mumbai University
Machine Learning Unit 2 Semester 3  MSc IT Part 2 Mumbai UniversityMachine Learning Unit 2 Semester 3  MSc IT Part 2 Mumbai University
Machine Learning Unit 2 Semester 3 MSc IT Part 2 Mumbai University
Madhav Mishra
 
A Comprehensive Review of Large Language Models for.pptx
A Comprehensive Review of Large Language Models for.pptxA Comprehensive Review of Large Language Models for.pptx
A Comprehensive Review of Large Language Models for.pptx
SaiPragnaKancheti
 
An Introduction to Soft Computing
An Introduction to Soft ComputingAn Introduction to Soft Computing
An Introduction to Soft Computing
Tameem Ahmad
 
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
taozen
 
Explainable AI in Industry (FAT* 2020 Tutorial)
Explainable AI in Industry (FAT* 2020 Tutorial)Explainable AI in Industry (FAT* 2020 Tutorial)
Explainable AI in Industry (FAT* 2020 Tutorial)
Krishnaram Kenthapadi
 
AI, Machine Learning, and Data Science Concepts
AI, Machine Learning, and Data Science ConceptsAI, Machine Learning, and Data Science Concepts
AI, Machine Learning, and Data Science Concepts
Dan O'Leary
 
KBS Lecture Notes
KBS Lecture NotesKBS Lecture Notes
KBS Lecture Notes
butest
 
Deep Learning for Natural Language Processing
Deep Learning for Natural Language ProcessingDeep Learning for Natural Language Processing
Deep Learning for Natural Language Processing
Devashish Shanker
 
Dendral
DendralDendral
Dendral
gupta8741
 
Introduction to Natural Language Processing
Introduction to Natural Language ProcessingIntroduction to Natural Language Processing
Introduction to Natural Language Processing
Pranav Gupta
 
Responsible AI
Responsible AIResponsible AI
Responsible AI
Neo4j
 
Generative AI and ChatGPT - Scope of AI and advance Generative AI
Generative AI and ChatGPT - Scope of AI and advance Generative AIGenerative AI and ChatGPT - Scope of AI and advance Generative AI
Generative AI and ChatGPT - Scope of AI and advance Generative AI
Kumaresan K
 
Fine-tuning BERT for Question Answering
Fine-tuning BERT for Question AnsweringFine-tuning BERT for Question Answering
Fine-tuning BERT for Question Answering
Apache MXNet
 
Designing Human-Centered AI Products & Systems
Designing Human-Centered AI Products & SystemsDesigning Human-Centered AI Products & Systems
Designing Human-Centered AI Products & Systems
Uday Kumar
 

What's hot (20)

And then there were ... Large Language Models
And then there were ... Large Language ModelsAnd then there were ... Large Language Models
And then there were ... Large Language Models
 
Transformers, LLMs, and the Possibility of AGI
Transformers, LLMs, and the Possibility of AGITransformers, LLMs, and the Possibility of AGI
Transformers, LLMs, and the Possibility of AGI
 
Large Language Models - Chat AI.pdf
Large Language Models - Chat AI.pdfLarge Language Models - Chat AI.pdf
Large Language Models - Chat AI.pdf
 
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
Large Language Models, No-Code, and Responsible AI - Trends in Applied NLP in...
 
Bert.pptx
Bert.pptxBert.pptx
Bert.pptx
 
1909 BERT: why-and-how (CODE SEMINAR)
1909 BERT: why-and-how (CODE SEMINAR)1909 BERT: why-and-how (CODE SEMINAR)
1909 BERT: why-and-how (CODE SEMINAR)
 
Machine Learning Unit 2 Semester 3 MSc IT Part 2 Mumbai University
Machine Learning Unit 2 Semester 3  MSc IT Part 2 Mumbai UniversityMachine Learning Unit 2 Semester 3  MSc IT Part 2 Mumbai University
Machine Learning Unit 2 Semester 3 MSc IT Part 2 Mumbai University
 
A Comprehensive Review of Large Language Models for.pptx
A Comprehensive Review of Large Language Models for.pptxA Comprehensive Review of Large Language Models for.pptx
A Comprehensive Review of Large Language Models for.pptx
 
An Introduction to Soft Computing
An Introduction to Soft ComputingAn Introduction to Soft Computing
An Introduction to Soft Computing
 
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
The Rise of the LLMs - How I Learned to Stop Worrying & Love the GPT!
 
Explainable AI in Industry (FAT* 2020 Tutorial)
Explainable AI in Industry (FAT* 2020 Tutorial)Explainable AI in Industry (FAT* 2020 Tutorial)
Explainable AI in Industry (FAT* 2020 Tutorial)
 
AI, Machine Learning, and Data Science Concepts
AI, Machine Learning, and Data Science ConceptsAI, Machine Learning, and Data Science Concepts
AI, Machine Learning, and Data Science Concepts
 
KBS Lecture Notes
KBS Lecture NotesKBS Lecture Notes
KBS Lecture Notes
 
Deep Learning for Natural Language Processing
Deep Learning for Natural Language ProcessingDeep Learning for Natural Language Processing
Deep Learning for Natural Language Processing
 
Dendral
DendralDendral
Dendral
 
Introduction to Natural Language Processing
Introduction to Natural Language ProcessingIntroduction to Natural Language Processing
Introduction to Natural Language Processing
 
Responsible AI
Responsible AIResponsible AI
Responsible AI
 
Generative AI and ChatGPT - Scope of AI and advance Generative AI
Generative AI and ChatGPT - Scope of AI and advance Generative AIGenerative AI and ChatGPT - Scope of AI and advance Generative AI
Generative AI and ChatGPT - Scope of AI and advance Generative AI
 
Fine-tuning BERT for Question Answering
Fine-tuning BERT for Question AnsweringFine-tuning BERT for Question Answering
Fine-tuning BERT for Question Answering
 
Designing Human-Centered AI Products & Systems
Designing Human-Centered AI Products & SystemsDesigning Human-Centered AI Products & Systems
Designing Human-Centered AI Products & Systems
 

Similar to Applied Artificial Intelligence Unit 5 Semester 3 MSc IT Part 2 Mumbai University

Adopting Data Science and Machine Learning in the financial enterprise
Adopting Data Science and Machine Learning in the financial enterpriseAdopting Data Science and Machine Learning in the financial enterprise
Adopting Data Science and Machine Learning in the financial enterprise
QuantUniversity
 
An Introduction to Semantic Web Technology
An Introduction to Semantic Web TechnologyAn Introduction to Semantic Web Technology
An Introduction to Semantic Web Technology
Ankur Biswas
 
Intelligent expert systems for location planning
Intelligent expert systems for location planningIntelligent expert systems for location planning
Intelligent expert systems for location planning
Navid Milanizadeh
 
Integrating Semantic Systems
Integrating Semantic SystemsIntegrating Semantic Systems
Integrating Semantic Systems
Kingsley Uyi Idehen
 
Precision Content™ Tools, Techniques, and Technology
Precision Content™ Tools, Techniques, and TechnologyPrecision Content™ Tools, Techniques, and Technology
Precision Content™ Tools, Techniques, and Technology
dclsocialmedia
 
How to Future-proof Your Content by Sarah Beckley
How to Future-proof Your Content by Sarah BeckleyHow to Future-proof Your Content by Sarah Beckley
How to Future-proof Your Content by Sarah Beckley
Content Strategy Workshops
 
The Semantic Web: What IAs Need to Know About Web 3.0
The Semantic Web: What IAs Need to Know About Web 3.0The Semantic Web: What IAs Need to Know About Web 3.0
The Semantic Web: What IAs Need to Know About Web 3.0
Chiara Fox Ogan
 
Semantic Web Analytics.pptx
Semantic Web Analytics.pptxSemantic Web Analytics.pptx
Semantic Web Analytics.pptx
celestinananditha
 
Data Discovery and Metadata
Data Discovery and MetadataData Discovery and Metadata
Data Discovery and Metadata
markgrover
 
Fitsum ristu lakew the semantic web
Fitsum ristu lakew the semantic webFitsum ristu lakew the semantic web
Fitsum ristu lakew the semantic web
FITSUM RISTU LAKEW
 
The Real-time Web in the Age of Agents
The Real-time Web in the Age of AgentsThe Real-time Web in the Age of Agents
The Real-time Web in the Age of Agents
Joshua Shinavier
 
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
ACTUONDA
 
How does semantic technology work?
How does semantic technology work? How does semantic technology work?
How does semantic technology work?
Graeme Wood
 
Strategies for integrating semantic and blockchain technologies
Strategies for integrating semantic and blockchain technologiesStrategies for integrating semantic and blockchain technologies
Strategies for integrating semantic and blockchain technologies
Héctor Ugarte
 
Semantics and Machine Learning
Semantics and Machine LearningSemantics and Machine Learning
Semantics and Machine Learning
Vladimir Alexiev, PhD, PMP
 
Semantic framework for web scraping.
Semantic framework for web scraping.Semantic framework for web scraping.
Semantic framework for web scraping.
Shyjal Raazi
 
Semantic web technology
Semantic web technologySemantic web technology
Semantic web technology
Stanley Wang
 
Semantic web
Semantic webSemantic web
Semantic web
Hon Lasisi H
 
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
Martin Hepp
 
LSESU a Taste of R Language Workshop
LSESU a Taste of R Language WorkshopLSESU a Taste of R Language Workshop
LSESU a Taste of R Language Workshop
Korkrid Akepanidtaworn
 

Similar to Applied Artificial Intelligence Unit 5 Semester 3 MSc IT Part 2 Mumbai University (20)

Adopting Data Science and Machine Learning in the financial enterprise
Adopting Data Science and Machine Learning in the financial enterpriseAdopting Data Science and Machine Learning in the financial enterprise
Adopting Data Science and Machine Learning in the financial enterprise
 
An Introduction to Semantic Web Technology
An Introduction to Semantic Web TechnologyAn Introduction to Semantic Web Technology
An Introduction to Semantic Web Technology
 
Intelligent expert systems for location planning
Intelligent expert systems for location planningIntelligent expert systems for location planning
Intelligent expert systems for location planning
 
Integrating Semantic Systems
Integrating Semantic SystemsIntegrating Semantic Systems
Integrating Semantic Systems
 
Precision Content™ Tools, Techniques, and Technology
Precision Content™ Tools, Techniques, and TechnologyPrecision Content™ Tools, Techniques, and Technology
Precision Content™ Tools, Techniques, and Technology
 
How to Future-proof Your Content by Sarah Beckley
How to Future-proof Your Content by Sarah BeckleyHow to Future-proof Your Content by Sarah Beckley
How to Future-proof Your Content by Sarah Beckley
 
The Semantic Web: What IAs Need to Know About Web 3.0
The Semantic Web: What IAs Need to Know About Web 3.0The Semantic Web: What IAs Need to Know About Web 3.0
The Semantic Web: What IAs Need to Know About Web 3.0
 
Semantic Web Analytics.pptx
Semantic Web Analytics.pptxSemantic Web Analytics.pptx
Semantic Web Analytics.pptx
 
Data Discovery and Metadata
Data Discovery and MetadataData Discovery and Metadata
Data Discovery and Metadata
 
Fitsum ristu lakew the semantic web
Fitsum ristu lakew the semantic webFitsum ristu lakew the semantic web
Fitsum ristu lakew the semantic web
 
The Real-time Web in the Age of Agents
The Real-time Web in the Age of AgentsThe Real-time Web in the Age of Agents
The Real-time Web in the Age of Agents
 
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
No more BITS - Blind Insignificant Technologies ands Systems by Roger Roberts...
 
How does semantic technology work?
How does semantic technology work? How does semantic technology work?
How does semantic technology work?
 
Strategies for integrating semantic and blockchain technologies
Strategies for integrating semantic and blockchain technologiesStrategies for integrating semantic and blockchain technologies
Strategies for integrating semantic and blockchain technologies
 
Semantics and Machine Learning
Semantics and Machine LearningSemantics and Machine Learning
Semantics and Machine Learning
 
Semantic framework for web scraping.
Semantic framework for web scraping.Semantic framework for web scraping.
Semantic framework for web scraping.
 
Semantic web technology
Semantic web technologySemantic web technology
Semantic web technology
 
Semantic web
Semantic webSemantic web
Semantic web
 
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
The Semantic Web – A Vision Come True, or Giving Up the Great Plan?
 
LSESU a Taste of R Language Workshop
LSESU a Taste of R Language WorkshopLSESU a Taste of R Language Workshop
LSESU a Taste of R Language Workshop
 

Recently uploaded

The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
Sérgio Sacani
 
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdfUnveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Erdal Coalmaker
 
Phenomics assisted breeding in crop improvement
Phenomics assisted breeding in crop improvementPhenomics assisted breeding in crop improvement
Phenomics assisted breeding in crop improvement
IshaGoswami9
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
pablovgd
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
Abdul Wali Khan University Mardan,kP,Pakistan
 
Deep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless ReproducibilityDeep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless Reproducibility
University of Rennes, INSA Rennes, Inria/IRISA, CNRS
 
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptxANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
RASHMI M G
 
Nucleophilic Addition of carbonyl compounds.pptx
Nucleophilic Addition of carbonyl  compounds.pptxNucleophilic Addition of carbonyl  compounds.pptx
Nucleophilic Addition of carbonyl compounds.pptx
SSR02
 
What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.
moosaasad1975
 
ESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptxESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptx
PRIYANKA PATEL
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
terusbelajar5
 
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Ana Luísa Pinho
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
TinyAnderson
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
Sérgio Sacani
 
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
University of Maribor
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
muralinath2
 
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
Wasswaderrick3
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
yqqaatn0
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
by6843629
 
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
yqqaatn0
 

Recently uploaded (20)

The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
 
Unveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdfUnveiling the Energy Potential of Marshmallow Deposits.pdf
Unveiling the Energy Potential of Marshmallow Deposits.pdf
 
Phenomics assisted breeding in crop improvement
Phenomics assisted breeding in crop improvementPhenomics assisted breeding in crop improvement
Phenomics assisted breeding in crop improvement
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
 
Deep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless ReproducibilityDeep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless Reproducibility
 
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptxANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
 
Nucleophilic Addition of carbonyl compounds.pptx
Nucleophilic Addition of carbonyl  compounds.pptxNucleophilic Addition of carbonyl  compounds.pptx
Nucleophilic Addition of carbonyl compounds.pptx
 
What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.
 
ESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptxESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptx
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
 
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
 
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep...
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
 
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
 
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
 

Applied Artificial Intelligence Unit 5 Semester 3 MSc IT Part 2 Mumbai University

  • 2. Topics to Cover…! • Advanced Knowledge Representation Techniques: Conceptual dependency theory, script structure, CYC theory, case grammars, semantic web. • Natural Language Processing: Sentence Analysis phases, grammars and parsers, types of parsers, semantic analysis, universal networking language, dictionary PPT BY: MADHAV MISHRA 2
  • 3. PPT BY: MADHAV MISHRA 3
  • 4. PPT BY: MADHAV MISHRA 4
  • 5. PPT BY: MADHAV MISHRA 5
  • 6. PPT BY: MADHAV MISHRA 6
  • 7. PPT BY: MADHAV MISHRA 7
  • 8. PPT BY: MADHAV MISHRA 8
  • 9. PPT BY: MADHAV MISHRA 9
  • 10. PPT BY: MADHAV MISHRA 10
  • 11. PPT BY: MADHAV MISHRA 11
  • 12. PPT BY: MADHAV MISHRA 12
  • 13. PPT BY: MADHAV MISHRA 13
  • 14. PPT BY: MADHAV MISHRA 14
  • 15. PPT BY: MADHAV MISHRA 15
  • 16. PPT BY: MADHAV MISHRA 16
  • 17. PPT BY: MADHAV MISHRA 17
  • 18. PPT BY: MADHAV MISHRA 18
  • 19. PPT BY: MADHAV MISHRA 19
  • 20. PPT BY: MADHAV MISHRA 20
  • 21. PPT BY: MADHAV MISHRA 21
  • 22. PPT BY: MADHAV MISHRA 22
  • 23. PPT BY: MADHAV MISHRA 23
  • 24. script structures • A script is a structured representation describing a stereotyped sequence of events in a particular context. • Scripts are used in natural language understanding systems to organize a knowledge base in terms of the situations that the system should understand. Scripts use a frame-like structure to represent the commonly occurring experience like going to the movies eating in a restaurant, shopping in a supermarket, or visiting an ophthalmologist. • Thus, a script is a structure that prescribes a set of circumstances that could be expected to follow on from one another. PPT BY: MADHAV MISHRA 24
  • 25. • Scripts are beneficial because: • Events tend to occur in known runs or patterns. • A casual relationship between events exist. • An entry condition exists which allows an event to take place. • Prerequisites exist upon events taking place. PPT BY: MADHAV MISHRA 25
  • 26. Components of a script: • The components of a script include: • Entry condition: These are basic condition which must be fulfilled before events in the script can occur. • Results: Condition that will be true after events in script occurred. • Props: Slots representing objects involved in events • Roles: These are the actions that the individual participants perform. • Track: Variations on the script. Different tracks may share components of the same scripts. • Scenes: The sequence of events that occur. PPT BY: MADHAV MISHRA 26
  • 27. Describing a script, special symbols of actions are used. These are: PPT BY: MADHAV MISHRA 27
  • 28. Example: Script for going to the bankto withdrawmoney. PPT BY: MADHAV MISHRA 28
  • 29. Advantages of Scripts • Ability to predict events. • A single coherent interpretation maybe builds up from a collection of observations. Disadvantages of Scripts • Less general than frames. • May not be suitable to represent all kinds of knowledge PPT BY: MADHAV MISHRA 29
  • 31. • Cyc has a huge knowledge base which it uses for reasoning. • Contains • 15,000 predicates • 300,000 concepts • 3,200,000 assertions • All these predicates, concepts and assertions are arranged in numerous ontologies. PPT BY: MADHAV MISHRA 31
  • 32. Cyc: Features Uncertain Results • Query: “who had the motive for the assassination of Rafik Hariri?” • Since the case is still an unsolved political mystery, there is no way we can ever get the answer. • In cases like these Cyc returns the various view points, quoting the sources from which it built its inferences. • For the above query, it gives two view points • “USA and Israel” as quoted from a editorial in Al Jazeera • “Syria” as quoted from a news report from CNN PPT BY: MADHAV MISHRA 32
  • 33. • It uses Google as the search engine in the background. • It filters results according to the context of the query. • For example, if we search for assassination of Rafik Hariri, then it omits results which have a time stamp before that of the assassination date. PPT BY: MADHAV MISHRA 33
  • 34. Qualitative Queries • Query: “Was Bill Clinton a good President of the United States?” • In cases like these, Cyc returns the results in a pros and cons type and leave it to the user to make a conclusion. Queries With No Answer • Query: “At this instance of time, Is Alice inhaling or Exhaling?” • The Cyc system is intelligent enough to figure out queries which can never be answered correctly. PPT BY: MADHAV MISHRA 34
  • 35. • The ultimate goal is to build enough common sense into the Cyc system such that it can understand Natural Language. • Once it understands Natural Language, all the system has to do is crawl through all the online material and learn new common sense rules and evolve. • This two step process of building common sense and using machine learning techniques to learn new things will make the Cyc system an infinite source of knowledge. PPT BY: MADHAV MISHRA 35
  • 36. Drawbacks • There is no single Ontology that works in all cases. • Although Cyc is able to simulate common sense it cannot distinguish between facts and fiction. • In Natural Language Processing there is no way the Cyc system can figure out if a particular word is used in the normal sense or in the sarcastic sense. • Adding knowledge is a very tedious process. PPT BY: MADHAV MISHRA 36
  • 37. Semantic Web • The development of Semantic Web is well underway with a goal that it would be possible for machines to understand the information on the web rather than simply display. • The major obstacle to this goal is the fact that most information on the web is designed solely for human consumption. This information should be structured in a way that machines can understand and process that information. • The concept of machine-understandable documents does not imply “Artificial Intelligence”. It only indicates a machine’s ability to solve well-defined problems by performing well-defined operations on well-defined data. • The key technological threads that are currently employed in the development of Semantic Web are: eXtensible Markup Language (XML), Resource Description Framework (RDF), DAML (DARPA Agent Markup Language). PPT BY: MADHAV MISHRA 37
  • 38. • Most of the web’s content today is designed for humans to read , and not for computer programs to process meaningfully. • Computers can - parse the web pages. - perform routine processing (here a header, there a link, etc.) • In general, they have no reliable way to understand and process the semantics. • The Semantic Web will bring structure to the meaningful content of the web of web pages, creating an environment where software agents roaming from page to page carry out sophisticated tasks for users. • The Semantic Web is not a separate web PPT BY: MADHAV MISHRA 38
  • 39. Knowledge Representation • For Semantic Web to function, the computers should have access to • Structured Collections of Information • Meaning of this Information • Sets of Inference Rules/Logic. These sets of Inference rules can be used to conduct automated reasoning. • Technological Threads for developing the Semantic Web: - XML - RDF - Ontologies PPT BY: MADHAV MISHRA 39
  • 40. XML • XML lets everyone to create their own tags. • These tags can be used by the script programs in sophisticated ways to perform various tasks, but the script writer has to know what the page writer uses each tag for. • In short, XML allows you to add arbitrary structure to the documents but says nothing about what the structures mean. • It has no built mechanism to convey the meaning of the user’s new tags to other users. 40 PPT BY: MADHAV MISHRA
  • 41. PPT BY: MADHAV MISHRA 41 • A scheme for defining information on the web. It provides the technology for expressing the meaning of terms and concepts in a form that computers can readily process. • RDF encodes this information on the XML page in sets of triples. The triple is an information on the web about related things. • Each triple is a combination of Subject, Verb and Object, similar to an elementary sentence. • Subjects, Verbs and Objects are each identified by a URI, which enable anyone to define a new concept/new verb just by defining a URI for it somewhere on the web. RDF
  • 42. PPT BY: MADHAV MISHRA 42 These triples can be written using XML tags as shown, RDF (contd.) • An RDF document can make assertions that particular things (people, web pages or whatever) have properties ( “is a sister of”, “is the author of”) with values (another person, another person, etc.) • RDF uses a different URI for each specific concept. Solves the problem of same definition but different concepts. Eg. AddressTags in an XML page.
  • 43. PPT BY: MADHAV MISHRA 43 • Ontologies are collections of statements written in a language such as RDF that define relations between concepts and specifies logical rules for reasoning about them. • Computers/agents/services will understand the meaning of semantic data on a web page by following links to specified ontologies. • Ontologies can express a large number of relationships among entities (objects) by assigning properties to classes and allowing subclasses to inherit such properties. • An Ontology may express the rule, If City Code State Code and Address City Code then Address State Code • Enhances the functioning of semantic web: Improves accuracy of web searches, Easy development of programs that can tackle complicated queries. Ontologies
  • 44. PPT BY: MADHAV MISHRA 44
  • 45. PPT BY: MADHAV MISHRA 45
  • 46. PPT BY: MADHAV MISHRA 46
  • 47. Case Grammars • Case grammars use the functional relationships between noun phrases and verbs to conduct the more deeper case of a sentence • Generally in our English sentences, the difference between different forms of a sentence is quite negligible. • In early 1970’s Fillmore gave some idea about different cases of a English sentence. • He extended the transformational grammars of Chomsky by focusing more on the semantic aspects of view of a sentence. • In case grammars a sentence id defined as being composed of a preposition P, a modality constituent M, composed of mood, tense, aspect, negation and so on. Thus we can represent a sentence like Where P - Set of relationships among verbs and noun phrases i.e. P = (C=Case) M - Modality constituent PPT BY: MADHAV MISHRA 47
  • 48. PPT BY: MADHAV MISHRA 48
  • 49. PPT BY: MADHAV MISHRA 49
  • 51. Components of NLP • There are two components of NLP as given − Natural Language Understanding (NLU) • Understanding involves the following tasks − • Mapping the given input in natural language into useful representations. • Analysing different aspects of the language. Natural Language Generation (NLG) • It is the process of producing meaningful phrases and sentences in the form of natural language from some internal representation. It involves : • Text planning − It includes retrieving the relevant content from knowledge base. • Sentence planning − It includes choosing required words, forming meaningful phrases, setting tone of the sentence. • Text Realization − It is mapping sentence plan into sentence structure. PPT BY: MADHAV MISHRA 51
  • 52. NLP Terminology • Phonology − It is study of organizing sound systematically. • Morphology − It is a study of construction of words from primitive meaningful units. • Syntax − It refers to arranging words to make a sentence. It also involves determining the structural role of words in the sentence and in phrases. • Semantics − It is concerned with the meaning of words and how to combine words into meaningful phrases and sentences. • Pragmatics − It deals with using and understanding sentences in different situations and how the interpretation of the sentence is affected. • Discourse − It deals with how the immediately preceding sentence can affect the interpretation of the next sentence. • World Knowledge − It includes the general knowledge about the world. PPT BY: MADHAV MISHRA 52
  • 53. Sentence Analysis Phases • Lexical Analysis − It involves identifying and analyzing the structure of words. The lexicon of a language is the collection of words and phrases in that language. Lexical analysis divides the whole chunk of text into paragraphs, sentences, and words. • Syntactic Analysis (Parsing) − It involves analysing the words in the sentence for grammar and arranging the words in a manner that shows the relationships among them. A sentence such as “The school goes to boy” is rejected by an English syntactic analyzer. • Semantic Analysis − It draws the exact meaning, or the dictionary meaning, from the text. The text is checked for meaningfulness. This is done by mapping syntactic structures onto objects in the task domain. The semantic analyzer disregards sentences such as “hot ice-cream”.
  • 54. • Discourse Integration − The meaning of any sentence depends upon the meaning of the sentence just before it. In addition, it also influences the meaning of the immediately succeeding sentence. • Pragmatic Analysis − During this phase, what was said is re-interpreted in terms of what it actually meant. It involves deriving those aspects of language which require real-world knowledge. • A small NLTK sketch of the early phases follows.
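A quick sketch of the lexical-analysis phase (splitting text into sentences and words), with part-of-speech tagging as a first structural step, using NLTK; it assumes nltk and its tokenizer and tagger resources (e.g. punkt, averaged_perceptron_tagger) have been installed.

```python
# A small sketch of lexical analysis (tokenisation) and a shallow syntactic
# step (POS tagging) using NLTK; assumes nltk and its tokenizer and tagger
# resources are installed.
import nltk

text = "The bird pecks the grains. The school goes to boy."
sentences = nltk.sent_tokenize(text)       # lexical analysis: text -> sentences
for sent in sentences:
    words = nltk.word_tokenize(sent)       # lexical analysis: sentence -> words
    print(nltk.pos_tag(words))             # structural roles of the words
```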
  • 55. Grammars And Parsers • Context-Free Grammar • It is a grammar that consists of rules with a single symbol on the left-hand side of each rewrite rule. Let us create a grammar to parse the sentence − “The bird pecks the grains” • Articles (DETERMINER (DET)) − a | an | the • Nouns (N) − bird | birds | grain | grains • Noun Phrase (NP) − Article + Noun | Article + Adjective + Noun = DET N | DET ADJ N • Verbs (V) − pecks | pecking | pecked • Verb Phrase (VP) − NP V | V NP • Adjectives (ADJ) − beautiful | small | chirping
  • 56. • The parse tree breaks the sentence down into structured parts so that the computer can easily understand and process it. In order for the parsing algorithm to construct this parse tree, a set of rewrite rules, which describe what tree structures are legal, needs to be constructed. • These rules say that a certain symbol may be expanded in the tree into a sequence of other symbols. For example, one rewrite rule states that if there are two strings, a Noun Phrase (NP) and a Verb Phrase (VP), then the string formed by NP followed by VP is a sentence. The rewrite rules for the sentence are as follows − • S → NP VP • NP → DET N | DET ADJ N • VP → V NP
  • 57. • Lexicon − • DET → a | the • ADJ → beautiful | perching • N → bird | birds | grain | grains • V → peck | pecks | pecking • The parse tree can then be created as shown; an equivalent NLTK sketch follows.
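The same toy grammar and lexicon can be written in NLTK and used to build the parse tree; a minimal sketch, assuming nltk is installed.

```python
# Parsing "the bird pecks the grains" with the toy grammar above, using NLTK.
import nltk

grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> DET N | DET ADJ N
VP  -> V NP
DET -> 'a' | 'the'
ADJ -> 'beautiful' | 'perching'
N   -> 'bird' | 'birds' | 'grain' | 'grains'
V   -> 'peck' | 'pecks' | 'pecking'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the bird pecks the grains".split()):
    tree.pretty_print()      # prints the parse tree in text form
```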
  • 58. PARSING PROCESS • Parsing is the term used to describe the process of automatically building a syntactic analysis of a sentence in terms of a given grammar and lexicon. • The resulting syntactic analysis may be used as input to a process of semantic interpretation. • Occasionally, parsing is also used to include both syntactic and semantic analysis. • The parsing process is carried out by the parser. • Parsing groups and labels the parts of a sentence in a way that displays their relationships to each other properly. • The parser is a computer program which accepts a natural language sentence as input and generates an output structure suitable for analysis.
  • 59. Types of Parsing • The parsing technique can be categorized into two types − Top down Parsing and Bottom up Parsing. • Top down Parsing • Top down parsing starts with the start symbol and proceeds towards the words of the sentence; that is, the parse tree is constructed starting at the root and proceeding towards the leaves. • It is a strategy of analyzing unknown data relationships by hypothesizing general parse tree structures and then considering whether the known fundamental structures are compatible with the hypothesis. • In top down parsing, categories such as verb phrase (VP), noun phrase (NP), preposition phrase (PP), etc. are expanded until they are matched against the words of the sentence, using the symbols PP, NP, VP, ART, N, V and so on from the previous slides; both the symbolic and the graphical representation can be used. • Examples of top down parsing are LL (Left-to-right, Leftmost derivation) parsing, recursive descent parsing, etc.; a minimal recursive-descent sketch follows.
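A minimal hand-written recursive-descent (top-down) parser for the toy grammar above might look like the following; it is only a sketch, with very limited error handling and no backtracking beyond the two NP alternatives.

```python
# A minimal recursive-descent (top-down) parser for the toy grammar
#   S -> NP VP,  NP -> DET N | DET ADJ N,  VP -> V NP
# This is an illustrative sketch with very limited error handling.
LEXICON = {"the": "DET", "a": "DET", "beautiful": "ADJ", "perching": "ADJ",
           "bird": "N", "birds": "N", "grain": "N", "grains": "N",
           "peck": "V", "pecks": "V", "pecking": "V"}

def parse_np(words, i):
    if LEXICON.get(words[i]) == "DET":
        if LEXICON.get(words[i + 1]) == "ADJ" and LEXICON.get(words[i + 2]) == "N":
            return ("NP", words[i], words[i + 1], words[i + 2]), i + 3
        if LEXICON.get(words[i + 1]) == "N":
            return ("NP", words[i], words[i + 1]), i + 2
    raise ValueError("no NP at position %d" % i)

def parse_vp(words, i):
    if LEXICON.get(words[i]) == "V":
        np, j = parse_np(words, i + 1)
        return ("VP", words[i], np), j
    raise ValueError("no VP at position %d" % i)

def parse_sentence(sentence):
    words = sentence.lower().split()
    np, i = parse_np(words, 0)       # expand from the root: S -> NP VP
    vp, j = parse_vp(words, i)
    if j != len(words):
        raise ValueError("trailing words left over")
    return ("S", np, vp)

print(parse_sentence("The bird pecks the grains"))
```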
  • 61. Bottom up Parsing • In this parsing technique the process begins with the sentence, and the words of the sentence are replaced by their relevant symbols. • It is also called shift-reduce parsing. • In bottom up parsing the construction of the parse tree starts at the leaves and proceeds towards the root. • Bottom up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first and then to infer higher-order structures from them. • This process occurs in the analysis of both natural languages and computer languages. • It is common for bottom up parsers to take the form of general parsing engines that can either parse or generate a parser for a specific programming language, given a specification of its grammar. • A toy shift-reduce sketch is given below.
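A toy shift-reduce loop over the same grammar shows the bottom-up idea: words are shifted onto a stack and reduced to their categories until a sentence symbol remains. This is only a sketch; real shift-reduce parsers resolve shift/reduce conflicts and ambiguity far more carefully.

```python
# A toy shift-reduce (bottom-up) parser sketch for the same grammar.
# Real shift-reduce parsers resolve shift/reduce conflicts; this sketch
# simply reduces greedily whenever the top of the stack matches a rule.
LEXICON = {"the": "DET", "bird": "N", "pecks": "V", "grains": "N"}
RULES = [(("DET", "ADJ", "N"), "NP"),
         (("DET", "N"), "NP"),
         (("V", "NP"), "VP"),
         (("NP", "VP"), "S")]

def shift_reduce(sentence):
    stack, words = [], sentence.lower().split()
    while words or len(stack) > 1 or (stack and stack[-1] != "S"):
        for rhs, lhs in RULES:                       # try to reduce
            if tuple(stack[-len(rhs):]) == rhs:
                del stack[-len(rhs):]
                stack.append(lhs)
                break
        else:                                        # otherwise shift
            if not words:
                raise ValueError("cannot parse: %s" % stack)
            stack.append(LEXICON[words.pop(0)])
    return stack[0]

print(shift_reduce("The bird pecks the grains"))     # -> 'S'
```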
  • 63. Semantic Analysis • Semantic Analysis is the process of drawing meaning from text. • It allows computers to understand and interpret sentences, paragraphs, or whole documents by analysing their grammatical structure and identifying relationships between individual words in a particular context. • It is an essential sub-task of Natural Language Processing (NLP) and the driving force behind machine-learning tools like chatbots, search engines, and text analysis. • Semantic-analysis-driven tools can help companies automatically extract meaningful information from unstructured data, such as emails, support tickets, and customer feedback.
  • 64. How Semantic Analysis Works • Lexical semantics plays an important role in semantic analysis, allowing machines to understand relationships between lexical items (words, phrasal verbs, etc.): • Hyponyms: specific lexical items of a generic lexical item (the hypernym), e.g. orange is a hyponym of fruit (the hypernym). • Meronymy: a logical arrangement of text and words that denotes a constituent part of or member of something, e.g. a segment of an orange. • Polysemy: a relationship between words or phrases whose meanings, although slightly different, share a common core, e.g. “I read a paper” and “I wrote a paper”. • Synonyms: words that have the same or nearly the same meaning as another, e.g. happy, content, ecstatic, overjoyed. • Antonyms: words that have close to opposite meanings, e.g. happy, sad. • Homonyms: two words that sound the same and are spelled alike but have different meanings, e.g. orange (color), orange (fruit). • Several of these relations can be explored in code, as sketched below.
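These lexical relations can be examined with WordNet through NLTK; a small sketch, assuming the wordnet corpus has been downloaded (e.g. via nltk.download("wordnet")).

```python
# Exploring lexical relations with WordNet via NLTK; assumes the
# 'wordnet' corpus has been downloaded.
from nltk.corpus import wordnet as wn

fruit = wn.synsets("fruit")[0]
print([s.lemma_names() for s in fruit.hyponyms()][:5])   # hyponyms of "fruit"

happy = wn.synsets("happy", pos=wn.ADJ)[0]
print(happy.lemma_names())                               # synonyms of "happy"
print([ant.name() for lemma in happy.lemmas()
       for ant in lemma.antonyms()])                     # antonym lemmas
```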
  • 65. • Semantic analysis also takes into account signs and symbols (semiotics) and collocations (words that often go together). • Automated semantic analysis works with the help of machine learning algorithms. • By feeding semantically enhanced machine learning algorithms with samples of text, you can train machines to make accurate predictions based on past observations. • There are various sub-tasks involved in a semantic-based approach to machine learning, including the two covered next: Word Sense Disambiguation and Relationship Extraction.
  • 66. Word Sense Disambiguation: • The automated process of identifying in which sense a word is used, according to its context. • Natural language is ambiguous and polysemic; sometimes the same word can have different meanings depending on how it is used. • The word “orange,” for example, can refer to a color, a fruit, or even a city in Florida! • The same happens with the word “date,” which can mean either a particular day of the month, a fruit, or a meeting. • A simple Lesk-based sketch follows.
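NLTK ships a simple WSD baseline, the Lesk algorithm, which picks the WordNet sense whose dictionary definition overlaps most with the surrounding context; a quick sketch, assuming the wordnet corpus is available.

```python
# Word sense disambiguation with the simplified Lesk algorithm in NLTK.
# Assumes nltk and its 'wordnet' corpus are installed.
from nltk.wsd import lesk

ctx1 = "I went to the bank to deposit my money".split()
ctx2 = "The river overflowed the bank after the rain".split()

for ctx in (ctx1, ctx2):
    sense = lesk(ctx, "bank")      # may return None if no overlap is found
    print(" ".join(ctx), "->", sense,
          "-", sense.definition() if sense else "?")
```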
  • 67. • Relationship Extraction • This task consists of detecting the semantic relationships present in a text. Relationships usually involve two or more entities (which can be names of people, places, company names, etc.). These entities are connected through a semantic category, such as “works at,” “lives in,” “is the CEO of,” “headquartered at.” • For example, the phrase “Steve Jobs is one of the founders of Apple, which is headquartered in California” contains two different relationships: (Steve Jobs, founder of, Apple) and (Apple, headquartered in, California). A toy pattern-based sketch is given below.
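A very small pattern-based sketch of relationship extraction over that example sentence; the patterns are invented for illustration, and a real system would rely on named-entity recognition and dependency parsing rather than regular expressions.

```python
# A toy pattern-based relationship extractor for the example sentence.
# The patterns below are invented for illustration only; real systems rely
# on named-entity recognition and dependency parsing.
import re

sentence = ("Steve Jobs is one of the founders of Apple, "
            "which is headquartered in California")

patterns = [
    (r"(?P<e1>[A-Z][\w ]+?) is one of the founders of (?P<e2>[A-Z]\w+)",
     "founder of"),
    (r"(?P<e1>[A-Z]\w+), which is headquartered in (?P<e2>[A-Z]\w+)",
     "headquartered in"),
]

for pattern, relation in patterns:
    m = re.search(pattern, sentence)
    if m:
        print((m.group("e1"), relation, m.group("e2")))
```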
  • 77. Dictionary • Also known as the UNL Dictionary. • It stores concepts, represented by the words of a language. • It stores universal words for identifying concepts, word headings that can express those concepts, and information on their syntactic behaviour. • Each entry consists of a correspondence between a concept and a word, along with information concerning its syntactic properties. • The grammar for defining the words of a language in the dictionary associates each headword with a universal word and a set of attributes; a rough illustrative sketch of such an entry follows.
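As a rough illustration only (the exact entry syntax is defined by the UNL specifications and is not reproduced here), a dictionary entry linking an English headword to a universal word and its syntactic attributes could be modelled as follows; the field names and the example universal word are assumptions made for this sketch.

```python
# A hedged sketch of what a UNL dictionary entry records; the field names
# and the example universal word are illustrative, not the official UNL
# dictionary syntax.
entry = {
    "headword": "book",                        # word of the natural language
    "universal_word": "book(icl>publication)", # concept identifier (UW), assumed
    "attributes": ["N", "SG"],                 # syntactic behaviour information
}

def lookup(dictionary, headword):
    """Return all entries whose headword matches, i.e. candidate concepts."""
    return [e for e in dictionary if e["headword"] == headword]

print(lookup([entry], "book"))
```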