Module-2
Parsing
Topics to be covered
• Role of parser
• Context free grammar
• Derivation & Ambiguity
• Left recursion & Left factoring
• Classification of parsing
• Backtracking
• LL(1) parsing
• Recursive descent parsing
• Shift reduce parsing
• Operator precedence parsing
• LR parsing
Role of Parser
• The parser obtains a string of tokens from the lexical analyzer and reports a syntax error if one occurs; otherwise it generates the syntax tree.
• There are two types of parser:
1. Top-down parser
2. Bottom-up parser
[Figure: the parser in the compiler front end — the lexical analyzer reads the source program and supplies a token whenever the parser issues "get next token"; the parser builds the parse tree, which the rest of the front end turns into IR; both phases consult the symbol table.]
Context free grammar
• A context free grammar (CFG) is a 4-tuple 𝐺 = (𝑉, 𝛴, 𝑆, 𝑃) where,
𝑉 is a finite set of non terminals,
𝛴 is a finite set of terminals, disjoint from 𝑉,
𝑆 ∈ 𝑉 is the start symbol,
𝑃 is a finite set of productions of the form 𝐴 → 𝛼 where 𝐴 ∈ 𝑉 and 𝛼 ∈ (𝑉 ∪ 𝛴)∗
 Nonterminal symbol:
 The name of a syntax category of a language, e.g., noun, verb, etc.
 It is written as a single capital letter, or as a name enclosed between < … >, e.g., A or <Noun>
<Noun Phrase> → <Article><Noun>
<Article> → a | an | the
<Noun> → boy | apple
 Terminal symbol:
 A symbol in the alphabet.
 It is denoted by a lower case letter or a punctuation mark used in the language.
 Start symbol:
 The first nonterminal symbol of the grammar is called the start symbol.
 Production:
 A production, also called a rewriting rule, is a rule of the grammar. It has the form:
A nonterminal symbol → string of terminal and nonterminal symbols
Example: Grammar
Write the terminals, non terminals, start symbol, and productions for the following
grammar.
E → E O E | (E) | -E | id
O → + | - | * | / | ↑
Terminals: id + - * / ↑ ( )
Non terminals: E, O
Start symbol: E
Productions: E → E O E | (E) | -E | id
O → + | - | * | / | ↑
Derivation & Ambiguity
Derivation
• Derivation is used to check whether a string belongs to the language of a given grammar.
• Types of derivations are:
1. Leftmost derivation
2. Rightmost derivation
Leftmost derivation
• A derivation of a string 𝑊 in a grammar 𝐺 is a leftmost derivation if at every
step the leftmost non terminal is replaced.
• Grammar: S → S+S | S-S | S*S | S/S | a      Output string: a*a-a
S ⇒ S-S ⇒ S*S-S ⇒ a*S-S ⇒ a*a-S ⇒ a*a-a
[Parse tree: root S with children S - S; the left S expands to S * S with leaves a, a; the right S is a. The parse tree represents the structure of the derivation.]
Rightmost derivation
• A derivation of a string W in a grammar G is a rightmost derivation if at every
step the rightmost non terminal is replaced.
• It is also called canonical derivation.
• Grammar: S → S+S | S-S | S*S | S/S | a      Output string: a*a-a
S ⇒ S*S ⇒ S*S-S ⇒ S*S-a ⇒ S*a-a ⇒ a*a-a
[Parse tree: root S with children S * S; the left S is a; the right S expands to S - S with leaves a, a.]
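The two derivation orders differ only in which non terminal is rewritten at each step. A small sketch (the `step` helper and symbol-list representation are my own illustration, assuming the grammar above):

```python
# One derivation step on a sentential form (a list of symbols): replace the
# leftmost or rightmost non terminal by a chosen right-hand side.
def step(form, nonterminals, rhs, leftmost=True):
    idxs = [i for i, x in enumerate(form) if x in nonterminals]
    i = idxs[0] if leftmost else idxs[-1]
    return form[:i] + rhs + form[i + 1:]

NT = {"S"}
# Leftmost derivation of a*a-a with S -> S-S | S*S | a
form = ["S"]
form = step(form, NT, ["S", "-", "S"])   # S => S-S
form = step(form, NT, ["S", "*", "S"])   # => S*S-S
form = step(form, NT, ["a"])             # => a*S-S
form = step(form, NT, ["a"])             # => a*a-S
form = step(form, NT, ["a"])             # => a*a-a
print("".join(form))                     # a*a-a
```

Passing `leftmost=False` to every call instead walks the rightmost derivation of the same string.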
Exercise: Derivation
1. Perform leftmost derivation and draw the parse tree.
S → A1B
A → 0A | 𝜖
B → 0B | 1B | 𝜖
Output string: 1001
2. Perform leftmost derivation and draw the parse tree.
S → 0S1 | 01      Output string: 000111
3. Perform rightmost derivation and draw the parse tree.
E → E+E | E*E | id | (E) | -E
Output string: id + id * id
Ambiguous grammar
Ambiguity
• In a formal language grammar, ambiguity arises if an identical string can occur
on the RHS of two or more productions.
• Grammar:
N1 → α
N2 → α
• α can be derived from either N1 or N2
[Diagram: should 𝛼 be replaced by 𝑵𝟏 or by 𝑵𝟐? Either choice is possible.]
Ambiguous grammar
• An ambiguous grammar is one that produces more than one leftmost derivation or
more than one rightmost derivation for the same sentence.
• Grammar: S → S+S | S*S | (S) | a      Output string: a+a*a
Leftmost derivation 1: S ⇒ S*S ⇒ S+S*S ⇒ a+S*S ⇒ a+a*S ⇒ a+a*a
Leftmost derivation 2: S ⇒ S+S ⇒ a+S ⇒ a+S*S ⇒ a+a*S ⇒ a+a*a
• Here, two leftmost derivations of the string a+a*a are possible; hence the above
grammar is ambiguous.
[Parse trees: the first derivation yields a tree with S * S at the root and S + S as the left subtree; the second yields S + S at the root and S * S as the right subtree.]
Exercise: Ambiguous Grammar
Check ambiguity in the following grammars:
1. S → aS | Sa | 𝜖 (output string: aaaa)
2. S → aSbS | bSaS | 𝜖 (output string: abab)
3. S → SS+ | SS* | a (output string: aa+a*)
4. <exp> → <exp> + <term> | <term>
<term> → <term> * <letter> | <letter>
<letter> → a|b|c|…|z (output string: a+b*c)
5. Prove that the CFG with productions S → a | Sa | bSS | SSb | SbS is
ambiguous (hint: choose an output string yourself)
Left recursion & Left factoring
Left recursion
• A grammar is said to be left recursive if it has a non terminal 𝐴 such that there
is a derivation 𝑨 ⇒ 𝑨𝜶 for some string 𝛼.
Algorithm to eliminate left recursion
1. Arrange the non terminals in some order 𝐴1, … , 𝐴𝑛
2. for 𝑖 := 1 to 𝑛 do begin
for 𝑗 := 1 to 𝑖 − 1 do begin
replace each production of the form 𝐴𝑖 → 𝐴𝑗𝛾
by the productions 𝐴𝑖 → 𝛿1𝛾 | 𝛿2𝛾 | … | 𝛿𝑘𝛾,
where 𝐴𝑗 → 𝛿1 | 𝛿2 | … | 𝛿𝑘 are all the current 𝐴𝑗
productions;
end
eliminate the immediate left recursion among the 𝐴𝑖 productions
end
Immediate left recursion elimination:
𝐴 → 𝐴𝛼 | 𝛽      becomes      𝐴 → 𝛽𝐴’
𝐴’ → 𝛼𝐴’ | 𝜖
Examples: Left recursion elimination
EE+T | T
ETE’
E’+TE’ | ε
TT*F | F
TFT’
T’*FT’ | ε
XX%Y | Z
XZX’
X’%YX’ | ε
Exercise: Left recursion
1. AAbd | Aa | a
BBe | b
2. AAB | AC | a | b
3. SA | B
AABC | Acd | a | aa
BBee | b
4. ExpExp+term | Exp-term | term
Left factoring
Left factoring is a grammar transformation that is useful for producing a grammar
suitable for predictive parsing.
Algorithm to left factor a grammar
Input: Grammar G
Output: An equivalent left factored grammar.
Method:
For each non terminal A, find the longest prefix 𝛼 common to two or more of its
alternatives. If 𝛼 ≠ 𝜖, i.e., there is a non trivial common prefix, replace all the A
productions 𝐴 → 𝛼𝛽1 | 𝛼𝛽2 | … | 𝛼𝛽𝑛 | 𝛾, where 𝛾 represents all alternatives that
do not begin with 𝛼, by
𝐴 → 𝛼𝐴′ | 𝛾
𝐴′ → 𝛽1 | 𝛽2 | … | 𝛽𝑛
Here A′ is a new non terminal. Repeatedly apply this transformation until no two
alternatives for a non terminal have a common prefix.
Left factoring elimination:
𝐴 → 𝛼𝛽 | 𝛼δ      becomes      𝐴 → 𝛼𝐴′
𝐴′ → 𝛽 | δ
Example: Left factoring elimination
SaAB | aCD
SaS’
S’AB | CD
A xByA | xByAzA | a
A xByAA’ | a
A’ Є | zA
A aAB | aA |a
AaA’
A’AB | A | 𝜖
A’AA’’ | 𝜖
A’’B | 𝜖
Exercise
1. SiEtS | iEtSeS | a
2. A ad | a | ab | abc | x
Parsing
Parsing
• Parsing is a technique that takes an input string and produces as output either a
parse tree, if the string is a valid sentence of the grammar, or an error message
indicating that the string is not valid.
• Types of parsing are:
1. Top down parsing: a top down parser builds the parse tree from the top (root)
to the bottom.
2. Bottom up parsing: a bottom up parser starts from the leaves and works up to
the root.
Classification of parsing methods
• Top down parsing
1. Backtracking
2. Parsing without backtracking (predictive parsing): recursive descent, LL(1)
• Bottom up parsing (shift reduce)
1. Operator precedence parsing
2. LR parsing: SLR, CLR, LALR
Backtracking
• In backtracking, to expand a nonterminal symbol we choose one alternative; if a
mismatch occurs, we backtrack and try another alternative.
• Grammar: S → cAd      Input string: cad
A → ab | a
Make a prediction: expand S to c A d.
Make a prediction: expand A with A → ab, giving c a b d — this mismatches the input cad.
Backtrack: retry A with A → a, giving c a d — parsing done.
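The predict-and-backtrack behaviour for this small grammar can be sketched directly (a hand-written illustration for S → cAd, A → ab | a, not a general parser):

```python
# Top down parsing with backtracking for  S -> c A d,  A -> ab | a.
def parse_A(s, i):
    # Try the alternatives of A in order, yielding every possible end position;
    # the caller backtracks by simply asking for the next one.
    if s[i:i + 2] == "ab":
        yield i + 2
    if s[i:i + 1] == "a":
        yield i + 1

def parse_S(s):
    if not s.startswith("c"):
        return False
    for j in parse_A(s, 1):       # expand A; on mismatch, try the next choice
        if s[j:] == "d":
            return True
    return False

print(parse_S("cad"))   # True  (A -> ab fails, backtrack, A -> a succeeds)
print(parse_S("cd"))    # False
```

Generators make the backtracking explicit: each `yield` is one prediction, and resuming the loop is the backtrack step.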
Exercise
1. E 5+T | 3-T
T V | V*V | V+V
V a | b
String: 3-a+b
LL(1) parser (predictive parser)
• LL(1) is a non recursive top down parser.
1. The first L indicates that the input is scanned from left to right.
2. The second L means it uses a leftmost derivation for the input string.
3. The 1 means it uses one input symbol of lookahead to predict the parsing action.
[Figure: model of a table driven predictive parser — an input buffer (e.g., a + b $), a stack (X Y Z $), the predictive parsing program, and the parsing table M together produce the output.]
LL(1) parsing (predictive parsing)
Steps to construct LL(1) parser
1. Remove left recursion / Perform left factoring (if any).
2. Compute FIRST and FOLLOW of non terminals.
3. Construct predictive parsing table.
4. Parse the input string using parsing table.
Rules to compute FIRST of a non terminal
1. If 𝐴 → 𝛼 and 𝛼 is a terminal, add 𝛼 to 𝐹𝐼𝑅𝑆𝑇(𝐴).
2. If 𝐴 → 𝜖, add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝐴).
3. If 𝑋 is a nonterminal and 𝑋 → 𝑌1𝑌2 … 𝑌𝑘 is a production, then place 𝑎 in
𝐹𝐼𝑅𝑆𝑇(𝑋) if for some 𝑖, 𝑎 is in 𝐹𝐼𝑅𝑆𝑇(𝑌𝑖) and 𝜖 is in all of
𝐹𝐼𝑅𝑆𝑇(𝑌1), … , 𝐹𝐼𝑅𝑆𝑇(𝑌𝑖−1); that is, 𝑌1 … 𝑌𝑖−1 ⇒ 𝜖. If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝑌𝑗)
for all 𝑗 = 1, 2, … , 𝑘, then add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝑋).
Everything in 𝐹𝐼𝑅𝑆𝑇(𝑌1) is surely in 𝐹𝐼𝑅𝑆𝑇(𝑋). If 𝑌1 does not derive 𝜖, then
we add nothing more to 𝐹𝐼𝑅𝑆𝑇(𝑋); but if 𝑌1 ⇒ 𝜖, then we add 𝐹𝐼𝑅𝑆𝑇(𝑌2),
and so on.
Simplification of Rule 3
If 𝐴 → 𝑌1𝑌2 … 𝑌𝑘:
• If 𝑌1 does not derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = 𝐹𝐼𝑅𝑆𝑇(𝑌1)
• If 𝑌1 derives 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌2)
• If 𝑌1 and 𝑌2 derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌2) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌3)
• If 𝑌1, 𝑌2 and 𝑌3 derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌2) − 𝜖) ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌3) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌4)
• If 𝑌1, 𝑌2, … , 𝑌𝑘 all derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌2) − 𝜖) ∪ … ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌𝑘) − 𝜖)
(note: if all symbols derive 𝜖, then also add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝐴))
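These rules amount to a fixed-point computation. A sketch over the grammar used in Example-2 later in this module (S → aB | 𝜖, B → bC | 𝜖, C → cS | 𝜖), with `"eps"` standing for 𝜖 and the dictionary layout my own convention:

```python
# Fixed-point computation of FIRST for every non terminal.
GRAMMAR = {
    "S": [["a", "B"], []],   # [] is the eps-alternative
    "B": [["b", "C"], []],
    "C": [["c", "S"], []],
}

def compute_first(grammar):
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:                             # repeat until nothing new appears
        changed = False
        for nt, alts in grammar.items():
            for alt in alts:
                before = len(first[nt])
                nullable = True
                for sym in alt:
                    if sym in grammar:                     # non terminal
                        first[nt] |= first[sym] - {"eps"}
                        if "eps" not in first[sym]:
                            nullable = False
                            break
                    else:                                  # terminal: rule 1
                        first[nt].add(sym)
                        nullable = False
                        break
                if nullable:                               # whole RHS can vanish
                    first[nt].add("eps")
                changed |= len(first[nt]) != before
    return first

# FIRST(S) = {a, eps}, FIRST(B) = {b, eps}, FIRST(C) = {c, eps}
print(compute_first(GRAMMAR))
```

The inner loop is exactly the Rule 3 simplification: keep absorbing FIRST(Yi) minus 𝜖 while the prefix stays nullable.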
Rules to compute FOLLOW of a non terminal
1. Place $ in 𝐹𝑂𝐿𝐿𝑂𝑊(𝑆). (S is the start symbol)
2. If 𝐴 → 𝛼𝐵𝛽, then everything in 𝐹𝐼𝑅𝑆𝑇(𝛽) except 𝜖 is placed in
𝐹𝑂𝐿𝐿𝑂𝑊(𝐵).
3. If there is a production 𝐴 → 𝛼𝐵, or a production 𝐴 → 𝛼𝐵𝛽 where 𝐹𝐼𝑅𝑆𝑇(𝛽)
contains 𝜖, then everything in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴) is in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐵).
How to apply the rules to find FOLLOW of a non terminal, for 𝐴 → 𝛼𝐵𝛽:
• If 𝛽 is absent: apply Rule 3.
• If 𝛽 is a terminal: apply Rule 2.
• If 𝛽 is a nonterminal that does not derive 𝜖: apply Rule 2.
• If 𝛽 is a nonterminal that derives 𝜖: apply Rule 2 + Rule 3.
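FOLLOW is computed the same way, by iterating the three rules to a fixed point. A sketch for the same three-production grammar used above, with the FIRST sets hard-coded and `"eps"`/`"$"` as my own markers:

```python
# Fixed-point computation of FOLLOW, given FIRST.
FIRST = {"S": {"a", "eps"}, "B": {"b", "eps"}, "C": {"c", "eps"}}
GRAMMAR = {"S": [["a", "B"], []], "B": [["b", "C"], []], "C": [["c", "S"], []]}

def first_of_string(symbols):
    """FIRST of a sentential form (the beta after B in A -> alpha B beta)."""
    out = set()
    for sym in symbols:
        f = FIRST.get(sym, {sym})        # a terminal's FIRST is itself
        out |= f - {"eps"}
        if "eps" not in f:
            return out
    out.add("eps")                       # every symbol was nullable (or beta empty)
    return out

def compute_follow(grammar, start):
    follow = {nt: set() for nt in grammar}
    follow[start].add("$")                           # rule 1
    changed = True
    while changed:
        changed = False
        for a, alts in grammar.items():
            for alt in alts:
                for i, b in enumerate(alt):
                    if b not in grammar:             # only non terminals get FOLLOW
                        continue
                    before = len(follow[b])
                    tail = first_of_string(alt[i + 1:])
                    follow[b] |= tail - {"eps"}      # rule 2
                    if "eps" in tail:                # rule 3 (beta absent or nullable)
                        follow[b] |= follow[a]
                    changed |= len(follow[b]) != before
    return follow

print(compute_follow(GRAMMAR, "S"))   # every FOLLOW set here is {$}
```

For this grammar each non terminal sits at the right end of its production, so Rule 3 propagates { $ } everywhere, matching the table in Example-2.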
Rules to construct predictive parsing table
1. For each production 𝐴 → 𝛼 of the grammar, do steps 2 and 3.
2. For each terminal 𝑎 in 𝐹𝐼𝑅𝑆𝑇(𝛼), add 𝐴 → 𝛼 to 𝑀[𝐴, 𝑎].
3. If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝛼), add 𝐴 → 𝛼 to 𝑀[𝐴, 𝑏] for each terminal 𝑏 in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴).
If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝛼) and $ is in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴), add 𝐴 → 𝛼 to 𝑀[𝐴, $].
4. Make each undefined entry of M be error.
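Filling the table is a direct transcription of these rules. A sketch using the Example-1 grammar below (S → aBa, B → bB | 𝜖), with FIRST/FOLLOW hard-coded and the `first_of` helper simplified because no RHS here starts with a nullable non terminal:

```python
# Build M[A, a] from FIRST/FOLLOW for  S -> aBa,  B -> bB | eps.
FIRST  = {"S": {"a"}, "B": {"b", "eps"}}
FOLLOW = {"S": {"$"}, "B": {"a"}}
PRODS  = [("S", ["a", "B", "a"]), ("B", ["b", "B"]), ("B", [])]

def first_of(rhs):
    if not rhs:
        return {"eps"}                    # the eps-production
    return FIRST.get(rhs[0], {rhs[0]})    # enough here: no nullable prefixes

table = {}
for a, rhs in PRODS:                      # rule 1: visit every production
    f = first_of(rhs)
    for t in f - {"eps"}:
        table[(a, t)] = rhs               # rule 2
    if "eps" in f:
        for t in FOLLOW[a]:               # rule 3 ($ is handled the same way)
            table[(a, t)] = rhs

print(table)
# {('S', 'a'): ['a', 'B', 'a'], ('B', 'b'): ['b', 'B'], ('B', 'a'): []}
```

Any (non terminal, terminal) pair missing from `table` is rule 4's error entry; a duplicate assignment to the same key would signal that the grammar is not LL(1).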
Example-1: LL(1) parsing
Grammar: S → aBa
B → bB | ϵ
Step 1: Not required (no left recursion and no common prefixes).
Step 2: Compute FIRST
• S → aBa starts with the terminal a, so by Rule 1, FIRST(S) = { a }.
• B → bB starts with the terminal b (Rule 1), and B → 𝜖 adds 𝜖 (Rule 2), so FIRST(B) = { b, 𝜖 }.

NT | FIRST
S  | { a }
B  | { b, 𝜖 }
Example-1: LL(1) parsing
Step 2: Compute FOLLOW
• Rule 1: place $ in FOLLOW(S), so FOLLOW(S) = { $ }.
• In S → aBa, B is followed by the terminal a, so by Rule 2, FOLLOW(B) = { a }.

NT | FIRST    | FOLLOW
S  | { a }    | { $ }
B  | { b, 𝜖 } | { a }
Example-1: LL(1) parsing
Step 3: Prepare the predictive parsing table
• S → aBa: FIRST(aBa) = { a }, so add M[S, a] = S → aBa (Rule 2).
• B → bB: FIRST(bB) = { b }, so add M[B, b] = B → bB (Rule 2).
• B → ϵ: 𝜖 is in FIRST, and FOLLOW(B) = { a }, so add M[B, a] = B → ϵ (Rule 3).
• Make every remaining entry Error (Rule 4).

NT | a       | b       | $
S  | S → aBa | Error   | Error
B  | B → ϵ   | B → bB  | Error
Example-2: LL(1) parsing
Grammar: S → aB | ϵ
B → bC | ϵ
C → cS | ϵ
Step 1: Not required.
Step 2: Compute FIRST
• S → aB starts with a (Rule 1), and S → ϵ adds 𝜖 (Rule 2), so FIRST(S) = { a, 𝜖 }.
• B → bC starts with b, and B → ϵ adds 𝜖, so FIRST(B) = { b, 𝜖 }.
• C → cS starts with c, and C → ϵ adds 𝜖, so FIRST(C) = { c, 𝜖 }.

NT | FIRST
S  | { a, 𝜖 }
B  | { b, 𝜖 }
C  | { c, 𝜖 }
Example-2: LL(1) parsing
Step 2: Compute FOLLOW
• Rule 1: place $ in FOLLOW(S), so FOLLOW(S) = { $ }.
• S → aB: B is at the right end, so by Rule 3, FOLLOW(B) = FOLLOW(S) = { $ }.
• B → bC: C is at the right end, so FOLLOW(C) = FOLLOW(B) = { $ }.
• C → cS: S is at the right end, so FOLLOW(S) also absorbs FOLLOW(C); nothing new is added.

NT | FIRST    | FOLLOW
S  | { a, 𝜖 } | { $ }
B  | { b, 𝜖 } | { $ }
C  | { c, 𝜖 } | { $ }
Example-2: LL(1) parsing
Step 3: Prepare the predictive parsing table
• S → aB: FIRST(aB) = { a }, so M[S, a] = S → aB (Rule 2).
• S → ϵ: FOLLOW(S) = { $ }, so M[S, $] = S → ϵ (Rule 3).
• B → bC: FIRST(bC) = { b }, so M[B, b] = B → bC (Rule 2).
• B → ϵ: FOLLOW(B) = { $ }, so M[B, $] = B → ϵ (Rule 3).
• C → cS: FIRST(cS) = { c }, so M[C, c] = C → cS (Rule 2).
• C → ϵ: FOLLOW(C) = { $ }, so M[C, $] = C → ϵ (Rule 3).
• Make every remaining entry Error (Rule 4).

NT | a      | b      | c      | $
S  | S → aB | Error  | Error  | S → ϵ
B  | Error  | B → bC | Error  | B → ϵ
C  | Error  | Error  | C → cS | C → ϵ
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
Levi Shapiro
 
Additional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdfAdditional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdf
joachimlavalley1
 
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
MysoreMuleSoftMeetup
 
CACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdfCACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdf
camakaiclarkmusic
 

Recently uploaded (20)

The Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdfThe Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdf
 
678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf678020731-Sumas-y-Restas-Para-Colorear.pdf
678020731-Sumas-y-Restas-Para-Colorear.pdf
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
 
Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
 
Model Attribute Check Company Auto Property
Model Attribute  Check Company Auto PropertyModel Attribute  Check Company Auto Property
Model Attribute Check Company Auto Property
 
Honest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptxHonest Reviews of Tim Han LMA Course Program.pptx
Honest Reviews of Tim Han LMA Course Program.pptx
 
Francesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptxFrancesca Gottschalk - How can education support child empowerment.pptx
Francesca Gottschalk - How can education support child empowerment.pptx
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
 
The basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptxThe basics of sentences session 5pptx.pptx
The basics of sentences session 5pptx.pptx
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
 
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
 
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXXPhrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
 
Home assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdfHome assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdf
 
Thesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.pptThesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.ppt
 
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...
 
Additional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdfAdditional Benefits for Employee Website.pdf
Additional Benefits for Employee Website.pdf
 
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
Mule 4.6 & Java 17 Upgrade | MuleSoft Mysore Meetup #46
 
CACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdfCACJapan - GROUP Presentation 1- Wk 4.pdf
CACJapan - GROUP Presentation 1- Wk 4.pdf
 

5-Introduction to Parsing and Context Free Grammar-09-05-2023.pptx

Context free grammar
• A context free grammar (CFG) is a 4-tuple 𝐺 = (𝑉, 𝛴, 𝑆, 𝑃) where,
𝑉 is a finite set of non terminals,
𝛴 is a disjoint finite set of terminals,
𝑆 is an element of 𝑉 and is the start symbol,
𝑃 is a finite set of formulas of the form 𝐴 → 𝛼 where 𝐴 ∈ 𝑉 and 𝛼 ∈ (𝑉 ∪ 𝛴)∗
 Start symbol:
 The first nonterminal symbol of the grammar is called the start symbol.
<Noun Phrase> → <Article><Noun>
<Article> → a | an | the
<Noun> → boy | apple
Context free grammar
• A context free grammar (CFG) is a 4-tuple 𝐺 = (𝑉, 𝛴, 𝑆, 𝑃) where,
𝑉 is a finite set of non terminals,
𝛴 is a disjoint finite set of terminals,
𝑆 is an element of 𝑉 and is the start symbol,
𝑃 is a finite set of formulas of the form 𝐴 → 𝛼 where 𝐴 ∈ 𝑉 and 𝛼 ∈ (𝑉 ∪ 𝛴)∗
 Production:
 A production, also called a rewriting rule, is a rule of the grammar. It has the form
Nonterminal symbol → String of terminal and nonterminal symbols
<Noun Phrase> → <Article><Noun>
<Article> → a | an | the
<Noun> → boy | apple
Example: Grammar
Write terminals, non terminals, start symbol, and productions for the following grammar.
E → E O E | (E) | -E | id
O → + | - | * | / | ↑
Terminals: id + - * / ↑ ( )
Non terminals: E, O
Start symbol: E
Productions:
E → E O E | (E) | -E | id
O → + | - | * | / | ↑
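The 4-tuple definition above can be sketched directly in Python. This is an illustrative sketch (the names `V`, `sigma`, `start`, `P`, and `is_well_formed` are mine, not from the slides), using the expression grammar just shown, with `^` standing in for ↑:

```python
# CFG G = (V, Sigma, S, P) for E -> E O E | (E) | -E | id, O -> + | - | * | / | ^
V = {"E", "O"}                                   # non terminals
sigma = {"id", "+", "-", "*", "/", "^", "(", ")"}  # terminals
start = "E"                                      # start symbol, an element of V
P = {                                            # productions A -> alpha
    "E": [["E", "O", "E"], ["(", "E", ")"], ["-", "E"], ["id"]],
    "O": [["+"], ["-"], ["*"], ["/"], ["^"]],
}

def is_well_formed(V, sigma, start, P):
    """Check the CFG conditions: V and Sigma disjoint, start in V,
    and every production A -> alpha has A in V and alpha in (V u Sigma)*."""
    if V & sigma or start not in V:
        return False
    return all(a in V and all(x in V | sigma for x in alt)
               for a, alts in P.items() for alt in alts)

print(is_well_formed(V, sigma, start, P))  # True
```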
Derivation
• Derivation is used to find whether a string belongs to a given grammar or not.
• Types of derivations are:
1. Leftmost derivation
2. Rightmost derivation
Leftmost derivation
• A derivation of a string 𝑊 in a grammar 𝐺 is a leftmost derivation if at every step the leftmost non terminal is replaced.
• Grammar: S→S+S | S-S | S*S | S/S | a
• Output string: a*a-a
Leftmost derivation:
S ⇒ S-S ⇒ S*S-S ⇒ a*S-S ⇒ a*a-S ⇒ a*a-a
• The parse tree represents the structure of the derivation.
Rightmost derivation
• A derivation of a string W in a grammar G is a rightmost derivation if at every step the rightmost non terminal is replaced.
• It is also called canonical derivation.
• Grammar: S→S+S | S-S | S*S | S/S | a
• Output string: a*a-a
Rightmost derivation:
S ⇒ S*S ⇒ S*S-S ⇒ S*S-a ⇒ S*a-a ⇒ a*a-a
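The leftmost derivation of a*a-a above can be replayed mechanically: at each step, find the leftmost nonterminal and substitute the chosen right-hand side. A minimal sketch (the helper name `leftmost_step` is mine):

```python
# Grammar: S -> S+S | S-S | S*S | S/S | a
prods = {"S": [["S", "+", "S"], ["S", "-", "S"], ["S", "*", "S"],
               ["S", "/", "S"], ["a"]]}
nonterminals = set(prods)

def leftmost_step(sentential, rhs):
    """Replace the leftmost nonterminal in the sentential form with rhs."""
    for i, sym in enumerate(sentential):
        if sym in nonterminals:
            return sentential[:i] + rhs + sentential[i + 1:]
    raise ValueError("no nonterminal left to expand")

# S => S-S => S*S-S => a*S-S => a*a-S => a*a-a
form = ["S"]
for rhs in [["S", "-", "S"], ["S", "*", "S"], ["a"], ["a"], ["a"]]:
    form = leftmost_step(form, rhs)
print("".join(form))  # a*a-a
```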
Exercise: Derivation
1. Perform leftmost derivation and draw the parse tree.
S→A1B
A→0A | 𝜖
B→0B | 1B | 𝜖
Output string: 1001
2. Perform leftmost derivation and draw the parse tree.
S→0S1 | 01
Output string: 000111
3. Perform rightmost derivation and draw the parse tree.
E→E+E | E*E | id | (E) | -E
Output string: id + id * id
Ambiguity
• In a formal language grammar, ambiguity arises if an identical string can occur on the RHS of two or more productions:
N1 → 𝛼
N2 → 𝛼
• Here 𝛼 can be derived from either 𝑁1 or 𝑁2, so it is not clear which nonterminal should be used.
Ambiguous grammar
• An ambiguous grammar is one that produces more than one leftmost or more than one rightmost derivation for the same sentence.
• Grammar: S→S+S | S*S | (S) | a
• Output string: a+a*a
Derivation 1: S ⇒ S+S ⇒ a+S ⇒ a+S*S ⇒ a+a*S ⇒ a+a*a
Derivation 2: S ⇒ S*S ⇒ S+S*S ⇒ a+S*S ⇒ a+a*S ⇒ a+a*a
• Here, two leftmost derivations for the string a+a*a are possible; hence the above grammar is ambiguous.
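Since every parse tree corresponds to exactly one leftmost derivation, ambiguity can be checked for a particular string by brute force: count the distinct leftmost derivations that yield it. A sketch for the grammar above (the function name is mine; this only works because no right-hand side here shrinks the sentential form unboundedly):

```python
# Grammar: S -> S+S | S*S | (S) | a
prods = {"S": [["S", "+", "S"], ["S", "*", "S"], ["(", "S", ")"], ["a"]]}

def count_leftmost(form, target):
    """Count distinct leftmost derivations of target from the sentential form."""
    if len(form) > len(target):            # forms never shrink in this grammar
        return 0
    i = 0
    while i < len(form) and form[i] != "S":
        i += 1
    if form[:i] != list(target[:i]):       # terminal prefix must match target
        return 0
    if i == len(form):                     # no nonterminal left
        return 1 if form == list(target) else 0
    # expand the leftmost S with every alternative
    return sum(count_leftmost(form[:i] + rhs + form[i + 1:], target)
               for rhs in prods["S"])

print(count_leftmost(["S"], "a+a*a"))  # 2, so the grammar is ambiguous
```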
Exercise: Ambiguous grammar
Check ambiguity in the following grammars:
1. S→aS | Sa | 𝜖 (output string: aaaa)
2. S→aSbS | bSaS | 𝜖 (output string: abab)
3. S→SS+ | SS* | a (output string: aa+a*)
4. <exp> → <exp> + <term> | <term>
<term> → <term> * <letter> | <letter>
<letter> → a|b|c|…|z (output string: a+b*c)
5. Prove that the CFG with productions S → a | Sa | bSS | SSb | SbS is ambiguous. (Hint: choose an output string yourself.)
Left recursion & Left factoring
Left recursion
• A grammar is said to be left recursive if it has a non terminal 𝐴 such that there is a derivation 𝐴 ⇒ 𝐴𝛼 for some string 𝛼.
Algorithm to eliminate left recursion
1. Arrange the non terminals in some order 𝐴1, … , 𝐴𝑛.
2. For 𝑖 := 1 to 𝑛 do begin
for 𝑗 := 1 to 𝑖 − 1 do begin
replace each production of the form 𝐴𝑖 → 𝐴𝑗𝛾 by the productions 𝐴𝑖 → 𝛿1𝛾 | 𝛿2𝛾 | … | 𝛿𝑘𝛾, where 𝐴𝑗 → 𝛿1 | 𝛿2 | … | 𝛿𝑘 are all the current 𝐴𝑗 productions;
end
eliminate the immediate left recursion among the 𝐴𝑖 productions
end
Left recursion elimination
𝐴 → 𝐴𝛼 | 𝛽 becomes
𝐴 → 𝛽𝐴’
𝐴’ → 𝛼𝐴’ | 𝜖
Examples: Left recursion elimination
E→E+T | T becomes E→TE’, E’→+TE’ | ε
T→T*F | F becomes T→FT’, T’→*FT’ | ε
X→X%Y | Z becomes X→ZX’, X’→%YX’ | ε
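The A → 𝛽A’ transformation for immediate left recursion can be sketched as a small function. This is an illustrative sketch (function and variable names are mine; "eps" stands for 𝜖):

```python
def eliminate_immediate_left_recursion(head, alts):
    """A -> A a1 | ... | A am | b1 | ... | bn  becomes
    A -> b1 A' | ... | bn A'  and  A' -> a1 A' | ... | am A' | eps."""
    rec = [alt[1:] for alt in alts if alt and alt[0] == head]   # the alphas
    non = [alt for alt in alts if not alt or alt[0] != head]    # the betas
    if not rec:
        return {head: alts}          # nothing to do
    new = head + "'"
    return {
        head: [alt + [new] for alt in non],
        new: [alt + [new] for alt in rec] + [["eps"]],
    }

# E -> E+T | T  becomes  E -> T E',  E' -> + T E' | eps
g = eliminate_immediate_left_recursion("E", [["E", "+", "T"], ["T"]])
print(g)
```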
Exercise: Left recursion
1. A→Abd | Aa | a
B→Be | b
2. A→AB | AC | a | b
3. S→A | B
A→ABC | Acd | a | aa
B→Bee | b
4. Exp→Exp+term | Exp-term | term
Left factoring
Left factoring is a grammar transformation that is useful for producing a grammar suitable for predictive parsing.
Algorithm to left factor a grammar
Input: Grammar G
Output: An equivalent left factored grammar.
Method: For each non terminal A, find the longest prefix 𝛼 common to two or more of its alternatives. If 𝛼 ≠ 𝜖, i.e., there is a non trivial common prefix, replace all the A productions
𝐴 → 𝛼𝛽1 | 𝛼𝛽2 | … | 𝛼𝛽𝑛 | 𝛾, where 𝛾 represents all alternatives that do not begin with 𝛼, by
𝐴 → 𝛼𝐴′ | 𝛾
𝐴′ → 𝛽1 | 𝛽2 | … | 𝛽𝑛
Here A′ is a new non terminal. Repeatedly apply this transformation until no two alternatives for a non terminal have a common prefix.
Left factoring elimination
𝐴 → 𝛼𝛽 | 𝛼δ becomes
𝐴 → 𝛼𝐴′
𝐴′ → 𝛽 | δ
Example: Left factoring elimination
1. S→aAB | aCD becomes S→aS’, S’→AB | CD
2. A→xByA | xByAzA | a becomes A→xByAA’ | a, A’→𝜖 | zA
3. A→aAB | aA | a becomes A→aA’, A’→AB | A | 𝜖, and factoring A’ again gives A’→AA’’ | 𝜖, A’’→B | 𝜖
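One step of the left-factoring transformation can be sketched as follows. This is an illustrative sketch (function names are mine; "eps" stands for 𝜖); as the algorithm says, it would be applied repeatedly until no common prefix remains:

```python
def common_prefix(alts):
    """Longest common prefix of a list of symbol lists."""
    prefix = []
    for syms in zip(*alts):
        if len(set(syms)) == 1:
            prefix.append(syms[0])
        else:
            break
    return prefix

def left_factor_once(head, alts):
    """Factor one nontrivial common prefix alpha out of head's alternatives."""
    groups = {}
    for alt in alts:                       # group alternatives by first symbol
        groups.setdefault(alt[0], []).append(alt)
    for first_sym, group in groups.items():
        if len(group) < 2:
            continue                       # no common prefix in this group
        alpha = common_prefix(group)
        new = head + "'"
        rest = [alt for alt in alts if alt not in group]   # the gamma alternatives
        return {
            head: [alpha + [new]] + rest,
            new: [alt[len(alpha):] or ["eps"] for alt in group],
        }
    return {head: alts}

# S -> iEtS | iEtSeS | a  becomes  S -> iEtS S' | a,  S' -> eps | eS
g = left_factor_once("S", [list("iEtS"), list("iEtSeS"), ["a"]])
print(g)
```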
Exercise
1. S→iEtS | iEtSeS | a
2. A→ad | a | ab | abc | x
Parsing
• Parsing is a technique that takes an input string and produces either a parse tree, if the string is a valid sentence of the grammar, or an error message indicating that the string is not valid.
• Types of parsing are:
1. Top down parsing: the parser builds the parse tree from top to bottom.
2. Bottom up parsing: the parser starts from the leaves and works up to the root.
Classification of parsing methods
• Top down parsing:
1. Backtracking
2. Parsing without backtracking (predictive parsing): recursive descent, LL(1)
• Bottom up parsing (shift reduce):
1. Operator precedence parsing
2. LR parsing: SLR, CLR, LALR
Backtracking
• In backtracking, to expand a nonterminal symbol we choose one alternative; if any mismatch occurs, then we try another alternative.
• Grammar: S→cAd, A→ab | a
• Input string: cad
• The parser matches c, then predicts A→ab; the b does not match the input d, so it backtracks and predicts A→a. Now d matches and parsing is done.
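The try-one-alternative-then-backtrack behaviour can be sketched with a recursive parser that yields every position at which a symbol can finish matching; trying the alternatives in order and falling through to the next one is exactly the backtracking step. An illustrative sketch (names are mine) for S → cAd, A → ab | a:

```python
prods = {"S": [["c", "A", "d"]], "A": [["a", "b"], ["a"]]}

def parse(symbol, s, pos):
    """Yield every end position at which `symbol` can match starting at s[pos]."""
    if symbol not in prods:                      # terminal: must match literally
        if pos < len(s) and s[pos] == symbol:
            yield pos + 1
        return
    for rhs in prods[symbol]:                    # try alternatives in order;
        ends = [pos]                             # a dead end = backtrack
        for sym in rhs:
            ends = [e2 for e in ends for e2 in parse(sym, s, e)]
        yield from ends

def accepts(s):
    return any(end == len(s) for end in parse("S", s, 0))

print(accepts("cad"))   # True: A -> ab fails at d, backtrack to A -> a
print(accepts("cabd"))  # True
print(accepts("cbd"))   # False
```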
Exercise
1. E→5+T | 3-T
T→V | V*V | V+V
V→a | b
String: 3-a+b
LL(1) parser (predictive parser)
• LL(1) is a non recursive top down parser.
1. The first L indicates the input is scanned from left to right.
2. The second L means it uses leftmost derivation for the input string.
3. The 1 means it uses only one input symbol of lookahead to predict the parsing process.
• The parser consists of an input buffer (e.g., a + b $), a stack (e.g., X Y Z $), a parsing table M, and a predictive parsing program that produces the output.
LL(1) parsing (predictive parsing)
Steps to construct an LL(1) parser:
1. Remove left recursion / perform left factoring (if any).
2. Compute FIRST and FOLLOW of non terminals.
3. Construct the predictive parsing table.
4. Parse the input string using the parsing table.
Rules to compute FIRST of a non terminal
1. If 𝐴 → 𝛼 and 𝛼 is a terminal, add 𝛼 to 𝐹𝐼𝑅𝑆𝑇(𝐴).
2. If 𝐴 → 𝜖, add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝐴).
3. If 𝑋 is a nonterminal and 𝑋 → 𝑌1𝑌2 … 𝑌𝑘 is a production, then place 𝑎 in 𝐹𝐼𝑅𝑆𝑇(𝑋) if for some 𝑖, 𝑎 is in 𝐹𝐼𝑅𝑆𝑇(𝑌𝑖) and 𝜖 is in all of 𝐹𝐼𝑅𝑆𝑇(𝑌1), … , 𝐹𝐼𝑅𝑆𝑇(𝑌𝑖−1); that is, 𝑌1 … 𝑌𝑖−1 ⇒ 𝜖. If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝑌𝑗) for all 𝑗 = 1, 2, … , 𝑘, then add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝑋). Everything in 𝐹𝐼𝑅𝑆𝑇(𝑌1) is surely in 𝐹𝐼𝑅𝑆𝑇(𝑋). If 𝑌1 does not derive 𝜖, then we add nothing more to 𝐹𝐼𝑅𝑆𝑇(𝑋), but if 𝑌1 ⇒ 𝜖, then we add 𝐹𝐼𝑅𝑆𝑇(𝑌2), and so on.
Rules to compute FIRST of a non terminal
Simplification of Rule 3: if 𝐴 → 𝑌1𝑌2 … 𝑌𝑘,
• If 𝑌1 does not derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = 𝐹𝐼𝑅𝑆𝑇(𝑌1)
• If 𝑌1 derives 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌2)
• If 𝑌1 and 𝑌2 derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌2) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌3)
• In general, if 𝑌1, … , 𝑌𝑖 all derive 𝜖, then 𝐹𝐼𝑅𝑆𝑇(𝐴) = (𝐹𝐼𝑅𝑆𝑇(𝑌1) − 𝜖) ∪ … ∪ (𝐹𝐼𝑅𝑆𝑇(𝑌𝑖) − 𝜖) ∪ 𝐹𝐼𝑅𝑆𝑇(𝑌𝑖+1)
(Note: if all of 𝑌1, … , 𝑌𝑘 derive 𝜖, then add 𝜖 to 𝐹𝐼𝑅𝑆𝑇(𝐴).)
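The three FIRST rules above can be sketched as a fixed-point computation: keep applying the rules until no FIRST set grows. An illustrative sketch (names are mine; "eps" stands for 𝜖), using the grammar from Example-2 below:

```python
# Grammar: S -> aB | eps, B -> bC | eps, C -> cS | eps
prods = {
    "S": [["a", "B"], ["eps"]],
    "B": [["b", "C"], ["eps"]],
    "C": [["c", "S"], ["eps"]],
}

def first_sets(prods):
    """Iterate the FIRST rules to a fixed point ("eps" marks epsilon)."""
    first = {A: set() for A in prods}
    changed = True
    while changed:
        changed = False
        for A, alts in prods.items():
            for alt in alts:
                before = len(first[A])
                for sym in alt:
                    if sym == "eps" or sym not in prods:   # rule 1 / rule 2
                        first[A].add(sym)
                        break
                    first[A] |= first[sym] - {"eps"}       # rule 3
                    if "eps" not in first[sym]:
                        break
                else:                                      # every Yi derived eps
                    first[A].add("eps")
                if len(first[A]) != before:
                    changed = True
    return first

print({A: sorted(f) for A, f in first_sets(prods).items()})
# {'S': ['a', 'eps'], 'B': ['b', 'eps'], 'C': ['c', 'eps']}
```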
Rules to compute FOLLOW of a non terminal
1. Place $ in 𝐹𝑂𝐿𝐿𝑂𝑊(𝑆). (𝑆 is the start symbol)
2. If 𝐴 → 𝛼𝐵𝛽, then everything in 𝐹𝐼𝑅𝑆𝑇(𝛽) except 𝜖 is placed in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐵).
3. If there is a production 𝐴 → 𝛼𝐵, or a production 𝐴 → 𝛼𝐵𝛽 where 𝐹𝐼𝑅𝑆𝑇(𝛽) contains 𝜖, then everything in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴) is placed in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐵).
How to apply the rules to find FOLLOW of a non terminal?
For 𝐴 → 𝛼𝐵𝛽:
• If 𝛽 is absent, apply Rule 3.
• If 𝛽 is a terminal, apply Rule 2.
• If 𝛽 is a nonterminal that does not derive 𝜖, apply Rule 2.
• If 𝛽 is a nonterminal that derives 𝜖, apply Rule 2 + Rule 3.
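Like FIRST, the FOLLOW rules can be iterated to a fixed point. A sketch (names are mine; "eps" stands for 𝜖) for the standard expression grammar E→TE’, E’→+TE’ | 𝜖, T→FT’, T’→*FT’ | 𝜖, F→(E) | id, with FIRST sets assumed already computed:

```python
prods = {
    "E": [["T", "E'"]],
    "E'": [["+", "T", "E'"], ["eps"]],
    "T": [["F", "T'"]],
    "T'": [["*", "F", "T'"], ["eps"]],
    "F": [["(", "E", ")"], ["id"]],
}
first = {"E": {"(", "id"}, "E'": {"+", "eps"},
         "T": {"(", "id"}, "T'": {"*", "eps"}, "F": {"(", "id"}}

def first_of(seq):
    """FIRST of a string of grammar symbols (a terminal's FIRST is itself)."""
    out = set()
    for sym in seq:
        f = first.get(sym, {sym})
        out |= f - {"eps"}
        if "eps" not in f:
            return out
    out.add("eps")                     # every symbol in seq can derive eps
    return out

def follow_sets(start):
    follow = {A: set() for A in prods}
    follow[start].add("$")             # rule 1
    changed = True
    while changed:
        changed = False
        for A, alts in prods.items():
            for alt in alts:
                for i, B in enumerate(alt):
                    if B not in prods:
                        continue       # only nonterminals get FOLLOW sets
                    beta = [s for s in alt[i + 1:] if s != "eps"]
                    fb = first_of(beta)
                    before = len(follow[B])
                    follow[B] |= fb - {"eps"}    # rule 2
                    if "eps" in fb:
                        follow[B] |= follow[A]   # rule 3
                    changed |= len(follow[B]) != before
    return follow

follow = follow_sets("E")
# FOLLOW(E) = FOLLOW(E') = { $, ) }; FOLLOW(T) = FOLLOW(T') = { +, $, ) };
# FOLLOW(F) = { *, +, $, ) }
```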
Rules to construct the predictive parsing table
1. For each production 𝐴 → 𝛼 of the grammar, do steps 2 and 3.
2. For each terminal 𝑎 in 𝐹𝐼𝑅𝑆𝑇(𝛼), add 𝐴 → 𝛼 to 𝑀[𝐴, 𝑎].
3. If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝛼), add 𝐴 → 𝛼 to 𝑀[𝐴, 𝑏] for each terminal 𝑏 in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴). If 𝜖 is in 𝐹𝐼𝑅𝑆𝑇(𝛼) and $ is in 𝐹𝑂𝐿𝐿𝑂𝑊(𝐴), add 𝐴 → 𝛼 to 𝑀[𝐴, $].
4. Make each undefined entry of 𝑀 an error.
Example-1: LL(1) parsing
Grammar: S→aBa, B→bB | ϵ
Step 1: Not required (no left recursion and no common prefixes).
Step 2: Compute FIRST
• S→aBa: by Rule 1 (the RHS starts with a terminal), FIRST(S) = { a }
• B→bB: by Rule 1, add b; B→𝜖: by Rule 2, add 𝜖. So FIRST(B) = { b, 𝜖 }
Step 2: Compute FOLLOW
• Rule 1: place $ in FOLLOW(S), so FOLLOW(S) = { $ }
• S→aBa: B is followed by 𝛽 = a, so by Rule 2 add FIRST(a) = { a } to FOLLOW(B); FOLLOW(B) = { a }
NT	FIRST	FOLLOW
S	{ a }	{ $ }
B	{ b, 𝜖 }	{ a }
Example-1: LL(1) parsing
Step 3: Prepare the predictive parsing table
• S→aBa: FIRST(aBa) = { a }, so by Rule 2, M[S,a] = S→aBa
• B→bB: FIRST(bB) = { b }, so by Rule 2, M[B,b] = B→bB
• B→ϵ: 𝜖 ∈ FIRST(𝜖), so by Rule 3 use FOLLOW(B) = { a }: M[B,a] = B→𝜖
NT	a	b	$
S	S→aBa	Error	Error
B	B→𝜖	B→bB	Error
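Table rules 2 and 3 can be applied mechanically once FIRST and FOLLOW are known. A sketch for Example-1's grammar S→aBa, B→bB | 𝜖 (names are mine; the FIRST/FOLLOW values are the ones computed above, and absent table keys play the role of Error entries):

```python
follow = {"S": {"$"}, "B": {"a"}}
first_of_rhs = {                       # FIRST(alpha) for each production A -> alpha
    ("S", ("a", "B", "a")): {"a"},
    ("B", ("b", "B")): {"b"},
    ("B", ("eps",)): {"eps"},
}

table = {}
for (A, alpha), f in first_of_rhs.items():
    for a in f - {"eps"}:              # rule 2: a in FIRST(alpha) => M[A,a] = A->alpha
        table[A, a] = alpha
    if "eps" in f:                     # rule 3: b in FOLLOW(A)  => M[A,b] = A->alpha
        for b in follow[A]:
            table[A, b] = alpha

print(table)
```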
Example-2: LL(1) parsing
Grammar: S→aB | ϵ, B→bC | ϵ, C→cS | ϵ
Step 1: Not required.
Step 2: Compute FIRST
• S→aB (Rule 1) and S→𝜖 (Rule 2): FIRST(S) = { a, 𝜖 }
• B→bC (Rule 1) and B→𝜖 (Rule 2): FIRST(B) = { b, 𝜖 }
• C→cS (Rule 1) and C→𝜖 (Rule 2): FIRST(C) = { c, 𝜖 }
Example-2: LL(1) parsing
Step 2: Compute FOLLOW
• Rule 1: place $ in FOLLOW(S), so FOLLOW(S) = { $ }
• S→aB: by Rule 3, FOLLOW(B) gets FOLLOW(S) = { $ }
• B→bC: by Rule 3, FOLLOW(C) gets FOLLOW(B) = { $ }
• C→cS: by Rule 3, FOLLOW(S) gets FOLLOW(C); nothing new is added.
So FOLLOW(S) = FOLLOW(B) = FOLLOW(C) = { $ }
Example-2: LL(1) parsing
Step 3: Prepare the predictive parsing table
• S→aB: FIRST(aB) = { a }, so by Rule 2, M[S,a] = S→aB
• S→𝜖: by Rule 3 use FOLLOW(S) = { $ }: M[S,$] = S→𝜖
• B→bC: FIRST(bC) = { b }, so by Rule 2, M[B,b] = B→bC
• B→𝜖: by Rule 3 use FOLLOW(B) = { $ }: M[B,$] = B→𝜖
• C→cS: FIRST(cS) = { c }, so by Rule 2, M[C,c] = C→cS
• C→𝜖: by Rule 3 use FOLLOW(C) = { $ }: M[C,$] = C→𝜖
NT	a	b	c	$
S	S→aB	Error	Error	S→𝜖
B	Error	B→bC	Error	B→𝜖
C	Error	Error	C→cS	C→𝜖
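With Example-2's table in hand, the non-recursive predictive-parsing program is a simple stack loop: expand a nonterminal using the table entry for the current lookahead, match terminals, and report an error on an undefined entry. An illustrative sketch (names are mine; absent table keys are the Error entries):

```python
# LL(1) table for S -> aB | eps, B -> bC | eps, C -> cS | eps
table = {
    ("S", "a"): ["a", "B"], ("S", "$"): [],   # [] encodes an epsilon production
    ("B", "b"): ["b", "C"], ("B", "$"): [],
    ("C", "c"): ["c", "S"], ("C", "$"): [],
}
nonterminals = {"S", "B", "C"}

def ll1_parse(tokens):
    stack = ["$", "S"]                   # start symbol on top of $
    tokens = list(tokens) + ["$"]
    i = 0
    while stack:
        top = stack.pop()
        if top == "$" and tokens[i] == "$":
            return True                  # input consumed, stack empty: accept
        if top in nonterminals:
            rhs = table.get((top, tokens[i]))
            if rhs is None:
                return False             # undefined entry: error
            stack.extend(reversed(rhs))  # push RHS so its leftmost symbol is on top
        elif top == tokens[i]:
            i += 1                       # match a terminal
        else:
            return False
    return False

print(ll1_parse("abc"))   # True: S => aB => abC => abcS => abc
print(ll1_parse("ba"))    # False
```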