Pushdown automata are recognizing automata equipped with a single stack (memory) that operates in a last-in first-out manner, pushing and popping elements. They are similar to finite automata but carry this auxiliary stack memory. Pushdown automata are in general nondeterministic, as there may be more than one transition for a given input symbol. They recognize the context-free languages, the languages defined by context-free grammars.
2. Pushdown Automata
• A Pushdown Automaton is a Finite Automaton with additional auxiliary memory in the form of a stack.
• Elements are removed from the stack on a LIFO (Last In, First Out) basis.
3. A PDA has:
• A read-only input tape
• An input alphabet
• A finite state control
• A set of final states
• An initial state
• In addition, a stack (the "pushdown store")
• The pushdown store is read-write: elements can be added to the stack and removed from it
• Based on its current state, the input symbol read, and the topmost stack symbol, the PDA moves to a new state and writes (pushes) a string of symbols onto the stack
4. Pushdown Automata
• Pushdown automata are for context-free languages what finite automata are for regular languages.
• PDAs are recognizing automata that have a single stack (memory): last-in first-out pushing and popping.
• Difference: PDAs are inherently nondeterministic. (They are not practical machines.)
6. Types of PDA
• Deterministic PDA: there exists at most one transition for each combination of state, input symbol, and stack top.
• Nondeterministic PDA: there may exist more than one transition for a given input symbol.
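A nondeterministic PDA can be simulated by exploring all reachable configurations. Below is a minimal Python sketch; the transition encoding and the example language {0^n 1^n} are illustrative, and acceptance here requires a final state together with an empty stack.

```python
from collections import deque

def pda_accepts(s, transitions, start, finals):
    # Transitions map (state, input_symbol_or_None, stack_top_or_None)
    # to a set of (next_state, string_to_push); None means "no constraint"
    # on the stack top, or an epsilon move for the input symbol.
    # Breadth-first search over configurations (state, position, stack).
    seen = set()
    queue = deque([(start, 0, "")])          # stack kept as a string, top at end
    while queue:
        state, i, stack = queue.popleft()
        if (state, i, stack) in seen:
            continue
        seen.add((state, i, stack))
        if i == len(s) and state in finals and not stack:
            return True                      # accept: final state, empty stack
        for (st, sym, top), moves in transitions.items():
            if st != state:
                continue
            if sym is not None and (i >= len(s) or s[i] != sym):
                continue
            if top is not None and (not stack or stack[-1] != top):
                continue
            base = stack[:-1] if top is not None else stack
            for nxt, push in moves:
                queue.append((nxt, i + (sym is not None), base + push))
    return False

# Example PDA for {0^n 1^n : n >= 0}: push an X per 0, pop an X per 1.
T = {
    ("q0", "0", None): {("q0", "X")},
    ("q0", "1", "X"): {("q1", "")},
    ("q1", "1", "X"): {("q1", "")},
}
```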
24. Top-Down Parsing
• Start at the root of the derivation tree and fill in.
• Pick a production and try to match the input.
• May require backtracking.
27. Bottom-Up Parsing
• Start at the leaves and fill in.
• Start in a state valid for legal first tokens.
• As input is consumed, change state to encode possibilities.
• Use a stack to store both states and sentential forms.
31. LL(k) Grammars
• In this section we present techniques for top-down parsing that can be applied to a certain subclass of context-free languages, and illustrate them by means of some examples. We discuss LL(1) parsing, LL(k) parsing, left factoring, and the technique to remove left recursion.
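As a sketch of the left-recursion removal mentioned above: a rule A → Aα | β becomes A → βA', A' → αA' | ε. A minimal Python version of this transformation (the encoding of productions as symbol lists is illustrative):

```python
def remove_left_recursion(a, prods):
    # Split productions of nonterminal `a` into left-recursive ones (A -> A alpha)
    # and base cases (A -> beta).
    recursive = [p[1:] for p in prods if p and p[0] == a]
    base = [p for p in prods if not p or p[0] != a]
    if not recursive:
        return {a: prods}                   # nothing to do
    new = a + "'"                           # fresh nonterminal A'
    return {
        a: [b + [new] for b in base],       # A  -> beta A'
        new: [r + [new] for r in recursive] # A' -> alpha A' | epsilon
             + [[]],
    }

# E -> E + T | T  becomes  E -> T E' ;  E' -> + T E' | epsilon
g = remove_left_recursion("E", [["E", "+", "T"], ["T"]])
```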
35. Bottom-Up Parsing
• Start at the leaves and grow toward the root.
• As input is consumed, encode possibilities in an internal state.
• A powerful parsing technology.
• LR grammars:
– Construct a right-most derivation of the program.
– Allow left-recursive grammars; virtually all programming languages are left-recursive.
– Easier to express syntax.
36. Bottom-Up Parsing
• Right-most derivation (constructed in reverse):
– Start with the tokens.
– End with the start symbol.
– Match a substring against the RHS of a production and replace it by the LHS.
– Shift-reduce parsers:
• Parsers for LR grammars
• Automatic parser generators (yacc, bison)
37. Bottom-Up Parsing
• Example:
S → S + E | E
E → num | (S)
(1+2+(3+4))+5 → (E+2+(3+4))+5 → (S+2+(3+4))+5 → (S+E+(3+4))+5 → (S+(3+4))+5 → (S+(E+4))+5 → (S+(S+4))+5 → (S+(S+E))+5 → (S+(S))+5 → (S+E)+5 → (S)+5 → E+5 → S+5 → S+E → S
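The reduction sequence above can be sketched as a toy shift-reduce parser for this grammar. The priority rules below stand in for a real LR table and are an illustrative simplification, not the full algorithm.

```python
# Grammar:  S -> S + E | E      E -> num | ( S )

def shift_reduce(tokens):
    stack, pos = [], 0
    while True:
        # Try reductions, most specific first.
        if stack[-3:] == ["S", "+", "E"]:
            stack[-3:] = ["S"]               # S -> S + E
        elif stack[-3:] == ["(", "S", ")"]:
            stack[-3:] = ["E"]               # E -> ( S )
        elif stack and stack[-1] == "num":
            stack[-1] = "E"                  # E -> num
        elif stack and stack[-1] == "E" and (len(stack) == 1 or stack[-2] == "("):
            stack[-1] = "S"                  # unit reduction S -> E
        elif pos < len(tokens):
            stack.append(tokens[pos])        # shift the next token
            pos += 1
        else:
            break
    return stack == ["S"]                    # accept iff everything reduced to S

def tokenize(s):
    # Single-character tokens; each digit becomes a 'num' token.
    return ["num" if c.isdigit() else c for c in s]
```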
38. Bottom-Up Parsing
• Advantage:
– Can postpone the selection of productions until more of the input is scanned.
[Figure: side-by-side parse trees comparing top-down and bottom-up parsing under the grammar S → S + E | E, E → num | (S); the bottom-up parser has more time to decide what rules to apply.]
51. Universal TM
• In computer science, a universal Turing machine (UTM) is a Turing machine that can simulate an arbitrary Turing machine on arbitrary input.
• The universal machine essentially achieves this by reading both the description of the machine to be simulated and that machine's input from its own tape.
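The idea can be sketched as a simulator that takes an arbitrary machine description (here a transition table) together with the input, both as data. The encoding and the example machine are illustrative.

```python
def run_tm(delta, start, accept, reject, tape, blank="_", max_steps=10_000):
    # delta maps (state, symbol) -> (next_state, symbol_to_write, "L" or "R").
    tape = dict(enumerate(tape))             # sparse tape, blanks elsewhere
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        if state == reject:
            return False
        sym = tape.get(head, blank)
        state, write, move = delta[(state, sym)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step limit exceeded")

# Description of a machine deciding whether a unary string has even length.
delta = {
    ("even", "1"): ("odd", "1", "R"),
    ("odd", "1"): ("even", "1", "R"),
    ("even", "_"): ("acc", "_", "R"),
    ("odd", "_"): ("rej", "_", "R"),
}
```

Because the table is ordinary data, the same `run_tm` simulates any machine handed to it, which is the essence of universality.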
67. Computational Complexity: Measuring Time & Space Complexity
• Space complexity
• The better the time complexity of an algorithm, the faster the algorithm carries out its work in practice. Apart from time complexity, space complexity is also important: this is essentially the number of memory cells which an algorithm needs. A good algorithm keeps this number as small as possible, too.
• There is often a time-space trade-off involved in a problem; that is, the problem cannot be solved with both little computing time and low memory consumption. One then has to make a compromise and exchange computing time for memory consumption or vice versa, depending on which algorithm one chooses and how one parameterizes it.
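A minimal illustration of trading memory for computing time, using only the standard library: the naive recursive Fibonacci needs little memory but exponential time, while caching intermediate results spends linear memory to obtain linear time.

```python
from functools import lru_cache

def fib_slow(n):
    # Low memory, high time: recomputes the same subproblems exponentially often.
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # Stores every intermediate result: linear time at the cost of linear memory.
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)
```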
68. Measuring Time Complexity
• Time complexity
• How long does this sorting program run? On large inputs (that is, many strings) it may take a very long time until the program has completed its work and gives a sign of life again. Sometimes it makes sense to be able to estimate the running time before starting a program.
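One rough way to estimate running time empirically is to time the program on inputs of growing size. A sketch using only the standard library; absolute numbers depend on the machine, so only the growth trend is meaningful.

```python
import random
import time

def measure(sort_fn, sizes):
    # Time sort_fn on random inputs of each size; returns {size: seconds}.
    out = {}
    for n in sizes:
        data = [random.random() for _ in range(n)]
        t0 = time.perf_counter()
        sort_fn(data)
        out[n] = time.perf_counter() - t0
    return out

timings = measure(sorted, [1_000, 10_000, 100_000])
```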