PUSHDOWN AUTOMATA
• A Pushdown Automaton (PDA) is a Finite Automaton augmented with auxiliary memory in the form of a stack.
• Elements are pushed onto and removed from this stack on a LIFO (Last In, First Out) basis.
Pushdown Automata
• Has a read-only input tape
• An input alphabet
• Finite state control
• A set of final states
• An initial state
• In addition, it has a stack, the "pushdown store".
• The pushdown store is read/write: elements can be added to the PDA's stack or removed from it.
• Based on its current state, the input symbol being read, and the topmost stack symbol, the PDA moves to a new state and writes (pushes) a string of symbols onto the stack.
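To make the move just described concrete, here is a minimal sketch (not from the slides; the state and symbol names are made up) of a PDA transition table and one move that pops the stack top and pushes a replacement string:

```python
# Sketch of a single PDA move (illustrative only; hypothetical names).
# A move depends on (current state, input symbol, top of stack) and yields
# (new state, string to push in place of the popped top symbol).

delta = {
    ("q0", "a", "Z"): ("q0", "AZ"),  # push A above the bottom marker Z
    ("q0", "a", "A"): ("q0", "AA"),  # push another A for each further 'a'
    ("q0", "b", "A"): ("q1", ""),    # pop A (push nothing) on reading 'b'
}

def move(state, symbol, stack):
    """Apply one move; the stack is a string whose last character is the top."""
    top = stack[-1]
    new_state, push = delta[(state, symbol, top)]
    # The pushed string is written so its leftmost symbol becomes the new top.
    return new_state, stack[:-1] + push[::-1]

print(move("q0", "a", "Z"))  # ('q0', 'ZA'): A is now the topmost symbol
```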
Pushdown Automata
Pushdown automata are for context-free
languages what finite automata are for regular
languages.
PDAs are recognizing automata that have a single stack (= memory),
with Last-In First-Out pushing and popping.
A key difference from finite automata: PDAs are inherently nondeterministic.
(They are not practical machines.)
Types of PDA
• Deterministic PDA: for each combination of state, input symbol and stack top there exists at most one transition.
• Non-Deterministic PDA: there may exist more than one transition for the same combination of state, input symbol and stack top.
Construct a PDA for { a^n b^n : n ≥ 0 }
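A minimal sketch of such a recognizer (my own illustration, not the slides' construction; the state names are hypothetical), accepting by empty stack:

```python
def accepts_anbn(w):
    """PDA-style recognizer for { a^n b^n : n >= 0 }, acceptance by empty stack.

    Stack discipline: push one 'A' per 'a' read, pop one 'A' per 'b' read.
    """
    stack = []          # pushdown store; top of stack = end of list
    state = "read_a"    # hypothetical state names: read_a -> read_b

    for ch in w:
        if state == "read_a" and ch == "a":
            stack.append("A")          # push a marker for each 'a'
        elif state in ("read_a", "read_b") and ch == "b":
            if not stack:
                return False           # more b's than a's
            stack.pop()                # match one 'a' with this 'b'
            state = "read_b"
        else:
            return False               # 'a' after 'b', or a foreign symbol
    # Accept iff the whole input is consumed and the stack is empty.
    return not stack

print([w for w in ["", "ab", "aabb", "aab", "ba"] if accepts_anbn(w)])
# ['', 'ab', 'aabb']
```

Acceptance here is by empty stack (null store); checking for a designated final state instead would give the final-state acceptance mentioned later in these slides.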
Top Down Parsing
• Start at the root of the derivation tree and fill in downward.
• Pick a production and try to match the input.
• May require backtracking.
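As an illustration (mine, not part of the slides), a small recursive-descent (top-down) parser for the expression grammar S → S + E | E, E → num | (S) used in the bottom-up example later; because that grammar is left-recursive, the sketch parses the equivalent form S → E ('+' E)*:

```python
# Minimal recursive-descent (top-down) parser sketch for
#   S -> S + E | E        E -> num | ( S )
# Because that grammar is left-recursive, this sketch parses the equivalent
# iterative form  S -> E ( '+' E )*.  All names are illustrative.

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expect(tok):
        nonlocal pos
        if peek() != tok:
            raise SyntaxError(f"expected {tok!r}, got {peek()!r}")
        pos += 1

    def parse_S():
        parse_E()
        while peek() == "+":      # iterate instead of left-recursing
            expect("+")
            parse_E()

    def parse_E():
        if peek() == "(":
            expect("(")
            parse_S()
            expect(")")
        elif isinstance(peek(), int):
            expect(peek())        # consume the number token
        else:
            raise SyntaxError(f"unexpected token {peek()!r}")

    parse_S()
    if pos != len(tokens):
        raise SyntaxError("trailing input")

# (1+2+(3+4))+5 parses without error:
parse(["(", 1, "+", 2, "+", "(", 3, "+", 4, ")", ")", "+", 5])
```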
Bottom-Up Parsing
• Start at the leaves and fill in upward.
• Start in a state valid for legal first tokens.
• As input is consumed, change state to encode the possibilities.
• Use a stack to store both states and sentential forms.
ACCEPTANCE BY PDA
• Null store (empty stack) acceptance
• Final state acceptance
Model of PDA
LL(k) Grammar
• In this section we present certain techniques for top-down parsing which can be applied to a certain subclass of context-free languages. We illustrate them by means of some examples. We discuss LL(1) parsing, LL(k) parsing, left factoring and the technique to remove left recursion.
• EXAMPLE
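As a worked illustration of the last two techniques (mine, not the slides' EXAMPLE), the standard schemes for removing immediate left recursion and for left factoring:

```latex
% Removing immediate left recursion (A is a nonterminal; \alpha, \beta are strings):
A \to A\alpha \mid \beta
\quad\Longrightarrow\quad
A \to \beta A', \qquad A' \to \alpha A' \mid \varepsilon

% Left factoring a common prefix \alpha:
A \to \alpha\beta_1 \mid \alpha\beta_2
\quad\Longrightarrow\quad
A \to \alpha A', \qquad A' \to \beta_1 \mid \beta_2
```

Applied to S → S + E | E this gives S → E S', S' → + E S' | ε, which is the form a top-down parser can handle.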
LL(1) Grammar
LL(2) Grammar
LR(k) Grammar
Bottom-Up Parsing
• Start at the leaves and grow toward the root
• As input is consumed, encode the possibilities in an internal state
• A powerful parsing technology
• LR grammars
– Construct the right-most derivation of the program
– Handle left-recursive grammars; virtually all programming languages are left-recursive
– Easier to express syntax
Bottom-Up Parsing
• Right-most derivation
– Start with the tokens
– End with the start symbol
– Match a substring against the RHS of a production and replace it by the LHS
– Shift-reduce parsers
• Parsers for LR grammars
• Automatic parser generators (yacc, bison)
Bottom-Up Parsing
• Example
S → S + E | E
E → num | (S)
(1+2+(3+4))+5
→ (E+2+(3+4))+5 → (S+2+(3+4))+5 → (S+E+(3+4))+5
→ (S+(3+4))+5 → (S+(E+4))+5 → (S+(S+4))+5
→ (S+(S+E))+5 → (S+(S))+5 → (S+E)+5
→ (S)+5 → E+5 → S+5
→ S+E → S
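To mechanize the reduction sequence above, here is a minimal shift-reduce sketch (my own; a real LR parser would drive shifts and reductions from a parse table rather than by greedy pattern matching):

```python
# Shift-reduce sketch (illustrative) for   S -> S + E | E     E -> num | ( S )
# After every shift it greedily reduces the top of the stack.  Production order
# stands in for the parse table: S -> S + E is tried before S -> E so that the
# stack [S, '+', E] reduces correctly.

PRODUCTIONS = [
    ("E", ("num",)),
    ("E", ("(", "S", ")")),
    ("S", ("S", "+", "E")),
    ("S", ("E",)),
]

def shift_reduce(tokens):
    stack = []
    for tok in tokens:
        stack.append(tok)                                  # shift
        while True:                                        # reduce while possible
            for lhs, rhs in PRODUCTIONS:
                if len(stack) >= len(rhs) and tuple(stack[-len(rhs):]) == rhs:
                    del stack[-len(rhs):]
                    stack.append(lhs)                      # replace RHS by LHS
                    break
            else:
                break                                      # nothing matched
    return stack == ["S"]                                  # accepted?

# (1+2+(3+4))+5 with every number tokenized as "num":
tokens = ["(", "num", "+", "num", "+", "(", "num", "+", "num", ")", ")", "+", "num"]
print(shift_reduce(tokens))          # True
print(shift_reduce(["num", "+"]))    # False: dangling '+'
```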
Bottom-Up Parsing
• Advantage
– Can postpone the selection of productions until
more of the input is scanned
[Figure: partial parse trees for S → S + E | E, E → num | (S), contrasting Top-Down Parsing with Bottom-Up Parsing; bottom-up parsing has more time to decide what rules to apply.]
Properties of LR(k) Grammars
• Every LR(k) grammar G is unambiguous.
Properties of LL(k) Grammars
LBA (Linear Bounded Automaton)
LBA Tuples
Types of Turing Machine (Variations)
• Multi-Tape
• Multi-Head
• Two-way Infinite TM
• Non-Deterministic
• Deterministic
• Multi-Dimensional
• Universal
Two-way Infinite TM
Multi-Tape Turing Machine
Multi-Head TM
Non-Deterministic TM
Multi-Dimensional TM
Universal TM
• In computer science, a universal Turing machine (UTM) is a Turing machine that can simulate an arbitrary Turing machine on arbitrary input.
• The universal machine essentially achieves this by reading both the description of the machine to be simulated and that machine's input from its own tape.
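A minimal sketch (mine, assuming a machine described as a Python dictionary) of simulating an arbitrary single-tape TM description on an arbitrary input, which is the essence of what a UTM does:

```python
# Minimal single-tape TM simulator (illustrative).  A universal TM does this on
# its own tape; here the "description" is simply a dictionary transition table:
#   delta[(state, read_symbol)] = (new_state, write_symbol, move)   move in {L, R}

from collections import defaultdict

def run_tm(delta, start, accept, tape_input, blank="_", max_steps=10_000):
    tape = defaultdict(lambda: blank, enumerate(tape_input))
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return True
        key = (state, tape[head])
        if key not in delta:
            return False                       # no move defined: reject
        state, tape[head], move = delta[key]
        head += 1 if move == "R" else -1
    raise RuntimeError("step limit reached (machine may not halt)")

# Example machine (hypothetical): accept strings over {0,1} ending in 0.
delta = {
    ("q0", "0"): ("q0", "0", "R"),
    ("q0", "1"): ("q0", "1", "R"),
    ("q0", "_"): ("q1", "_", "L"),   # hit the blank: step back and check
    ("q1", "0"): ("acc", "0", "R"),
}
print(run_tm(delta, "q0", "acc", "10110"))   # True
print(run_tm(delta, "q0", "acc", "1011"))    # False
```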
Turing Machine Model
Tuples of TM
Halting Problem
Halting Problem of Turing machine
Decidable and Undecidable Problems
PCP (Post Correspondence Problem)
Undecidable Languages
Decidable Languages
Computational Complexity: Measuring Time & Space Complexity
• Space complexity
• The better the time complexity of an algorithm is, the faster the algorithm will carry out its work in practice. Apart from time complexity, its space complexity is also important: this is essentially the number of memory cells which an algorithm needs. A good algorithm keeps this number as small as possible, too.
• There is often a time-space tradeoff involved in a problem; that is, it cannot be solved with both little computing time and low memory consumption. One then has to make a compromise and trade computing time for memory consumption or vice versa, depending on which algorithm one chooses and how one parameterizes it.
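A small illustration of such a time-space tradeoff (my example, not from the slides): answering repeated membership queries either by rescanning the data each time (little extra memory, more time) or by building a hash set first (more memory, fast queries).

```python
# Time-space tradeoff (hypothetical example): repeated membership queries.

def member_no_extra_space(data, queries):
    # O(1) extra space, but each query rescans the list: O(n) time per query.
    return [q in data for q in queries]

def member_with_index(data, queries):
    # O(n) extra space for the hash set, but O(1) expected time per query.
    index = set(data)
    return [q in index for q in queries]

data = list(range(100_000))
queries = [5, 99_999, -1]
assert member_no_extra_space(data, queries) == member_with_index(data, queries)
```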
Measuring Time Complexity
• Time complexity
• How long does this sorting program run? It possibly takes a very long time on large inputs (that is, many strings) until the program has completed its work and gives a sign of life again. Sometimes it makes sense to be able to estimate the running time before starting the program.
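As a quick illustration (mine), counting the basic comparisons of a simple quadratic sort shows how the running time can be estimated before running it on large inputs.

```python
# Counting basic operations to estimate running time (illustrative example).
# Selection sort performs roughly n*(n-1)/2 comparisons, so doubling the input
# size quadruples the work; this lets us predict the cost in advance.

def selection_sort_comparisons(items):
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

for n in (1_000, 2_000, 4_000):
    _, c = selection_sort_comparisons(range(n, 0, -1))
    print(n, c)          # comparisons grow quadratically: n*(n-1)/2
```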
Cellular Automata
List of Questions to be practiced on PDA
Some of the Solutions on PDA Problems
Turing Machine
On the 1^n 2^n 3^n Solution
BEST OF LUCK IN YOUR ETE
