Lex is a program generator designed for lexical processing of character input streams. It works by translating a table of regular expressions and corresponding program fragments provided by the user into a program. This program then reads an input stream, partitions it into strings matching the given expressions, and executes the associated program fragments in order. Flex is a fast lexical analyzer generator that is an alternative to Lex. It generates scanners that recognize lexical patterns in text based on pairs of regular expressions and C code provided by the user.
Lexical Analysis: Tokens, Patterns, Lexemes, Example Patterns, Stages of a Lexical Analyzer, Regular Expressions in Lexical Analysis, Implementation of a Lexical Analyzer, Lexical Analyzer Generators.
Syntax-Directed Translation: Syntax-Directed Definitions, Evaluation Orders for SDD's, Applications of Syntax-Directed Translation, Syntax-Directed Translation Schemes, and Implementing L-Attributed SDD's. Intermediate-Code Generation: Variants of Syntax Trees, Three-Address Code, Types and Declarations, Type Checking, Control Flow, Back patching, Switch-Statements
Passes are an essential organizing concept in a compiler: each pass groups together one or more phases.
Compilers are broadly classified into two types: one-pass (single-pass) compilers and multi-pass compilers.
Any computer system is made of hardware and software.
The hardware understands only machine language, which humans find hard to read, so we write programs in a high-level language that is easier to understand and remember.
These programs are then fed through a series of tools and OS components to produce code the machine can execute.
This chain of tools is known as the language processing system.
2. Lexical Analyzer Generator
• Lex is a program generator designed for lexical
processing of character input streams.
• Lex source is a table of regular expressions and
corresponding program fragments.
• The table is translated to a program which reads an
input stream, copying it to an output stream and
partitioning the input into strings which match the
given expressions.
• The compiler writer uses specialized tools that
produce components that can easily be integrated
in the compiler and help implement various phases
of a compiler.
3. Contd…..
• The program fragments written by the user are executed in the
order in which the corresponding regular expressions occur in
the input stream.
• Lex can generate analyzers in either “C” or “Ratfor”, a language
which can be translated automatically to portable Fortran.
• Lex is a program designed to generate scanners, also known as
tokenizers, which recognize lexical patterns in text.
• Lex is an acronym that stands for "lexical analyzer generator."
• It is intended primarily for Unix-based systems.
• The code for Lex was originally developed by Eric Schmidt and
Mike Lesk.
4. Contd……
• Lex can be used with a parser generator to perform lexical
analysis.
• It is easy, for example, to interface Lex and Yacc, an open
source program that generates code for the parser in the C
programming language.
• Lex can perform simple transformations by itself, but its main purpose is to facilitate lexical analysis: the processing of character sequences, such as source code, to produce symbol sequences called tokens for use as input to other programs such as parsers.
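As a hedged sketch of this idea (the token codes NUMBER and IDENT and the driver loop below are illustrative assumptions; with a real Yacc parser the codes would come from a generated y.tab.h and the parser itself would call yylex()):

%{
#include <stdio.h>
/* Illustrative token codes; with Yacc these would come from y.tab.h. */
enum { NUMBER = 258, IDENT = 259 };
%}
%option noyywrap
%%
[0-9]+                  { return NUMBER; }
[a-zA-Z_][a-zA-Z0-9_]*  { return IDENT;  }
[ \t\n]+                ;                        /* skip whitespace */
.                       { return yytext[0]; /* single-character tokens */ }
%%
int main(void)
{
    int tok;
    while ((tok = yylex()) != 0)   /* yylex() returns 0 at end of input */
        printf("token %d, lexeme \"%s\"\n", tok, yytext);
    return 0;
}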
5. Contd……
Processes in lexical analyzers
– Scanning
  • Pre-processing
    – Strip out comments and white space
    – Macro functions
    – Correlating error messages from the compiler with the source program
      • A line number can be associated with an error message
– Lexical analysis
6. Contd……
Terms of the lexical analyzer
– Token
  • Types of words in the source program
  • Keywords, operators, identifiers, constants, literal strings, punctuation symbols (such as commas, semicolons)
– Lexeme
  • Actual words in the source program
7. Contd…..
– Pattern
  • A rule describing the set of lexemes that can represent a particular token in the source program
  • Example: the relational operators {<, <=, >, >=, ==, <>}
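As a small illustration of the token/pattern/lexeme distinction, a rules-section fragment might read as follows (RELOP and ID are assumed token codes, not something defined on these slides):

"<"|"<="|"=="|"<>"|">"|">="   { return RELOP; /* token RELOP; the lexeme is whichever operator matched */ }
[a-zA-Z][a-zA-Z0-9]*          { return ID;    /* token ID; the lexeme is the identifier text itself    */ }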
Lexical errors – possible recovery actions:
– Deleting an extraneous character
– Inserting a missing character
– Replacing an incorrect character with a correct character
– Pre-scanning
9. What is Lex?
• The main job of a lexical analyzer (scanner) is
to break up an input stream into more usable
elements (tokens)
a = b + c * d;
ID ASSIGN ID PLUS ID MULT ID SEMI
• Lex is a utility to help you rapidly generate your scanners
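For the statement above, a minimal Lex sketch that prints that token sequence might look like the following (printing the token names directly is just for illustration; a real scanner would return token codes to a parser):

%{
#include <stdio.h>
%}
%option noyywrap
%%
[a-zA-Z_][a-zA-Z0-9_]*  { printf("ID ");     }
"="                     { printf("ASSIGN "); }
"+"                     { printf("PLUS ");   }
"*"                     { printf("MULT ");   }
";"                     { printf("SEMI\n");  }
[ \t\n]+                ;   /* skip whitespace */
%%
int main(void) { yylex(); return 0; }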
10. Lex – Lexical Analyzer
• Lexical analyzers tokenize input streams
• Tokens are the terminals of a language
– English
• words, punctuation marks, …
– Programming language
• Identifiers, operators, keywords, …
• Regular expressions define terminals/tokens
• Some examples are
• Flex lexical analyser
• Yacc
• Ragel
• PLY (Python Lex-Yacc)
11. Contd…..
• Lex turns the user's expressions and actions (called source in this memo) into the host general-purpose language.
• The generated program is named yylex.
• The yylex program will recognize expressions in a stream (called input in this memo) and perform the specified actions for each expression as it is detected. See the figure below.
Fig. An overview of Lex:
Source -> Lex -> yylex
Input -> yylex -> Output
12. yylex()
• It is the main entry point for a Lex scanner.
• It reads the input stream, generates tokens, and returns 0 at the end of the input stream.
• It is called to invoke the lexer (scanner).
• Each time yylex() is called, the scanner continues processing the input from where it last left off.
• Related names:
  • yyout
  • yyin
  • yywrap
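A hedged sketch of how these names are typically used from the C code section of a Lex file (the input file name is an assumption for illustration; by default yyin is stdin and yyout is stdout):

/* C code section: a small driver around the generated scanner. */
int yywrap(void) { return 1; }       /* report that there is no further input file */

int main(void)
{
    yyin  = fopen("input.txt", "r"); /* assumed input file; yyin defaults to stdin   */
    yyout = stdout;                  /* scanner output stream; stdout is the default */
    if (yyin == NULL)
        return 1;
    yylex();                         /* scan until yylex() returns 0 at end of input */
    fclose(yyin);
    return 0;
}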
13. Lexical Analyzer Generator - Lex
Lex source program (lex.l) -> Lex compiler -> lex.yy.c
lex.yy.c -> C compiler -> a.out
Input stream -> a.out -> sequence of tokens
14. Structure of a Lex file
• The structure of a Lex file is intentionally similar to
that of a yacc file.
• Files are divided into three sections, separated by lines that contain only two percent signs, as follows:
• Definition section
%%
• Rules section
%%
• C code section
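A skeleton showing just this three-part layout might look like the following (the single copy-through rule is a placeholder for illustration):

/* Definition section: C prologue, macros, and %option lines */
%{
#include <stdio.h>
%}
%option noyywrap

%%
.|\n    { fputs(yytext, yyout); /* rules section: placeholder rule copying input to output */ }
%%

/* C code section: copied verbatim into the generated scanner */
int main(void)
{
    yylex();    /* run the generated scanner */
    return 0;
}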
15. Contd…..
• The definition section defines macros and imports header files
written in C. It is also possible to write any C code here, which
will be copied verbatim into the generated source file.
• The rules section associates regular expression patterns with
C statements. When the lexer sees text in the input matching a
given pattern, it will execute the associated C code.
• The C code section contains C statements and functions that are
copied verbatim to the generated source file. These statements
presumably contain code called by the rules in the rules section.
In large programs it is more convenient to place this code in a
separate file linked in at compile time.
16. Example
• The following is an example Lex file for the flex version of Lex.
• It recognizes strings of numbers (positive integers) in the input,
and simply prints them out.
/*** Definition section ***/
%{
/* C code to be copied */
#include <stdio.h>
%}
/* This tells flex to read only one input file
*/
%option noyywrap
%%
17. Contd……
/*** Rules section ***/
/* [0-9]+ matches a string of one or more digits */
[0-9]+  {
            /* yytext is a string containing the matched text. */
            printf("Saw an integer: %s\n", yytext);
        }
.|\n    { /* Ignore all other characters. */ }
%%
/*** C Code section ***/
18. Contd…..
int main(void)
{
/* Call the lexer, then quit. */
yylex();
return 0;
}
If this input is given to flex, it will be converted into a C file, lex.yy.c.
This can be compiled into an executable which matches and outputs strings of integers.
E.g., given the input
abc123z.!&*2gj6
the program will print:
Saw an integer: 123
Saw an integer: 2
Saw an integer: 6
19. Flex, A fast scanner generator
• flex is a tool for generating scanners: programs which recognize lexical patterns in text.
• flex reads the given input files, or its standard input if no file names are given, for a description of a scanner to generate.
• The description is in the form of pairs of regular expressions and
C code, called rules.
• flex generates as output a C source file, `lex.yy.c', which defines
a routine `yylex()'.
• This file is compiled and linked with the `-lfl' library to produce
an executable.
• When the executable is run, it analyzes its input for occurrences
of the regular expressions.
• Whenever it finds one, it executes the corresponding C code.
20. Contd…
• Flex (fast lexical analyzer generator) is a free and
open-source software alternative to lex.
• It is a computer program that generates lexical
analyzers (also known as "scanners" or "lexers").
• Flex was written in C by Vern Paxson around 1987; he was translating a Ratfor generator, which had been led by Jef Poskanzer.
• The tokens recognized are: '+', '-', '*', '/', '=', '(', ')', ',', ';', '.', ':=', '<', '<=', '<>', '>', '>=';
• numbers: 0-9 {0-9}; identifiers: a-zA-Z {a-zA-Z0-9}; and keywords: begin, call, const, do, end, if, odd, procedure, then, var, while.
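A hedged sketch of Lex rules for roughly that token set (printing token names here is purely illustrative; a real scanner would return token codes shared with its parser):

%{
#include <stdio.h>
%}
%option noyywrap
%%
":="|"<="|">="|"<>"                  { printf("OP(%s) ", yytext); }
[+\-*/=(),;.<>]                      { printf("OP(%s) ", yytext); }
begin|call|const|do|end|if|odd|procedure|then|var|while   { printf("KEYWORD(%s) ", yytext); }
[0-9]+                               { printf("NUMBER(%s) ", yytext); }
[a-zA-Z][a-zA-Z0-9]*                 { printf("IDENT(%s) ", yytext); }
[ \t\n]+                             ;   /* skip whitespace */
%%
int main(void) { yylex(); return 0; }

Note the rule ordering: keywords are listed before the identifier rule so that, when the match lengths are equal, the earlier (keyword) rule wins; longer matches such as ":=" or "<=" automatically take precedence over single-character operators.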