LEX is a tool that allows users to specify a lexical analyzer by defining patterns for tokens using regular expressions. The LEX compiler transforms these patterns into a transition diagram and generates C code. It takes a LEX source program as input, compiles it to produce lex.yy.c, which is then compiled with a C compiler to generate an executable that takes an input stream and returns a sequence of tokens. LEX programs have declarations, translation rules that map patterns to actions, and optional auxiliary functions. The actions are fragments of C code that execute when a pattern is matched.
Lexical Analysis, Tokens, Patterns, Lexemes, Example pattern, Stages of a Lexical Analyzer, Regular expressions to the lexical analysis, Implementation of Lexical Analyzer, Lexical analyzer: use as generator.
The purpose of types:
To define what the program should do.
e.g. read an array of integers and return a double
To guarantee that the program is meaningful.
that it does not add a string to an integer
that variables are declared before they are used
To document the programmer's intentions.
better than comments, which are not checked by the compiler
To optimize the use of hardware.
reserve the minimal amount of memory, but not more
use the most appropriate machine instructions.
A lexical analyzer (also called a tokenizer, scanner, or lexer) is a function invoked by the syntax analyzer. Each call returns the next token in the source file.
2. LEX
LEX is a tool that allows one to specify a lexical analyzer by giving regular expressions that describe the patterns for tokens.
Input notation: the Lex language (the specification).
The Lex compiler transforms the input patterns into a transition diagram and generates code in a file called lex.yy.c.
1/21/2020 V.Anusuya,AP(SG)/CSE 2
3. Lexical Analyzer Generator - Lex
Lex source program (lex.l) --> Lex compiler --> lex.yy.c
lex.yy.c --> C compiler --> a.out
Input stream --> a.out --> sequence of tokens
4. Structure of Lex programs
A Lex program has the following form:
declarations
%%
translation rules
%%
auxiliary functions
Each translation rule is written as Pattern {Action}.
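As a minimal sketch of this three-section layout (the digit pattern and the messages here are illustrative, not taken from the slides):

```lex
%{
/* declarations section: C code and definitions copied into lex.yy.c */
#include <stdio.h>
%}
digit   [0-9]
%%
{digit}+    { printf("NUMBER: %s\n", yytext); }   /* Pattern {Action} */
[ \t\n]     { /* skip whitespace */ }
%%
/* auxiliary functions */
int main(void) { return yylex(); }
int yywrap(void) { return 1; }
```

Compiling this with lex (or flex) and a C compiler would produce a scanner that reports each run of digits in its input.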
5. The declarations section includes declarations of variables, manifest constants (identifiers declared to stand for a constant, e.g., the name of a token), and regular definitions.
Each translation rule has the form
Pattern { Action }
where the pattern is a regular expression and the action is a fragment of code written in C.
The third section holds whatever additional functions are used in the actions. Alternatively, these functions can be compiled separately and loaded with the lexical analyzer.
6. The lexical analyzer returns a single value, the token
name, to the parser, but uses the shared, integer
variable yylval to pass additional information about
the lexeme found, if needed.
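A rule that does this might look like the following sketch (the token code NUM and its value are assumed names for illustration, not defined in these slides):

```lex
%{
#define NUM 300      /* assumed token code agreed with the parser */
int yylval;          /* shared variable carrying extra information */
%}
%%
[0-9]+   { yylval = atoi(yytext);   /* pass the lexeme's value */
           return NUM; }            /* token name returned to the parser */
```

The parser sees only NUM; when it needs the actual number, it reads yylval.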
8. Lex Predefined Variables
yytext -- a string containing the lexeme
yyleng -- the length of the lexeme
yyin -- the input stream pointer
yyout -- the output stream pointer
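For instance, a single rule can combine these variables (an illustrative sketch):

```lex
%%
[a-zA-Z]+   { fprintf(yyout, "word '%s' has length %d\n", yytext, yyleng); }
.|\n        { /* ignore other characters */ }
```

Here yytext holds the matched word, yyleng its length, and yyout (stdout by default) receives the report.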
9. Lex Library Routines
yylex() - the scanner itself; the default main() contains a call to yylex()
yymore() - appends the next matched text to the current yytext instead of replacing it
yyless(n) - retains the first n characters of yytext and returns the rest to the input
yywrap() - called whenever Lex reaches end-of-file; by default yywrap() returns 1
10. Program: LEX Specification
%{
#include <stdio.h>
%}
letter [a-zA-Z]
digit  [0-9]
%%
{letter}({letter}|{digit})*   printf("id: %s\n", yytext);
\n                            printf("new line\n");
%%
int main(int argc, char **argv)
{
    if (argc > 1)
    {
        FILE *file;
        file = fopen(argv[1], "r");
        if (!file)
        {
            printf("could not open %s\n", argv[1]);
            exit(0);
        }
        yyin = file;
    }
13. Output
id: c
[]="id: GOOD
";new line
id: b
=id: a
+10;new line
id: printf
("%id: d
",id: b
);new line
}new line
new line
14. Conflict Resolution in Lex
There are two rules that Lex uses to decide on the
proper lexeme to select, when several prefixes of the
input match one or more patterns:
1. Always prefer a longer prefix to a shorter prefix.
2. If the longest possible prefix matches two or more
patterns, prefer the pattern listed first in the Lex
program.
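Both rules can be seen in a fragment like this sketch:

```lex
%%
"if"      { printf("KEYWORD\n"); }   /* ties with the rule below; listed first, so it wins */
[a-z]+    { printf("ID\n"); }        /* on input "ifx", this longer prefix beats "if" */
"<="      { printf("LE\n"); }        /* on input "<=", the two-character match beats "<" */
"<"       { printf("LT\n"); }
```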
15. The Lookahead Operator
Lex automatically reads one character ahead of the last
character that forms the selected lexeme, and then
retracts the input so only the lexeme itself is consumed
from the input.
However, sometimes, we want a certain pattern to be
matched to the input only when it is followed by a certain
other characters. If so, we may use the slash in a pattern
to indicate the end of the part of the pattern that
matches the lexeme.
What follows / is additional pattern that must be
matched before we can decide that the token in question
was seen, but what matches this second pattern is not
part of the lexeme.
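For example, this illustrative rule matches ab only when cd follows, yet leaves cd in the input for later rules:

```lex
%%
ab/cd   { printf("ab seen (cd must follow, but it is not consumed)\n"); }
```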
16. %{
/* program to recognize a c program */
int COMMENT=0;
%}
identifier [a-zA-Z][a-zA-Z0-9]*
%%
#.*   { printf("\n%s is a PREPROCESSOR DIRECTIVE", yytext); }
int|float|char|double|while|for|do|if|break|continue|void|switch|case|long|struct|const|typedef|return|else|goto   { printf("\n\t%s is a KEYWORD", yytext); }
"/*"  { COMMENT = 1; }
"*/"  { COMMENT = 0; }
17. {identifier}\(   { if(!COMMENT) printf("\n\nFUNCTION\n\t%s", yytext); }
"{"   { if(!COMMENT) printf("\n BLOCK BEGINS"); }
"}"   { if(!COMMENT) printf("\n BLOCK ENDS"); }
{identifier}(\[[0-9]*\])?   { if(!COMMENT) printf("\n %s IDENTIFIER", yytext); }
"\\a"|"\\n"|"\\b"|"\\t"   { printf("\n%s\tis an ESCAPE SEQUENCE", yytext); }
"%d"|"%s"|"%c"|"%f"|"%e"   { printf("\n%s\tis a FORMAT SPECIFIER", yytext); }
\".*\"   { if(!COMMENT) printf("\n\t%s is a STRING", yytext); }
18. [0-9]+   { if(!COMMENT) printf("\n\t%s is a NUMBER", yytext); }
"="   { if(!COMMENT) printf("\n\t%s is an ASSIGNMENT OPERATOR", yytext); }
"+"|"-"|"*"|"/"|"%"   { printf("\n %s is an ARITHMETIC OPERATOR", yytext); }
"<="|">="|"<"|"=="|">"   { if(!COMMENT) printf("\n\t%s is a RELATIONAL OPERATOR", yytext); }
%%
19. int main(int argc, char **argv)
{
    if (argc > 1)
    {
        FILE *file;
        file = fopen(argv[1], "r");
        if (!file)
        {
            printf("could not open %s\n", argv[1]);
            exit(0);
        }