This document discusses static type checking in compilers. It begins by describing the structure of a compiler and where static checking fits, then contrasts static checking, performed at compile time, with dynamic checking, performed at runtime. It surveys common static checks for types, control flow, uniqueness, and names, and compares how one-pass and multi-pass compilers approach them. Finally, it introduces type systems and shows how type rules can be implemented with syntax-directed definitions in Yacc, with examples of basic type checking for expressions and statements in a simple language, along with type expressions, conversions, and functions.
The document discusses type systems and type checking in programming languages. It covers key concepts such as:
- Types specify valid operations on values and ensure operations are used correctly.
- Languages can be statically typed, dynamically typed, or untyped, depending on when (or whether) type checking occurs.
- Type expressions represent types using basic types and type constructors like arrays and pointers.
- Type checking analyzes the types of expressions to verify that operations are applied to operands of the correct type; it catches errors early and can insert implicit conversions.
Compiler Construction | Lecture 7 | Type Checking, by Eelco Visser
This document summarizes a lecture on type checking. It discusses using constraints to separate the language-specific type checking rules from the language-independent solving algorithm. Constraint-based type checking collects constraints as it traverses the AST, then solves the constraints in any order. This allows type information to be learned gradually and avoids issues with computation order.
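The two phases described above can be sketched in a few lines of Python. The toy expression language, the tuple-based node shapes, and the `t1, t2, ...` naming for type variables are illustrative assumptions, not the lecture's own notation:

```python
# Minimal constraint-based type checker for a toy expression language.
# Phase 1 walks the AST and emits equality constraints between types;
# phase 2 solves the constraints by substitution, in any order.

def fresh(counter=[0]):
    """Make a fresh type variable; names t1, t2, ... are an assumption."""
    counter[0] += 1
    return f"t{counter[0]}"

def generate(node, env, constraints):
    """Return a type (or type variable) for node, appending constraints."""
    kind = node[0]
    if kind == "int":
        return "Int"
    if kind == "bool":
        return "Bool"
    if kind == "var":
        return env[node[1]]
    if kind == "add":                      # both operands must be Int
        l = generate(node[1], env, constraints)
        r = generate(node[2], env, constraints)
        constraints += [(l, "Int"), (r, "Int")]
        return "Int"
    if kind == "if":                       # condition Bool, branches equal
        c = generate(node[1], env, constraints)
        t = generate(node[2], env, constraints)
        e = generate(node[3], env, constraints)
        constraints += [(c, "Bool"), (t, e)]
        return t
    raise ValueError(kind)

def solve(constraints):
    """Unify the collected equality constraints into a substitution."""
    subst = {}
    def resolve(t):
        while t in subst:
            t = subst[t]
        return t
    for a, b in constraints:
        a, b = resolve(a), resolve(b)
        if a == b:
            continue
        if a.startswith("t"):              # type variable on the left
            subst[a] = b
        elif b.startswith("t"):            # type variable on the right
            subst[b] = a
        else:
            raise TypeError(f"cannot unify {a} with {b}")
    return subst

# x starts with an unknown type; its use under + forces it to Int.
cs = []
x = fresh()
ty = generate(("add", ("var", "x"), ("int",)), {"x": x}, cs)
subst = solve(cs)
print(ty, subst[x])   # Int Int
```

Note how the type of `x` is learned only when the constraints are solved, not at the point where `x` is first seen, which is exactly the order-independence the summary describes.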
The purpose of types:
- To define what the program should do, e.g. read an array of integers and return a double.
- To guarantee that the program is meaningful: that it does not add a string to an integer, and that variables are declared before they are used.
- To document the programmer's intentions, better than comments, which are not checked by the compiler.
- To optimize the use of hardware: reserve the minimal amount of memory, but not more, and use the most appropriate machine instructions.
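The guarantee that a program is meaningful can be illustrated with a minimal checker sketch; the toy AST encoding below is an assumption for illustration only:

```python
# A tiny syntax-directed type checker: it rejects "add a string to an
# integer" and "variable used before declaration" before anything runs.

def check(node, declared):
    """Return the type of an expression, or raise a compile-time error."""
    kind = node[0]
    if kind == "int":
        return "int"
    if kind == "str":
        return "string"
    if kind == "var":
        if node[1] not in declared:
            raise NameError(f"variable {node[1]!r} used before declaration")
        return declared[node[1]]
    if kind == "add":
        left = check(node[1], declared)
        right = check(node[2], declared)
        if left != right:
            raise TypeError(f"cannot add {left} and {right}")
        return left
    raise ValueError(f"unknown node {kind}")

print(check(("add", ("int",), ("int",)), {}))            # int
# check(("add", ("str",), ("int",)), {})  would raise TypeError
```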
Functional programming is getting a lot of attention because it eases many of the difficulties faced in object-oriented programming (OOP), such as testability, maintainability, scalability, and concurrency. Swift has many functional programming features that are easy to use, but most Objective-C and Swift programmers are not familiar with these concepts.
This talk aims to introduce some of the core concepts of functional programming with Swift such as:
• Importance of Immutability
• First-class, Higher-order and Pure functions
• Closures
• Generics and Associated Type Protocols
• Functors, Applicative Functors and Monads
• Enumerations and Pattern Matching
• Optionals
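The first two bullets translate directly to any language with first-class functions; here is a minimal Python sketch of the idea (illustrative only, since the talk itself uses Swift):

```python
# First-class functions: functions are values that can be stored and passed.
# Higher-order functions: functions that take or return functions.
# Pure functions: output depends only on input, with no side effects.

def compose(f, g):
    """Higher-order: takes two functions, returns their composition."""
    return lambda x: f(g(x))

double = lambda n: n * 2          # pure: no state, no I/O
increment = lambda n: n + 1       # pure

double_then_increment = compose(increment, double)
print(double_then_increment(5))   # 11
```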
This document provides an overview of C# data types, operators, expressions, statements, and other fundamental concepts. It describes the primitive data types (integer, floating-point, boolean, character, and string), literals for representing values of these types, and variables. It then covers the main operator categories (arithmetic, logical, comparison, assignment, and bitwise), with examples of each concept. The document is intended to teach basic C# syntax and semantics.
Crystal: Types, Peculiarities, and Challenges, by Brian Cardiff
This document provides an overview of the Crystal programming language including its type system, syntax, and some technical challenges. Key points include:
- Crystal is a compiled, statically typed language with syntax inspired by Ruby and C bindings.
- It uses a type system with value types, reference types, generics, unions, and nil handling.
- The type system supports features like tuples, named tuples, closures, and type inference.
- Technical challenges for the language include improving parallelism, subtyping rules, its intermediate representation, and metaprogramming capabilities.
Declare Your Language: Name Resolution, by Eelco Visser
Scope graphs are used to represent the binding information in programs. They provide a language-independent representation of name resolution that can be used to conduct and represent the results of name resolution. Separating the representation of resolved programs from the declarative rules that define name binding allows language-independent tooling to be developed for name resolution and other tasks.
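A minimal sketch of the idea in Python, assuming only the simplest case of lexical parent edges (real scope graphs also model imports, visibility policies, and a language-independent resolution calculus):

```python
# A minimal scope-graph-flavoured name resolver: scopes are nodes,
# lexical nesting is an edge to the parent scope, and resolving a
# reference walks edges until a matching declaration is found.

class Scope:
    def __init__(self, parent=None):
        self.parent = parent
        self.decls = {}

    def declare(self, name, info):
        self.decls[name] = info

    def resolve(self, name):
        scope = self
        while scope is not None:
            if name in scope.decls:
                return scope.decls[name]
            scope = scope.parent      # follow the edge to the outer scope
        raise NameError(f"unresolved reference: {name}")

globals_ = Scope()
globals_.declare("x", "global x")
inner = Scope(parent=globals_)
inner.declare("y", "local y")

print(inner.resolve("y"))   # local y
print(inner.resolve("x"))   # global x  (found via the parent edge)
```

The separation the summary describes is visible even here: the graph structure (`Scope`, its edges) is language-independent, while a particular language's binding rules only decide where edges and declarations are created.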
Chapter 2: Tokens in C Programming
The document discusses tokens in the C programming language, including keywords, identifiers, constants, string literals, operators, and white space. It summarizes C's basic data types, the integral types (char, short int, int, long int) and the floating-point types (float, double), and how they are represented. It then covers numerical constants, character and string constants, variables, declarations of global and local variables, and the arithmetic, relational, and logical operators.
This document summarizes key topics in intermediate code generation including:
- Variants of syntax trees like DAGs to represent common subexpressions.
- Three-address code where each instruction has at most three operands.
- Type checking declarations and expressions during translation.
- Generating three-address code for control flow statements using techniques like backpatching to resolve symbolic labels.
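The backpatching technique mentioned in the last bullet can be sketched as follows; the instruction strings and the `_` placeholder convention are illustrative assumptions, not a standard notation:

```python
# Backpatching: jumps whose targets are not yet known are emitted with
# a placeholder and recorded in a list of "holes"; once the label's
# address is fixed, every hole in the list is patched.

code = []

def emit(instr):
    code.append(instr)
    return len(code) - 1          # index of the emitted instruction

def backpatch(holes, target):
    for i in holes:
        code[i] = code[i].replace("_", str(target))

# Translating:  if x < y then z = 1
true_holes = [emit("if x < y goto _")]   # target unknown yet
false_holes = [emit("goto _")]           # target unknown yet
backpatch(true_holes, len(code))         # TRUE label = next instruction
emit("z = 1")
backpatch(false_holes, len(code))        # FALSE label = after the assignment

for i, instr in enumerate(code):
    print(i, instr)
```

Run on its own, this prints the three-address instructions with both symbolic jumps resolved to concrete indices.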
The document discusses lexical analysis and lexical analyzer generators. It provides background on why lexical analysis is a separate phase in compiler design and how it simplifies parsing. It also describes how a lexical analyzer interacts with a parser and some key attributes of tokens like lexemes and patterns. Finally, it explains how regular expressions are used to specify patterns for tokens and how tools like Lex and Flex can be used to generate lexical analyzers from regular expression definitions.
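The regular-expression approach can be sketched with Python's standard `re` module; the token names and patterns below are illustrative assumptions, not taken from the document:

```python
import re

# A lexer generator in miniature: token patterns are regular expressions,
# combined into one alternation with named groups; the scanner repeatedly
# matches a lexeme at the current position and yields (token, lexeme)
# pairs, much as a Lex/Flex-generated scanner does.

TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),          # whitespace is matched but not yielded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            yield m.lastgroup, m.group()

print(list(tokenize("count = count + 1")))
```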
Declarative Semantics Definition - Static Analysis and Error Checking, by Guido Wachsmuth
The document discusses static analysis and error checking in compiler construction. It covers several key topics:
- The static analysis process of parsing source code, checking for errors, and generating machine code.
- Name analysis, binding, and scoping during static checking and for editor services like refactoring and code generation.
- Testing static semantics including name binding, type systems, and constraints.
- Restricting context-free languages using static semantics and judgements of well-formedness and well-typedness.
- Formal type systems including those for Tiger language examples involving types, expressions, and scoping.
The document discusses various aspects of template type deduction in C++ including:
1. How the type T and the type of the parameter ParamType are deduced based on whether ParamType is a pointer/reference, universal reference, or neither.
2. How array arguments are handled differently when passed by value versus by reference in templates.
3. How function arguments are treated, with function types decaying to function pointers.
4. The differences between auto type deduction and template type deduction.
5. How decltype can be used to deduce types from expressions while preserving reference-ness.
Syntax defines the grammatical rules of a programming language. There are three levels of syntax: lexical, concrete, and abstract. Lexical syntax defines tokens like literals and identifiers. Concrete syntax defines the actual representation using tokens. Abstract syntax describes a program's information without implementation details. Backus-Naur Form (BNF) uses rewriting rules to specify a grammar. BNF grammars can be ambiguous. Extended BNF simplifies recursive rules. Syntax analysis transforms a program into an abstract syntax tree used for semantic analysis and code generation.
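The step from an (E)BNF grammar to an abstract syntax tree can be sketched for a one-rule grammar; the tuple-based AST shape is an illustrative assumption:

```python
# Grammar (EBNF):
#   expr ::= term { "+" term }
#   term ::= NUMBER
# The { } repetition is the Extended BNF shorthand that avoids an
# ambiguous (or left-recursive) plain-BNF rule.  The parser returns an
# abstract syntax tree: no layout or punctuation survives in it.

def parse_expr(tokens):
    node = parse_term(tokens)
    while tokens and tokens[0] == "+":
        tokens.pop(0)                          # consume "+"
        node = ("add", node, parse_term(tokens))
    return node

def parse_term(tokens):
    return ("num", int(tokens.pop(0)))

print(parse_expr(["1", "+", "2", "+", "3"]))
# ('add', ('add', ('num', 1), ('num', 2)), ('num', 3))
```

The loop makes `+` left-associative, which is one way a parser resolves an ambiguity the plain BNF rule would leave open.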
Dynamic Type Inference for Gradual Hindley–Milner Typing, by Yusuke Miyazaki
This document describes dynamic type inference (DTI) for gradually typed languages with Hindley-Milner type inference. DTI infers types for type variables left undecided by compile-time type inference by instantiating type variables during evaluation based on the types of values. The approach is formalized in a blame calculus λBDTI that extends a simply typed lambda calculus with type variables, casts for runtime type checking, and a reduction relation for DTI. Properties like soundness and completeness of DTI and the gradual guarantee are proven for λBDTI.
This document provides an introduction to constants and variables in the C programming language. It discusses C's character set including letters, digits, special characters, and white spaces. It also covers C tokens, keywords and identifiers. The document defines different types of constants in C like numeric constants (integer and real), character constants, string constants, and backslash character constants. It describes variables in C including rules for defining variables. It provides examples of declaring and assigning values to variables. The overall summary is that this document serves as an introductory lesson on the basic building blocks of constants and variables used in the C programming language.
The document discusses syntax-directed translation using attribute grammars. Attribute grammars assign semantic values or attributes to the symbols in a context-free grammar. A depth-first traversal of the parse tree executes semantic rules that calculate the values of attributes. There are two types of attributes: synthesized attributes which are computed bottom-up and inherited attributes which are passed top-down. L-attributed grammars allow efficient evaluation by passing inherited attributes left-to-right during a depth-first traversal.
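A synthesized attribute can be sketched as a bottom-up computation over the tree; the node encoding below is an illustrative assumption:

```python
# Synthesized attributes in miniature: each node's attribute ("val") is
# computed from its children's attributes during a depth-first traversal,
# exactly as a semantic rule such as  E.val = E1.val + T.val  specifies.

def val(node):
    kind = node[0]
    if kind == "num":
        return node[1]                      # synthesized from the lexeme
    if kind == "add":
        return val(node[1]) + val(node[2])  # E.val = E1.val + T.val
    if kind == "mul":
        return val(node[1]) * val(node[2])  # T.val = T1.val * F.val
    raise ValueError(kind)

tree = ("add", ("num", 2), ("mul", ("num", 3), ("num", 4)))
print(val(tree))   # 14
```

An inherited attribute would flow the other way, as an extra parameter passed down into the recursive calls rather than a value returned from them.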
This document provides a summary of key elements of the C programming language including program structure, data types, operators, flow control statements, standard libraries, and common functions. It covers topics such as functions, variables, comments, preprocessor directives, constants, pointers, arrays, structures, I/O, math functions, and limits of integer and floating point types. The summary is presented in a reference card format organized by sections.
This document summarizes and discusses type checking algorithms for programming languages. It introduces constraint-based type checking, which separates type checking into constraint generation and constraint solving. This provides a more declarative way to specify type checkers. The document discusses using variables and constraints to represent types during type checking. It introduces NaBL2, a domain-specific language for writing constraint generators to specify name and type constraints for programming language static semantics. NaBL2 uses scope graphs to represent name binding structures and supports features like type equality, subtyping, and type-dependent name resolution through constraint rules. An example scope graph and constraint rule for let-bindings are provided.
The document discusses type systems and type inference in Perl. It explains that in Perl, what matters is whether a term is "typeable" rather than its specific type. Terms are typeable if they can be evaluated in the same way as in the typed lambda calculus. The document then covers type inference algorithms for Perl, including typed lambda calculus, building and solving equations to infer types, polymorphic types, record types, subtyping, and recursive types.
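The "building and solving equations" step at the heart of such inference is unification over type terms; a minimal Python sketch, where the encoding of function types as tuples and of unknowns as `?`-prefixed strings is an illustrative assumption:

```python
# Unification over type terms: a function type is the tuple
# ("->", arg, res) and strings starting with "?" are unknowns.

def walk(t, subst):
    """Chase a variable through the substitution (top level only)."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
        return subst
    raise TypeError(f"cannot unify {a} and {b}")

def resolve(t, subst):
    """Apply the substitution all the way down."""
    t = walk(t, subst)
    if isinstance(t, tuple):
        return tuple(resolve(x, subst) for x in t)
    return t

# The equations  ?f = Int -> ?r  and  ?r = Bool:
s = unify("?f", ("->", "Int", "?r"), {})
s = unify("?r", "Bool", s)
print(resolve("?f", s))   # ('->', 'Int', 'Bool')
```

A production inference engine would add an occurs check (to reject recursive solutions like `?a = ?a -> ?a`) and generalization for polymorphic types, both of which are omitted here for brevity.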
Intermediate representations span the gap between source and target languages by being closer to the target language while remaining mostly machine independent. This allows optimizations to be performed in a machine-independent way.
There are two main types of intermediate languages: high-level representations like syntax trees that are close to the source language, and low-level representations like three-address code that are closer to the target machine. Syntax trees separate parsing from subsequent processing while three-address code translates expressions into single assignment form using temporaries.
Semantic analysis enforces static semantic rules by constructing a syntax tree, evaluating attributes, and checking properties that can be analyzed at compile time. Attribute grammars annotate syntax rules with semantic attributes to define meaning, while dynamic semantic rules must be deferred to checks at runtime.
This document discusses various fundamental concepts in C programming such as flowcharts, pseudocode, control structures, variables, data types, operators, functions, arrays, structures, and input/output functions. It provides definitions and examples for each concept. Control structures covered include conditional statements like if-else and switch-case, as well as loops like while, do-while and for. Data types discussed are integer, floating point, character and string constants. Key concepts like variables, arrays, structures, functions and their declarations are also summarized.
Compiler Design Course Material, Chapter 2
The document provides an overview of lexical analysis in compiler design. It discusses the role of the lexical analyzer in reading characters from a source program and grouping them into lexemes and tokens. Regular expressions are used to specify patterns for tokens. Non-deterministic finite automata (NFA) and deterministic finite automata (DFA) are used to recognize patterns and languages. Thompson's construction is used to translate regular expressions to NFAs, and subset construction is used to translate NFAs to equivalent DFAs. This process is used in lexical analyzer generators to automate the recognition of language tokens from regular expressions.
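The subset construction can be sketched by simulating sets of NFA states directly; the example NFA below is an illustrative assumption (epsilon moves are omitted for brevity, so no epsilon-closure step is needed):

```python
# Subset construction in miniature: each DFA state is the set of NFA
# states reachable on the input so far, so the DFA explores all NFA
# paths at once.  This NFA recognises strings over {a, b} ending in "ab".

nfa = {                      # state -> symbol -> set of next states
    0: {"a": {0, 1}, "b": {0}},
    1: {"b": {2}},
    2: {},
}
accepting = {2}

def move(states, sym):
    """The set of NFA states reachable from `states` on `sym`."""
    return frozenset(s2 for s in states for s2 in nfa[s].get(sym, ()))

def run_dfa(text):
    state = frozenset({0})            # DFA start = set of NFA starts
    for ch in text:
        state = move(state, ch)       # subset construction, on the fly
    return bool(state & accepting)    # accept if any NFA state accepts

print(run_dfa("aab"))   # True
print(run_dfa("aba"))   # False
```

Materializing every reachable `frozenset` as a named state, instead of computing it on the fly, gives the table-driven DFA that a generated lexer actually uses.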
ESOFT Metro Campus - Diploma in Software Engineering - (Module V) Windows Based Application Development in C#
(Template - Virtusa Corporate)
Contents:
Introduction to .NET Framework
.NET Framework Platform Architecture
Microsoft Visual Studio
C# Language
C#, VS and .NET Framework Versions
Your First C# Application
Printing Statements
Comments in C#
Common Type System
Value Types and Reference Type
Variables Declaration in C#
Type Conversion
Arithmetic Operators
Assignment Operators
Comparison Operators
Logical Operators
If Statement
If… Else Statement
If… Else if… Else Statement
Nested If Statement
Switch Statement
While Loop
Do While Loop
For Loop
Arrays
Accessing Arrays using foreach Loop
Two Dimensional Arrays
Classes and Objects in C#
Inheritance in C#
Partial Classes
Namespaces
Windows Forms Applications
Using Buttons, Labels and Text Boxes
Displaying Message Boxes
Error Handling with Try… Catch… finally…
Using Radio Buttons
Using Check Boxes
Using List Boxes
Creating Menus
Creating ToolStrips
MDI Forms
Database Application in C#
Creating a Simple Database Application
SQL Insert / Update / Retrieving / Delete
SQL Command Execute Methods
Data Sets
The document discusses intermediate representation in compilers. It states that compilers use a front end to translate source programs into an intermediate representation and a back end to generate target code from the intermediate representation. This allows for machine-independent optimization and retargeting of compilers to different architectures. It then describes three-address code as a common intermediate representation and shows syntax-directed translation of expressions and control flow structures into three-address code. Finally, it discusses various aspects of compiler implementation like symbol tables, type checking, and backpatching.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
Strategies for Effective Upskilling is a presentation by Chinwendu Peace in a Your Skill Boost Masterclass organisation by the Excellence Foundation for South Sudan on 08th and 09th June 2024 from 1 PM to 3 PM on each day.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
Main Java[All of the Base Concepts}.docxadhitya5119
This is part 1 of my Java Learning Journey. This Contains Custom methods, classes, constructors, packages, multithreading , try- catch block, finally block and more.
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Dr. Vinod Kumar Kanvaria
Exploiting Artificial Intelligence for Empowering Researchers and Faculty,
International FDP on Fundamentals of Research in Social Sciences
at Integral University, Lucknow, 06.06.2024
By Dr. Vinod Kumar Kanvaria
বাংলাদেশের অর্থনৈতিক সমীক্ষা ২০২৪ [Bangladesh Economic Review 2024 Bangla.pdf] কম্পিউটার , ট্যাব ও স্মার্ট ফোন ভার্সন সহ সম্পূর্ণ বাংলা ই-বুক বা pdf বই " সুচিপত্র ...বুকমার্ক মেনু 🔖 ও হাইপার লিংক মেনু 📝👆 যুক্ত ..
আমাদের সবার জন্য খুব খুব গুরুত্বপূর্ণ একটি বই ..বিসিএস, ব্যাংক, ইউনিভার্সিটি ভর্তি ও যে কোন প্রতিযোগিতা মূলক পরীক্ষার জন্য এর খুব ইম্পরট্যান্ট একটি বিষয় ...তাছাড়া বাংলাদেশের সাম্প্রতিক যে কোন ডাটা বা তথ্য এই বইতে পাবেন ...
তাই একজন নাগরিক হিসাবে এই তথ্য গুলো আপনার জানা প্রয়োজন ...।
বিসিএস ও ব্যাংক এর লিখিত পরীক্ষা ...+এছাড়া মাধ্যমিক ও উচ্চমাধ্যমিকের স্টুডেন্টদের জন্য অনেক কাজে আসবে ...
Natural birth techniques - Mrs.Akanksha Trivedi Rama University
Ch6.ppt
1. 1
Static Checking and Type
Systems
Chapter 6
COP5621 Compiler Construction
Copyright Robert van Engelen, Florida State University, 2005
2. 2
The Structure of our Compiler
Revisited
[Pipeline diagram: a character stream enters the lexical analyzer
(generated from a Lex specification), producing a token stream; the
token stream feeds a syntax-directed static checker and a
syntax-directed translator (generated from a Yacc specification),
which perform type checking and code generation to emit Java
bytecode conforming to the JVM specification.]
3. 3
Static versus Dynamic Checking
• Static checking: the compiler enforces the
programming language’s static semantics,
which are checked at compile time
• Dynamic checking: dynamic semantics are
checked at run time by special code
generated by the compiler
4. 4
Static Checking
• Typical examples of static checking are
– Type checks
– Flow-of-control checks
– Uniqueness checks
– Name-related checks
5. 5
Type Checks, Overloading,
Coercion, and Polymorphism
int op(int), op(float);
int f(float);
int a, c[10], d;
d = c+d; // FAIL
*d = a; // FAIL
a = op(d); // OK: overloading (C++)
a = f(d); // OK: coercion
vector<int> v; // OK: template instantiation
9. 9
One-Pass versus Multi-Pass
Static Checking
• One-pass compiler: static checking for C,
Pascal, Fortran, and many other languages
can be performed in one pass while at the
same time intermediate code is generated
• Multi-pass compiler: static checking for
Ada, Java, and C# is performed in a
separate phase, sometimes requiring
traversing the syntax tree multiple times
10. 10
Type Expressions
• Type expressions are used in declarations
and type casts to define or refer to a type
– Primitive types, such as int and bool
– Type constructors, such as pointer-to, array-of,
records and classes, templates, and functions
– Type names, such as typedefs in C and named
types in Pascal, refer to type expressions
11. 11
Graph Representations for Type
Expressions
int *f(char*,char*)
[Tree form: a fun node with an args child listing two pointer(char)
nodes and a result child pointer(int). DAG form: the same graph, but
the two pointer(char) argument types share a single node.]
13. 13
Name Equivalence
• Each type name is a distinct type, even
when the type expressions the names refer
to are the same
• Types are identical only if names match
• Used by Pascal (inconsistently)
type link = ^node;
var next : link;
last : link;
p : ^node;
q, r : ^node;
With name equivalence in Pascal:
p ≠ next
p ≠ last
p = q = r
next = last
14. 14
Structural Equivalence of Type
Expressions
• Two types are the same if they are
structurally identical
• Used in C, Java, C#
[Diagram: two pointer-to-struct type graphs, each with fields
val (int) and next (pointer); the graphs are structurally identical,
so the two types are equal.]
15. 15
Structural Equivalence of Type
Expressions (cont’d)
• Two structurally equivalent type
expressions have the same pointer address
when constructing graphs by sharing nodes
struct Node
{ int val;
  struct Node *next;
};
struct Node s, *p;
… p = &s; // OK
… *p = s; // OK
[Diagram: the types of s and *p resolve to the same shared
pointer-to-struct(val: int, next: pointer) node, so both
assignments type-check by pointer comparison.]
16. 16
Constructing Type Graphs in
Yacc
Type *mkint()           construct int node if not already constructed
Type *mkarr(Type*,int)  construct array-of-type node if not already constructed
Type *mkptr(Type*)      construct pointer-of-type node if not already constructed
17. 17
Syntax-Directed Definitions for
Constructing Type Graphs in Yacc
%union
{ Symbol *sym;
  int num;
  Type *typ;
}
%token INT
%token <sym> ID
%token <num> NUM
%type <typ> type
%%
decl : type ID { addtype($2, $1); }
     | type ID '[' NUM ']' { addtype($2, mkarr($1, $4)); }
     ;
type : INT { $$ = mkint(); }
     | type '*' { $$ = mkptr($1); }
     | /* empty */ { $$ = mkint(); }
     ;
18. 18
Type Systems
• A type system defines a set of types and
rules to assign types to programming
language constructs
• Informal type system rules, for example “if
both operands of addition are of type
integer, then the result is of type integer”
• Formal type system rules: Post system
19. 19
Type Rules in Post System
Notation
Environment E maps variables v to types T: E(v) = T

E ⊢ e1 : integer    E ⊢ e2 : integer
------------------------------------
        E ⊢ e1 + e2 : integer

E(v) = T
---------
E ⊢ v : T

E ⊢ e : T    E(v) = T
---------------------
     E ⊢ v := e
20. 20
A Simple Language Example
P → D ; S
D → D ; D
  | id : T
T → boolean
  | char
  | integer
  | array [ num ] of T
  | ^ T
S → id := E
  | if E then S
  | while E do S
  | S ; S
E → true
  | false
  | literal
  | num
  | id
  | E and E
  | E mod E
  | E [ E ]
  | E ^
21. 21
Simple Language Example:
Declarations
D → id : T               { addtype(id.entry, T.type) }
T → boolean              { T.type := boolean }
T → char                 { T.type := char }
T → integer              { T.type := integer }
T → array [ num ] of T1  { T.type := array(1..num.val, T1.type) }
T → ^ T1                 { T.type := pointer(T1.type) }
22. 22
Simple Language Example:
Statements
S → id := E        { S.type := if id.type = E.type then void
                     else type_error }
S → if E then S1   { S.type := if E.type = boolean then S1.type
                     else type_error }
S → while E do S1  { S.type := if E.type = boolean then S1.type
                     else type_error }
S → S1 ; S2        { S.type := if S1.type = void and S2.type = void
                     then void else type_error }
23. 23
Simple Language Example:
Expressions
E → true     { E.type := boolean }
E → false    { E.type := boolean }
E → literal  { E.type := char }
E → num      { E.type := integer }
E → id       { E.type := lookup(id.entry) }
…
24. 24
Simple Language Example:
Expressions (cont’d)
E → E1 and E2  { E.type := if E1.type = boolean and
                 E2.type = boolean
                 then boolean else type_error }
E → E1 mod E2  { E.type := if E1.type = integer and
                 E2.type = integer
                 then integer else type_error }
E → E1 [ E2 ]  { E.type := if E1.type = array(s, t) and
                 E2.type = integer
                 then t else type_error }
E → E1 ^       { E.type := if E1.type = pointer(t)
                 then t else type_error }
25. 25
Simple Language Example:
Adding Functions
T → T1 -> T2   { T.type := function(T1.type, T2.type) }
E → E1 ( E2 )  { E.type := if E1.type = function(s, t) and
                 E2.type = s
                 then t else type_error }
Example:
v : integer;
odd : integer -> boolean;
if odd(3) then
  v := 1;
26. 26
Syntax-Directed Definitions for
Type Checking in Yacc
%{
enum Types {Tint, Tfloat, Tpointer, Tarray, … };
typedef struct Type
{ enum Types type;
  struct Type *child;
} Type;
%}
%union
{ Type *typ;
}
%type <typ> expr
%%
expr : expr '+' expr { if ($1->type != Tint
                           || $3->type != Tint)
                         semerror("non-int operands in +");
                       $$ = mkint();
                       emit(iadd);
                     }
27. 27
Type Conversion and Coercion
• Type conversion is explicit, for example
using type casts
• Type coercion is implicitly performed by the
compiler
• Both require a type system to check and
infer types for (sub)expressions