The major programming languages have evolved over several decades, beginning in the 1940s and 1950s with assembly languages and languages such as Fortran for scientific computing. Key developments included ALGOL in the late 1950s, which introduced block structure; LISP for AI, also in the late 1950s; COBOL for business in the late 1950s; and BASIC for education in the 1960s. Object-oriented concepts emerged in the 1960s with Simula and became widespread in the 1980s and 1990s with languages such as C++, Smalltalk, and Java. Functional programming gained prominence with LISP and later with languages such as ML and Haskell. Modern scripting languages aided system administration tasks. Each new generation of languages incorporated ideas from its predecessors.
2. Activity
• The class will play snakes and ladders.
• Students will form teams.
• Between discussions, questions will be posed, and each team will give its answer.
• For every correct answer, the team may roll the dice to advance in the snakes and ladders game.
• The first team to reach the top of the board wins first place, followed by the other groups.
3. Amid the strife of war
• 1936–1945
• Konrad Zuse
• Built a series of complex computers from electromechanical relays.
• Developed a language called Plankalkül for expressing computations.
• It remained unpublished until 1972.
• A programming language designed for engineering purposes.
4. Short Code
A step toward readability/writability
• 1949
• John Mauchly
• Designed for the BINAC.
• It consists of coded versions of mathematical expressions.
• The machine implementation was pure interpretation, which was termed automatic programming.
• More readable and writable than machine code, but ran about 50 times slower.
6. Keep the initial context in mind
• At the beginning of the 1950s:
• The primary use of computers was numerical calculation.
• Computer memories were small.
• Hardware did not directly support floating-point operations or indexing.
• Hardware was unreliable.
• Hardware was more costly than programmers.
7. A-0, A-1, A-2
A first compiling system
• 1950–1953
• Grace Hopper and her team for the UNIVAC
• Programs were written in a type of code called pseudocode, which was expanded into machine-code subprograms.
8. Progress in Generality
• 1950
• David J. Wheeler
• Developed a method of using blocks of relocatable addresses.
• 1951
• Maurice Wilkes
• Extended this idea to design an assembly program that could combine chosen subroutines and allocate storage.
9. Speedcoding System
Support for floating-point operations
• Early 1950s
• John Backus
• For the IBM 701: extended machine code to include floating-point operations.
• Included:
• Four arithmetic operations for floating-point values, plus math functions (square root, sine, arc tangent, exponential, logarithm)
• Conditional and unconditional branching
• I/O conversions
• Pure interpretation
10. First real compiler
• 1952
• Alick E. Glennie
• Autocode compiler for the Manchester Mark 1 computer
• Low-level and machine-oriented
OR
• 1953
• Laning and Zierler
• Algebraic translation system (compiler)
• Implemented on the MIT Whirlwind computer
• Generated a subroutine call for each formula and expression
11. Fortran
First important high-level language
• 1954
• Developed for the IBM 704, which provided hardware indexing and floating-point instructions.
• John Backus and his group at IBM.
• Aimed to match the efficiency of hand-coded programs.
• The language was widely adopted by scientists for writing numerically intensive programs.
• Implementation began in 1955.
12. Fortran 1
• 1957
• Types and storage for all variables fixed before run time.
• Included I/O formatting
• Variable names of up to six characters
• User-defined subprograms
• If and Do statements
• Implicit data types
13. Artificial Intelligence
An influence on programming languages
• Mid-1950s: interest in AI emerged
• Natural language processing
• Modeling human information storage, retrieval, and other brain processes
• Mechanizing certain intelligent processes, such as theorem proving
14. FLPL (Fortran List Processing Language)
• Mid-1950s
• IBM
• Extension to the Fortran compiler.
• A compiled computer language for the manipulation of symbolic expressions.
• For simulation of a geometry theorem-proving machine on the IBM 704.
15. IPL – Information Processing Language
First AI programming language
• 1956
• Allen Newell, J. C. Shaw, Herbert Simon
• An assembly-level language for manipulating lists.
• Intended to support programs that perform simple problem-solving actions, with features such as lists, dynamic memory allocation, and data types.
• Invented the concept of list processing.
16. In the business domain…
• FLOW-MATIC
• 1957
• Business-oriented language for the UNIVAC
• The first English-like data processing language
• “Mathematical programs should be written in mathematical notation; data processing programs should be written in English statements.” – Grace Hopper
17. In the business domain…
• COBOL 60 (Common Business Oriented Language)
• 1959
• Design goals
• Use English as much as possible
• Easy to use, even at the expense of being less powerful
• Should not be overly restricted by the problems of its implementation
• Characteristics
• DEFINE verb for macros
• Records
• Names of up to 30 characters, with hyphens allowed
• Data division and procedure division
• Mandated by the DoD
18. Fortran Progresses…
• Fortran II compiler
• 1958
• Bug fixes
• Independent compilation of subprograms
• Made lengthier programs possible
19. ALGOL – Algorithmic Language
• 1958
• GAMM (German Society for Applied Mathematics and Mechanics) and the ACM (Association for Computing Machinery)
• ALGOL 58 – a universal standard language
• Developed jointly by a committee of European and American computer scientists at a 1958 meeting at ETH Zurich.
• Heavily influenced many other languages and was the standard method for algorithm description used by the ACM in textbooks and academic sources for more than thirty years.
• Introduced the fundamental notion of the compound statement, though it was restricted to control flow only.
20. LISP
• 1959
• John McCarthy and Marvin Minsky
• Produced a system for list processing
• A functional language – originally interpreted
• Dominated for a decade
• Descendants:
• Scheme
• COMMON LISP
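LISP's central idea, building programs around recursively processed lists of cons cells, can be sketched in a modern language. The following is a minimal Python illustration, not McCarthy's original notation; the helper names (`cons`, `car`, `cdr`, `from_list`, `length`) follow LISP convention but are invented for this example.

```python
# LISP-style list processing sketched in Python: a list is a chain of
# cons pairs ending in NIL, and functions walk it recursively.

NIL = None

def cons(head, tail):
    """Build one cell of a linked list."""
    return (head, tail)

def car(pair):
    """First element of a pair (LISP's CAR)."""
    return pair[0]

def cdr(pair):
    """Rest of the list (LISP's CDR)."""
    return pair[1]

def from_list(items):
    """Build a cons chain from a Python list."""
    result = NIL
    for item in reversed(items):
        result = cons(item, result)
    return result

def length(lst):
    """Recursive list length, the classic LISP idiom."""
    return 0 if lst is NIL else 1 + length(cdr(lst))

xs = from_list([1, 2, 3])
print(car(xs))      # 1
print(length(xs))   # 3
```

Scheme and COMMON LISP, listed above, kept this cons/car/cdr model essentially unchanged.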
21. Progress in AI
• 1960
• Newell and Tonge
• IPL-V
• Demonstrated that list processing was feasible and useful.
• Actually an assembly language, implemented in an interpreter with list processing instructions, for the Johnniac machine.
22. ALGOL 60
• 1960
• Formally described using Backus-Naur Form (BNF)
• The first language implementing nested function definitions with lexical scope
• Parent: Fortran
• Descendants: PL/I, SIMULA 67, C, Pascal, Ada, C++, and Java
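Nested function definitions with lexical scope, the ALGOL 60 feature noted above, survive in most modern languages. A minimal Python sketch (the names `make_counter`, `step`, and `advance` are invented for the example):

```python
# Lexical (static) scoping: the inner function resolves `step` in the
# enclosing definition where it was written, not at the call site.

def make_counter(step):
    count = 0
    def advance():
        nonlocal count       # refer to the enclosing scope's variable
        count += step        # `step` found by lexical scoping
        return count
    return advance

by_two = make_counter(2)
print(by_two())  # 2
print(by_two())  # 4
```

Each call to `make_counter` creates a fresh enclosing scope, so separate counters do not interfere.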
23. APL and SNOBOL
• 1960
• APL
• Kenneth Iverson at IBM
• Designed for describing computer architecture.
• SNOBOL
• D. J. Farber, R. E. Griswold, and I. P. Polonsky at Bell Labs
• Designed for text processing.
• A collection of powerful operations for string pattern matching
Common features:
• Neither based on a previous language nor the basis for any later language
• Dynamic typing and, hence, dynamic storage allocation
24. Fortran IV
• 1962
• One of the most widely used PLs
• Explicit type declarations for variables.
• Capability of passing subprograms as parameters.
• 1966
• Fortran 66 – its Standard version
25. BASIC
Let’s make it easy…
• 1963
• John Kemeny and Thomas Kurtz
• Designed BASIC
• Beginner’s All-purpose Symbolic Instruction Code
• Goals
1. Must be easy for non-science students to learn and use
2. Must be pleasant and friendly
3. Must provide fast turnaround for homework
4. Must allow free and private access
5. Must consider user time more important than computer time!
• Characteristics
1. Small, interactive
2. Used through terminals
3. Single data type: floating-point numbers
4. Resurgence with Visual Basic in the 90s
26. PL/I
A Single Universal Language
• 1964
• IBM
• Developed PL/I
• Goal
• Capable of both floating-point and decimal arithmetic to support both scientific and business applications, as well as list processing and systems programming!
• Replace Fortran, LISP, COBOL, and assembly languages.
• Contributions
• ALGOL 60’s recursion and block structure
• Fortran IV’s separate compilation with communication via global data
• COBOL 60’s data structures, I/O, and report-generating facilities
• A collection of new constructs
• Concurrently executing subprograms
• Exception handling for 23 different types of exceptions
• Allowed the disabling of recursion for more efficient linkage
• Pointers as data types
• References to cross sections of arrays
27. Simula 67
• 1967
• Kristen Nygaard and Ole-Johan Dahl
• First developed SIMULA I in the early 60s
• Designed for system simulation, implemented in the mid-60s
• Generalized into Simula 67
• Features
• Extension of ALGOL 60, taking its block structure and control statements
• Support for coroutines via the class construct, thus beginning the concept of data abstraction
• Encapsulation of data and the processes that manipulate the data
• Class definition as a template
• Constructors
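The ideas Simula 67 introduced, class as a template, constructor, and encapsulation of data with the procedures that manipulate it, can be sketched in Python. The `Queue` class is an invented example, not Simula code:

```python
# Simula 67's contributions in Python terms: the class is a template,
# __init__ plays the constructor role, and the data lives inside the
# object, reachable only through its methods.

class Queue:
    def __init__(self):          # constructor: initializes each instance
        self._items = []         # encapsulated data

    def enqueue(self, item):
        """Add an item at the back of the queue."""
        self._items.append(item)

    def dequeue(self):
        """Remove and return the item at the front."""
        return self._items.pop(0)

    def __len__(self):
        return len(self._items)

q = Queue()          # the template stamped out as an instance
q.enqueue("a")
q.enqueue("b")
print(q.dequeue())   # a
```

Each instance created from the template gets its own `_items` state, which is the essence of the class-as-template idea.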
28. ALGOL 68
Dramatically Different
• 1968
• ALGOL 68
• Design criterion: orthogonality
• Never achieved widespread use, but contributed several important ideas:
• User-defined data types
• Flex arrays (implicit heap-dynamic arrays)
• Orthogonality: a few primitive concepts and unrestricted use of a few combining mechanisms
• Descendant: ALGOL-W
• Value-result method of passing parameters as an alternative to pass-by-name
• Case statement for multiple selection
30. Pascal
• 1971
• Niklaus Wirth
• Developed Pascal based on ALGOL 60
• Primarily used as a teaching language
• Simple but expressive
• Lacked essential features for many applications, which led to non-standard dialects such as Turbo Pascal
31. C
• 1972
• Dennis Ritchie
• Developed the C language
• Heritage:
• CPL – Cambridge, early 1960s
• BCPL – Martin Richards, 1967
• B – Ken Thompson, 1970; the first high-level language under Unix
• and ALGOL 68
• Originally designed for systems programming.
• C has adequate control statements and data-structuring facilities to allow its use in many application areas. It also has a rich set of operators that provide a high degree of expressiveness.
32. PROLOG – Programming in Logic
• 1975
• Alain Colmerauer and Philippe Roussel
• In the Artificial Intelligence Group at the University of Aix-Marseille
• Described Prolog
• One common use of Prolog is as a kind of intelligent database.
• This application provides a simple framework for discussing the Prolog language.
• The database of a Prolog program consists of two kinds of statements: facts and rules.
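The facts-plus-rules database idea can be sketched outside Prolog. Below is a minimal Python analogue, not real Prolog: a set of `parent` facts and one hand-coded rule deriving `grandparent` by joining facts (the names `tom`, `bob`, `ann` are invented for the example).

```python
# A toy "intelligent database": facts are stored tuples, and a rule
# derives new relations by joining facts, as a Prolog rule would.

facts = {
    ("parent", "tom", "bob"),
    ("parent", "bob", "ann"),
}

def parents_of(child):
    """Query the fact base: who are the parents of `child`?"""
    return {p for (rel, p, c) in facts if rel == "parent" and c == child}

def grandparents_of(child):
    # Rule, in Prolog terms: grandparent(G, C) :- parent(G, P), parent(P, C).
    return {g for p in parents_of(child) for g in parents_of(p)}

print(grandparents_of("ann"))  # {'tom'}
```

In real Prolog the rule is declared once and the interpreter performs this search itself, for any relation, via unification and backtracking.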
33. Scheme
A Functional Programming Language
• 1975
• MIT
• Scheme
• Small size
• Exclusive use of static scoping
• Functions are first-class entities: they can be the values of expressions and elements of lists, assigned to variables, passed as parameters, and returned as the values of function applications.
• Simple syntax and semantics
34. Fortran 77
Continues to Dominate
• 1978
• Fortran 77
• Character string handling
• Logical loop control statements
• If-then-else statement
• “Fortran is the lingua franca of the computing world.” – Alan Perlis
36. Smalltalk
• 1980
• Alan Kay, who foresaw the “desktop” windowing environment
• Developed the first language that fully supported OOP, as part of the Xerox Palo Alto Research Center (PARC) group
• The group was charged with designing a language to support Kay’s vision.
• Objects and message passing
• Example on pages 93-94
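Smalltalk's model of objects responding to messages can be sketched in Python, where a "message" is a method name looked up dynamically on the receiver. The classes and the `send` helper below are invented for the illustration:

```python
# Message passing in the Smalltalk spirit: the sender names a message,
# and whatever object receives it decides how to respond.

class Circle:
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Square:
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

def send(receiver, message, *args):
    """Dynamically dispatch a message to the receiving object."""
    return getattr(receiver, message)(*args)

for shape in [Circle(1), Square(2)]:
    print(send(shape, "area"))   # each object answers in its own way
```

Because dispatch happens at run time, new classes that understand `area` can be added without changing the sending code, which is the heart of the object/message model.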
37. MetaLanguage
Functional Language Interest Continues
• 1980s
• Robin Milner at the University of Edinburgh, as a metalanguage for a program verification system named Logic for Computable Functions (LCF)
• ML (MetaLanguage)
• Functional, but supports imperative programming
• The syntax of ML resembles that of imperative languages such as Java and C++.
38. Ada
• 1983
• DoD
• The most extensive and expensive language design effort.
• Over half of the applications of computers in the DoD were embedded systems.
• An embedded system is one in which the computer hardware is embedded in the device it controls or for which it provides services.
• More than 450 different programming languages were in use for DoD projects, and none of them was standardized by the DoD.
Features
• Packages
• Exception handling
• Generics
• Concurrency support
39. COMMON LISP
• 1984
• Designed to combine features of a number of different dialects of LISP that were developed during the 70s and 80s.
• Large and complex
• Allows both dynamic and static scoping
• Its basis is pure LISP: its syntax, primitive functions, and fundamental nature come from that language.
40. Miranda
• 1984
• David Turner
• Based on ML, SASL, and KRC
• Functional: no variables, no assignment statement
• Has the distinctive feature of lazy evaluation: no expression is evaluated until its value is required
• Haskell is based on Miranda
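Lazy evaluation as found in Miranda (and Haskell) can be approximated with Python generators, which also compute nothing until a value is demanded. A minimal sketch, with invented generator names:

```python
# Laziness via generators: an "infinite" sequence is fine as long as
# only finitely many elements are ever demanded.

from itertools import islice

def naturals():
    n = 0
    while True:        # conceptually infinite
        yield n
        n += 1

def squares():
    for n in naturals():
        yield n * n    # each square computed only on demand

first_five = list(islice(squares(), 5))
print(first_five)  # [0, 1, 4, 9, 16]
```

Miranda applies this evaluation strategy to every expression in the language; the generator is only a local, opt-in analogue.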
41. C++
• Bjarne Stroustrup at Bell Labs
• Made the first step from C to C++ with the "C with Classes" language in 1983
• Progression:
• 1980: Addition of function parameter type checking and conversion; classes like those of SIMULA 67 and Smalltalk; derived classes, public/private access, constructors/destructors, friends
• 1981: Inline functions, default parameters, overloading
• 1984: Named C++; virtual methods, dynamic binding of method calls to method definitions, reference types
• 1985: First available implementation, named Cfront, which translated C++ programs into C programs
• Continued to evolve: multiple inheritance, abstract classes, templates (which provide parameterized types), and exception handling
• 2002: .NET
• The primary goal was to provide a language in which programs could be organized as they could be organized in SIMULA 67, that is, with classes and inheritance.
44. Fortran 90
• 1990
• Fortran 90
• Dynamic arrays
• Records
• Pointers
• Multiple selection statement
• Modules
• Recursion
• Obsolescent-features list
• Dropped the fixed-format code requirement
• Fortran vs. FORTRAN: the earlier convention wrote keywords and identifiers in uppercase
Fortran 95
• 1995
• Fortran 95
• Forall added to aid parallelization
45. Java
• 1990
• Sun Microsystems
• Determined there was a need for a programming language for embedded consumer electronic devices.
• Java’s designers started with C++, removed some constructs, changed some, and added a few others.
• Reliability was a key requirement for software in consumer electronic products.
46. Ada 95
• Features
• Adding new components to those inherited from a base class (inheritance)
• Dynamic binding of subprogram calls to subprogram definitions (polymorphism)
• Protected objects
• Success was hindered by the widespread acceptance of C++ for object-oriented programming, which occurred before Ada 95 was released.
47. Scripting Languages
• sh (shell) – a small collection of commands interpreted as calls to system subprograms to perform utility functions, with added variables, control flow statements, functions, etc.
• ksh (Korn shell) – David Korn, ’95
• awk – Al Aho, Brian Kernighan, Peter Weinberger (’88); began as a report generation language
• Tcl – John Ousterhout, ’94
• Perl – Larry Wall; designed as a UNIX tool for processing text files. Also used as a Common Gateway Interface (CGI) language.
48. Scripting Languages
• Perl – Larry Wall
• Combination of sh and awk
• Statically typed variables: $ for scalars, @ for arrays, % for hashes
• Arrays can be dynamic and sparse
• Some dangers
• If a string is used in a numeric context and the string cannot be converted to a number, zero is used without warning
• Array indexing cannot be checked, since there is no set subscript range. References to non-existent elements return undef, which is interpreted as 0 in a numeric context.
51. C#
• Based on C++ and Java, but includes some ideas from Delphi and Visual Basic.
• Lead designer: Anders Hejlsberg
• He also designed Turbo Pascal and Delphi, which explains the Delphi parts of the heritage of C#.
• The purpose of C# is to provide a language for component-based software development, specifically for such development in the .NET Framework.