The document provides a history of programming languages from the 1940s to the projected year 2100. It discusses early pioneers and the first programming language as well as the evolution of programming languages throughout each decade. Popular modern languages discussed include Python, Java, R, Julia, Lisp, JavaScript, C++, and Mojo in the context of artificial intelligence and machine learning. The document takes a high-level view of the evolution of programming over eight decades from the 1940s to 2000s and looks ahead to future trends.
Optimizing Performance in Rust for Low-Latency Database Drivers - ScyllaDB
The process of optimizing shard-aware drivers for ScyllaDB has involved several initiatives, often necessitating a complete rewrite from the ground up. Discover the efforts put into enhancing the performance of ScyllaDB drivers with a focus on Rust, and how its code base will serve as a foundation for drivers using other language bindings in the future. This session emphasizes the performance gains achieved by harnessing the power of the asynchronous Tokio framework as the backbone of a new, high-performance driver while thoughtfully architecting and optimizing various components of the driver.
Docker Desktop is probably the most common way to work with Linux containers on Windows 10. Microsoft also continues to improve Windows 10 and, with the Windows Subsystem for Linux, offers a very good platform for working natively with Linux. Microsoft is currently preparing WSL 2, a new version that for the first time ships a real Linux kernel. This makes it possible to run the Docker Engine under WSL. Microsoft and Docker are working closely together to enable running Linux containers under WSL 2 as smoothly as possible. This talk will show what is going to improve in Docker Desktop on the upcoming Windows 10 version 20H1. The current status can already be tried out with the Windows Insider program and the Technical Preview of Docker Desktop.
Kernel Recipes 2015: Linux Kernel IO subsystem - How it works and how can I s... - Anne Nicolas
Understanding how the Linux kernel IO subsystem works is key to analyzing a wide variety of issues that occur when running a Linux system. This talk aims to help Linux users understand what is going on and how to get more insight into what is happening.
First we present an overview of the Linux kernel block layer, including the different IO schedulers. We also talk about the new block multiqueue implementation, which is being used for more and more devices.
After surveying the basic architecture, we will be prepared to talk about tools for peeking into it. We start with lightweight monitoring such as iostat and continue with the heavier blktrace and the variety of tools based on it. We demonstrate the use of these tools in the analysis of real-world issues.
Jan Kara, SUSE
Intel® Software Guard Extensions (Intel® SGX) is Intel’s Trusted Execution Environment for client and data center. It provides the foundation for many secure use cases.
Inexpensive Datamasking for MySQL with ProxySQL — Data Anonymization for Deve... - Ontico
HighLoad++ 2017
"Cape Town" hall, November 8, 16:00
Abstract:
http://www.highload.ru/2017/abstracts/3115.html
During this session we will cover the latest developments in ProxySQL's support for regular expressions (RE2 and PCRE) and how this powerful technique can be combined with ProxySQL's query rules to anonymize live data quickly and transparently. We will explain the mechanism and how to generate these rules quickly. We will show a live demo with all the challenges we received from the community, and we will finish the session with an interactive brainstorm, testing queries from the audience.
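To make the mechanism concrete, here is a minimal Python sketch of the rewrite idea behind regex-based query rules: a match pattern and a replace pattern transform queries on the fly so sensitive columns come back masked. The rule contents and column names below are illustrative assumptions, not ProxySQL's actual configuration syntax.

```python
import re

# Hypothetical masking rules: each pair mimics a proxy-layer
# (match_pattern, replace_pattern) query rule.
MASK_RULES = [
    # Wrap the email column in a masking expression before it leaves the DB.
    (re.compile(r"\bSELECT\s+email\b", re.IGNORECASE),
     "SELECT CONCAT(LEFT(email, 2), '***')"),
]

def rewrite(query: str) -> str:
    """Apply each masking rule in order, as a proxy layer would."""
    for pattern, replacement in MASK_RULES:
        query = pattern.sub(replacement, query)
    return query

print(rewrite("SELECT email FROM users"))
# -> SELECT CONCAT(LEFT(email, 2), '***') FROM users
```

In the real setup the rewriting happens inside the proxy, so applications and developers see masked data without any client-side changes.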
Jim Huang (jserv) from 0xlab.org prepared this technical training on ARM and SoC. Part I introduces an overview of the ARM architecture and family, ISA features, an SoC overview, and several practical approaches, using the XScale SoC as an example.
Note: When you view the slide deck in a web browser, the screenshots may appear blurred. You can download the deck and view it offline (the screenshots are clear).
The major programming languages have evolved over several decades, beginning in the 1940s-1950s with assembly languages and languages like Fortran for scientific computing. Key developments included ALGOL in the late 1950s which introduced block structures, LISP for AI in the late 1950s, COBOL for business in the late 1950s, and BASIC for education in the 1960s. Object-oriented concepts emerged in the 1960s with Simula and became widespread in the 1980s and 1990s with languages like C++, Smalltalk, and Java. Functional programming concepts gained prominence with LISP and languages like ML and Haskell. Modern scripting languages aided system administration tasks. Each new generation of languages incorporated ideas from its predecessors.
This document provides information about a programming languages concepts course, including details about the course code, title, lecturer, and description. The course aims to describe the evolution of programming languages and understand different computation models or paradigms like imperative, functional, logic and object-oriented programming. It will cover topics like syntax, semantics, data types, expressions and control structures over 13 weeks. Students will complete an assignment on MATLAB/Octave and two term exams. The course objectives are listed as understanding different programming language paradigms and concepts to select the proper language to solve problems.
The document provides an overview of the history and evolution of various programming languages. It discusses early languages like FORTRAN, LISP, PASCAL, C, and Java. It also covers scripting languages and their uses. The document explains what Python is as a programming language - that it is interpreted, object-oriented, and high-level. It was named after Monty Python and was created by Guido van Rossum. The document then gives examples of using Python to program Minecraft by importing protein data from PDB files and using coordinates to place blocks to visualize proteins in the game.
The document provides an overview of using Python for bioinformatics, discussing what Python is, why it is useful for bioinformatics, how to set up Python in integrated development environments like Eclipse with PyDev, how to share code using Git and GitHub, and includes examples of Hello World and bioinformatics programs in Python. It introduces Python and argues it is well-suited for bioinformatics due to its extensive standard libraries, ease of use, and wide adoption in science. The document demonstrates how to install Python, set up an IDE, create and run simple Python programs, and use version control with Git and GitHub to collaborate on projects.
- Toolbars: quick access to common actions
- Views: panels for navigating code, files, tasks etc.
- Editor: where code is written and edited
- Console: output from running code
- Debug Perspective: tools for debugging code
- Project Explorer: navigating files and folders
- Outline: structure of current editor
- Problems: errors and warnings
- Properties: details of selected item
- PyDev Perspective: Python-specific tools
- Run/Debug Buttons: run and debug code
- Status Bar: status messages
- Welcome Page: getting started tips
- Help: documentation and context-sensitive help
The talk was given at a seminar of the InfinIT interest group Højniveausprog til Indlejrede Systemer (High-Level Languages for Embedded Systems) on October 2, 2013. Read more about the interest group here: http://infinit.dk/dk/interessegrupper/hoejniveau_sprog_til_indlejrede_systemer/hoejniveau_sprog_til_indlejrede_systemer.htm
This document discusses different programming domains including scientific applications, business applications, artificial intelligence, systems programming, and web software. It provides details on each domain, such as scientific applications using floating point computations and Fortran being the first language. Business applications used special computers and languages like COBOL. Artificial intelligence uses symbolic rather than numeric computations and LISP was the first widely used AI language. Systems programming requires efficient languages like C and C++. Web software uses an eclectic collection of languages and dynamic content is provided by embedding code in HTML documents.
This document outlines the syllabus for an Object Oriented Programming course in Java. The course aims to teach students the principles of OOP using Java, including classes, objects, inheritance, polymorphism, exceptions handling, and more. It covers topics like Java basics, GUI programming, JDBC, collections, and multithreading. The course objectives are to enable students to write Java programs that solve problems and demonstrate good OOP skills. The content includes revising Java principles, learning about classes, interfaces, packages and more.
This document outlines the syllabus for an Object Oriented Programming course in Java. The course aims to teach students the principles of OOP using Java, including classes, objects, inheritance, polymorphism, interfaces, exceptions handling, and more. It will cover both programming concepts in Java as well as GUI, database connectivity, collections, and multithreading. The course content is presented over several classes and includes lectures, practical sessions, and assessed exercises.
Konrad Zuse developed Plankalkül in 1941 as the first high-level programming language for his Z3 computer. John Mauchly designed Short Code, the first higher-level language used for a computer, in 1949. Grace Hopper created the A-0 system in 1951, one of the first compiler systems. FORTRAN was developed in the 1950s to simplify scientific programming. COBOL was created in 1959 to standardize business programming. ALGOL influenced many languages with its focus on algorithm description. Simula 67 introduced important object-oriented concepts like classes and inheritance.
Smalltalk was the first full implementation of an object-oriented language with features like abstraction, inheritance, and dynamic binding. C++ combined imperative and object-oriented programming, growing rapidly in popularity along with OOP. Java eliminated unsafe features of C++ while adding support for applets and concurrency.
History of Computer Programming Languages.pptx - AliAbbas906043
The document discusses the history and development of programming languages from the first algorithm created by Ada Lovelace in 1843 to modern languages. It outlines several important events and milestones, including the first assembly language in 1949, FORTRAN in 1957, COBOL and ALGOL in 1959, BASIC and PASCAL in the 1960s-1970s, C and SQL in 1972, Ada in the 1980s, Java and JavaScript in 1995, and Swift in 2014. The document concludes that programming languages have come a long way from early machine codes to today's high-level, readable languages and will likely continue advancing in the future.
Course: Programming Languages and Paradigms:
A brief introduction to imperative programming principles: history, von Neumann, BNF, variables (r-values, l-values), modifiable data structures, order of evaluation, static and dynamic scopes, referencing environments, call by value, control flow (sequencing, selection, iteration), ...
Information about the level of programming language, types of programming language, the principal paradigms, few programming languages, criteria for good language.
This document provides an overview and introduction to a course on principles of compiler design. It discusses the motivation for studying compilers, as language processing is important for many software applications. It outlines what will be covered in the course, including the theoretical foundations and practical techniques for developing lexical analyzers, parsers, type checkers, code generators, and more. The document also describes the organization of the course with lectures, programming assignments, and exams.
The document discusses programming languages and their importance. It covers the following key points:
- A programming language allows computation to be described in both machine-readable and human-readable form. Most languages today are high-level languages.
- Studying programming languages improves one's ability to choose the right language for a task, learn new languages, and better understand how language features are implemented.
- Major programming domains include scientific, business, artificial intelligence, systems, and web applications. Each domain utilizes languages suited to its particular needs and purposes.
Unit 4 Assignment 1 Comparative Study Of Programming... - Carmen Sanborn
- The goal is to design a new programming language by combining common qualities from two existing languages.
- When designing a new language, it is important to consider aspects like syntax, semantics, data types, control structures, modularity, and libraries/frameworks.
- The language design should aim to take useful features from other languages while avoiding their shortcomings to create a language that is efficient, readable, and meets modern programming needs.
End-to-end pipeline agility - Berlin Buzzwords 2024 - Lars Albertsson
We describe how we achieve high change agility in data engineering by eliminating the fear of breaking downstream data pipelines through end-to-end pipeline testing, and by using schema metaprogramming to safely eliminate boilerplate involved in changes that affect whole pipelines.
A quick poll on agility in changing pipelines from end to end indicated a huge span in capabilities. For the question "How long does it take for all downstream pipelines to be adapted to an upstream change," the median response was six months, but some respondents could do it in less than a day. When quantitative data engineering differences between the best and worst are measured, the span is often 100x-1000x, sometimes even more.
A long time ago, we suffered at Spotify from fear of changing pipelines due to not knowing what the impact might be downstream. We made plans for a technical solution to test pipelines end-to-end to mitigate that fear, but the effort failed for cultural reasons. We eventually solved this challenge, but in a different context. In this presentation we will describe how we test full pipelines effectively by manipulating workflow orchestration, which enables us to make changes in pipelines without fear of breaking anything downstream.
Making schema changes that affect many jobs also involves a lot of toil and boilerplate. Using schema-on-read mitigates some of it, but has drawbacks since it makes it more difficult to detect errors early. We will describe how we have rejected this tradeoff by applying schema metaprogramming, eliminating boilerplate but keeping the protection of static typing, thereby further improving agility to quickly modify data pipelines without fear.
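The schema-metaprogramming idea described above can be sketched in a few lines: downstream schemas are derived from the upstream schema by transformation functions, so a field added upstream propagates automatically instead of being hand-edited in N places. All schema, field, and function names here are hypothetical illustrations, not the actual system described in the talk.

```python
# Hypothetical upstream schema: field name -> type.
UPSTREAM_SCHEMA = {
    "user_id": "string",
    "timestamp": "long",
    "country": "string",
}

def anonymised(schema: dict) -> dict:
    """Derive a privacy-safe variant: drop direct identifiers, keep the rest."""
    return {name: typ for name, typ in schema.items() if name != "user_id"}

def with_partition(schema: dict) -> dict:
    """Derive the stored variant: the pipeline adds a partition date column."""
    return {**schema, "date": "string"}

# Downstream schemas are computed, not hand-written; a new upstream
# field flows through both derivations with no boilerplate edits,
# while remaining statically checkable before deployment.
EXPORT_SCHEMA = with_partition(anonymised(UPSTREAM_SCHEMA))
print(EXPORT_SCHEMA)
```

This is the tradeoff rejection the abstract mentions: unlike schema-on-read, the derived schemas still exist as concrete, typed artifacts that tooling can validate early.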
Beyond the Basics of A/B Tests: Highly Innovative Experimentation Tactics You... - Aggregage
This webinar will explore cutting-edge, less familiar but powerful experimentation methodologies which address well-known limitations of standard A/B Testing. Designed for data and product leaders, this session aims to inspire the embrace of innovative approaches and provide insights into the frontiers of experimentation!
The Ipsos - AI - Monitor 2024 Report.pdf - Social Samosa
According to Ipsos AI Monitor's 2024 report, 65% of Indians said that products and services using AI have profoundly changed their daily lives in the past 3-5 years.
4th Modern Marketing Reckoner by MMA Global India & Group M: 60+ experts on W... - Social Samosa
The Modern Marketing Reckoner (MMR) is a comprehensive resource packed with POVs from 60+ industry leaders on how AI is transforming the 4 key pillars of marketing – product, place, price and promotions.
Open Source Contributions to Postgres: The Basics POSETTE 2024 - ElizabethGarrettChri
Postgres is the most advanced open-source database in the world and it's supported by a community, not a single company. So how does this work? How does code actually get into Postgres? I recently had a patch submitted and committed and I want to share what I learned in that process. I’ll give you an overview of Postgres versions and how the underlying project codebase functions. I’ll also show you the process for submitting a patch and getting that tested and committed.
ViewShift: Hassle-free Dynamic Policy Enforcement for Every Data Lake - Walaa Eldin Moustafa
Dynamic policy enforcement is becoming an increasingly important topic in today’s world, where data privacy and compliance are a top priority for companies, individuals, and regulators alike. In these slides, we discuss how LinkedIn implements a powerful dynamic policy enforcement engine, called ViewShift, and integrates it within its data lake. We show the query engine architecture and how catalog implementations can automatically route table resolutions to compliance-enforcing SQL views. Such views have a set of very interesting properties: (1) they are auto-generated from declarative data annotations; (2) they respect user-level consent and preferences; (3) they are context-aware, encoding a different set of transformations for different use cases; and (4) they are portable: while the SQL logic is implemented in only one SQL dialect, it is accessible in all engines.
#SQL #Views #Privacy #Compliance #DataLake
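The idea of auto-generating a compliance-enforcing view from declarative annotations can be sketched roughly as follows. The annotation names, column names, and masking expressions below are assumptions for illustration; they are not ViewShift's actual annotation vocabulary or generated SQL.

```python
# Hypothetical column policies declared as data annotations.
ANNOTATIONS = {
    "member_id": "pseudonymize",   # replace with a one-way hash
    "email":     "redact",         # hide the value entirely
    "song_id":   None,             # no restriction
}

# Policy -> SQL masking expression template (illustrative).
MASKS = {
    "pseudonymize": "SHA2({col}, 256) AS {col}",
    "redact":       "NULL AS {col}",
    None:           "{col}",
}

def build_view(table: str, annotations: dict) -> str:
    """Emit a SQL view that applies each column's declared policy."""
    cols = ",\n  ".join(
        MASKS[policy].format(col=col) for col, policy in annotations.items()
    )
    return f"CREATE VIEW {table}_compliant AS\nSELECT\n  {cols}\nFROM {table};"

print(build_view("plays", ANNOTATIONS))
```

Because the view is generated from the annotations rather than written by hand, routing table resolutions to the `_compliant` view gives every query engine the same enforced policy with no per-query changes.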
Orchestrating the Future: Navigating Today's Data Workflow Challenges with Ai... - Kaxil Naik
Navigating today's data landscape isn't just about managing workflows; it's about strategically propelling your business forward. Apache Airflow has stood out as the benchmark in this arena, driving data orchestration forward since its early days. As we dive into the complexities of our current data-rich environment, where the sheer volume of information and its timely, accurate processing are crucial for AI and ML applications, the role of Airflow has never been more critical.
In my journey as the Senior Engineering Director and a pivotal member of Apache Airflow's Project Management Committee (PMC), I've witnessed Airflow transform data handling, making agility and insight the norm in an ever-evolving digital space. At Astronomer, our collaboration with leading AI & ML teams worldwide has not only tested but also proven Airflow's mettle in delivering data reliably and efficiently—data that now powers not just insights but core business functions.
This session is a deep dive into the essence of Airflow's success. We'll trace its evolution from a budding project to the backbone of data orchestration it is today, constantly adapting to meet the next wave of data challenges, including those brought on by Generative AI. It's this forward-thinking adaptability that keeps Airflow at the forefront of innovation, ready for whatever comes next.
The ever-growing demands of AI and ML applications have ushered in an era where sophisticated data management isn't a luxury—it's a necessity. Airflow's innate flexibility and scalability are what makes it indispensable in managing the intricate workflows of today, especially those involving Large Language Models (LLMs).
This talk isn't just a rundown of Airflow's features; it's about harnessing these capabilities to turn your data workflows into a strategic asset. Together, we'll explore how Airflow remains at the cutting edge of data orchestration, ensuring your organization is not just keeping pace but setting the pace in a data-driven future.
Session in https://budapestdata.hu/2024/04/kaxil-naik-astronomer-io/ | https://dataml24.sessionize.com/session/667627
2. History of Programming Languages
• Before the 1940s: The first programmers
• The 1940s: Von Neumann, Konrad Zuse & Plankalkul
• The 1950s: The First Programming Language
• The 1960s: An Explosion in Programming Languages
• The 1970s: Simplicity, Abstraction, Study
• The 1980s: Consolidation and New Directions
• The 1990s: The Internet and the Web
• The 2000s: tbd
3. Early History: The First Programmer
• Jacquard loom of the early 1800s
– Translated card patterns into cloth designs
• Charles Babbage’s analytical engine (1830s & 40s)
– Programs were cards with data and operations
• Ada Lovelace – the first programmer
“The engine can arrange and combine its numerical quantities exactly as if they were letters or any other general symbols; and in fact it might bring out its results in algebraic notation, were provisions made accordingly.”
4. (Image slide: the Jacquard loom of the early 1800s; Charles Babbage’s analytical engine, 1830s & 40s; Ada Lovelace, the first programmer)
5. The 1940s: Von Neumann and Zuse
John von Neumann led a team that built computers with stored programs and a central processor, building on his earlier work as a consultant to the ENIAC project.
6. Konrad Zuse and Plankalkul
Konrad Zuse began work on Plankalkul (“plan calculus”), the first algorithmic programming language, with the aim of creating the theoretical preconditions for formulating problems of a general nature.
Seven years earlier, Zuse had developed and built the world’s first binary digital computer, the Z1. He completed the first fully functional program-controlled electromechanical digital computer, the Z3, in 1941. Only the Z4, the most sophisticated of his creations, survived World War II.
7. Machine Code (1940s)
• Initial computers were programmed in raw machine code.
• Programs were entirely numeric.
• What was wrong with using machine code? Everything!
– Poor readability
– Poor modifiability
– Expression coding was tedious
– Inherited the deficiencies of the hardware, e.g., no indexing or floating-point numbers
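The tedium the slide describes is easy to demonstrate with a toy sketch. The opcode assignments below are invented for illustration (they belong to no real machine): the "program" is nothing but numbers, and inserting one instruction would shift every address after it.

```python
# A toy illustration of 1940s-style numeric programming.
# Hypothetical opcodes: 1 = load constant, 2 = add constant, 3 = halt.
def run(program):
    acc, pc = 0, 0          # accumulator and program counter
    while True:
        op = program[pc]
        if op == 1:         # load the next word into the accumulator
            acc = program[pc + 1]
            pc += 2
        elif op == 2:       # add the next word to the accumulator
            acc += program[pc + 1]
            pc += 2
        elif op == 3:       # halt, yielding the accumulator
            return acc

# Computes 40 + 2, though you would never guess that from the numbers alone.
result = run([1, 40, 2, 2, 3])
print(result)  # 42
```

The readability complaint is the point: nothing in `[1, 40, 2, 2, 3]` signals intent, which is exactly what the higher-level notations on the following slides set out to fix.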
8. Pseudocodes (1949)
• Short Code (SHORTCODE) – John Mauchly, 1949
• A pseudocode interpreter for math problems, on Eckert and Mauchly’s BINAC and later on UNIVAC I and II
• Possibly the first attempt at a higher-level language
• Expressions were coded and evaluated left to right
9. More Pseudocodes
Speedcoding, 1953-4
• A pseudocode interpreter for math on the IBM 701 and IBM 650
• Developed by John Backus
• Pseudo-ops for arithmetic and math functions
• Conditional and unconditional branching
• Auto-increment registers for array access
• Slow, but execution time was dominated by the software math routines anyway
• The interpreter left only 700 words of memory for the user program
Laning and Zierler System, 1953
• Implemented on the MIT Whirlwind computer
• First “algebraic” compiler system
• Subscripted variables, function calls, expression translation
• Never ported to any other machine
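The "coded left to right" evaluation rule that these pseudocode systems used can be sketched in a few lines. This is a modern illustration in the spirit of Short Code, not Mauchly's actual encoding: there is no operator precedence, so multiplication does not bind tighter than addition.

```python
# Strict left-to-right expression evaluation, as in early pseudocode
# interpreters (the token format here is invented for illustration).
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_left_to_right(tokens):
    """Evaluate e.g. [2, '+', 3, '*', 4] as ((2 + 3) * 4)."""
    value = tokens[0]
    for i in range(1, len(tokens), 2):
        value = OPS[tokens[i]](value, tokens[i + 1])
    return value

print(eval_left_to_right([2, "+", 3, "*", 4]))  # 20, not the 14 of modern precedence
```

The surprise in the last line (20 rather than 14) shows why "algebraic" systems such as Laning and Zierler's, which translated expressions with conventional precedence, were a real step forward.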
10. The 1950s: The First Programming Language
• Pseudocodes: interpreters for assembly-language-like notations
• Fortran: the first higher-level programming language
• COBOL: the first business-oriented language
• Algol: one of the most influential programming languages ever designed
• LISP: the first language to depart from the procedural paradigm
• APL: A Programming Language
11. The 1960s: An Explosion in Programming Languages
• The development of hundreds of programming languages
• PL/I, designed in 1963-4
– supposed to be all-purpose
– combined features of FORTRAN, COBOL, Algol 60, and more!
– translators were slow, huge, and unreliable
– some say it was ahead of its time...
• Algol 68
• SNOBOL
• Simula
• BASIC
12. The 1970s: Simplicity, Abstraction, Study
• Algol-W – Niklaus Wirth and C. A. R. Hoare
– a reaction against the 1960s
– simplicity
• Pascal
– small, simple, efficient structures
– designed for teaching programming
• C – 1972 – Dennis Ritchie
– aims for simplicity by reducing restrictions of the type system
– allows access to the underlying system
– interfaces with the O/S – UNIX
13. The 1980s: Consolidation and New Paradigms
• Ada
– US Department of Defense
– European team led by Jean Ichbiah (Sam Lomonaco was also on the Ada team :-)
• Functional programming
– Scheme, ML, Haskell
• Logic programming
– Prolog
• Object-oriented programming
– Smalltalk, C++, Eiffel
14. Functional Programming
• Common Lisp: a consolidation of LISP dialects that spurred practical use, as did the development of Lisp Machines.
• Scheme: a simple and pure LISP-like language used for teaching programming.
• Logo: used for teaching young children how to program.
• ML (Meta Language): a strongly typed functional language first developed by Robin Milner in the 70s.
• Haskell: a polymorphically typed, lazy, purely functional language.
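Two ideas these languages pioneered, first-class functions (LISP, Scheme, ML) and lazy evaluation (Haskell), can be sketched in Python, which later absorbed both; the generator here only approximates Haskell-style laziness.

```python
# Functional ideas sketched in Python: functions as values, and a lazy,
# conceptually infinite sequence consumed on demand.
from itertools import islice

def compose(f, g):
    # Build a new function from two others -- functions are ordinary values.
    return lambda x: f(g(x))

inc_then_double = compose(lambda x: 2 * x, lambda x: x + 1)

def naturals():
    # An "infinite" sequence; each value is produced only when asked for.
    n = 0
    while True:
        yield n
        n += 1

first_five = list(islice(map(inc_then_double, naturals()), 5))
print(first_five)  # [2, 4, 6, 8, 10]
```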
15. Smalltalk (1972-80)
• Developed at Xerox PARC by Alan Kay and colleagues (esp. Adele Goldberg), inspired by Simula 67
• The first implementation, in 1972, was written on a bet to come up with “the most powerful language in the world” in “a single page of code”
• In 1980, Smalltalk-80, a uniformly object-oriented programming environment, became available as the first commercial release of the Smalltalk language
• Pioneered the graphical user interface everyone now uses
• Industrial use continues to the present day
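Smalltalk's "uniformly object-oriented" design means every value, even a number, responds to messages. Python later adopted the same model, so the idea can be shown there:

```python
# Smalltalk's "everything is an object" idea, as inherited by Python:
# even integer and float literals answer messages (method calls).
print((5).bit_length())      # 3 -- an integer responding to a message
print((3.75).is_integer())   # False
print("abc".upper())         # ABC
# Classes are objects too, and have a type of their own:
print(type(int).__name__)    # type
```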
16. C++ (1985)
• Developed at Bell Labs by Stroustrup
• Evolved from C and SIMULA 67
• Facilities for object-oriented programming, taken partially from SIMULA 67, were added to C
• Also has exception handling
• A large and complex language, in part because it supports both procedural and OO programming
• Rapidly grew in popularity, along with OOP
• ANSI standard approved in November 1997
17. Eiffel
• Eiffel – a related language that supports OOP
– designed by Bertrand Meyer, 1992
– not directly derived from any other language
– smaller and simpler than C++, but still has most of the power
18. The 1990s: the Internet and the Web
During the 90s, object-oriented languages (mostly C++) became widely used in practical applications.
The Internet and the Web drove several phenomena:
– Adding concurrency and threads to existing languages
– Increased use of scripting languages such as Perl and Tcl/Tk
– Java as a new programming language
19. Java
• Developed at Sun in the early 1990s, with the original goal of a language for embedded computers
• Principals: Bill Joy, James Gosling, Mike Sheridan, Patrick Naughton
• Original name, Oak, changed for trademark reasons
• Based on C++ but significantly simplified
• Supports only OOP
• Has references, but not pointers
• Includes support for applets and a form of concurrency (i.e., threads)
20. The future
• In the 60s, the dream was a single all-purpose language (e.g., PL/I, Algol)
• The 70s and 80s dream was expressed by Winograd (1979):
“Just as high-level languages allow the programmer to escape the intricacies of the machine, higher level programming systems can provide for manipulating complex systems. We need to shift away from algorithms and towards the description of the properties of the packages that we build. Programming systems will be declarative not imperative”
• Will that dream be realised?
• Programming is not yet obsolete
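Winograd's imperative/declarative contrast can be shown in miniature. The imperative version below spells out *how* to compute the result step by step; the declarative version states *what* is wanted and leaves the mechanics to the language.

```python
# Imperative vs. declarative, in miniature.
data = [3, 1, 4, 1, 5, 9, 2, 6]

# Imperative: explicit loop, mutation, bookkeeping.
evens_squared = []
for n in data:
    if n % 2 == 0:
        evens_squared.append(n * n)

# Declarative: a direct description of the result.
evens_squared_decl = [n * n for n in data if n % 2 == 0]

print(evens_squared)       # [16, 4, 36]
print(evens_squared_decl)  # [16, 4, 36]
```

SQL, Prolog, and build systems push the same shift much further, which is the direction Winograd was pointing at.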
36. Currently Trending Programming Languages
There are several programming languages that are commonly used in the fields of AI, ML, and cybersecurity. Here are some of the most popular ones:
1. Python: one of the most popular languages for AI and ML development, due to its simple syntax and readability. It supports a variety of frameworks and libraries, which allows for more flexibility and creates endless possibilities for an engineer. Some of the most popular Python libraries for machine learning include scikit-image, OpenCV, TensorFlow, PyTorch, Keras, NumPy, NLTK, SciPy, and scikit-learn.
2. Java: a general-purpose language used for creating mobile, desktop, web, and cloud applications, as well as AI systems. Java is known for its scalability, security, and cross-platform compatibility.
3. R: a language for statistical computing and graphics, widely used in data analysis, machine learning, and scientific research. R has a large number of libraries and packages that make it easy to perform complex statistical analyses.
4. Julia: a high-level, high-performance language designed for numerical and scientific computing. It is used for developing AI and ML models, as well as for data analysis and visualization.
5. Lisp: a family of languages with a long history in AI development. Lisp is known for its powerful macro system, which allows developers to extend the language itself, and for symbolic computing, which deals with symbols and expressions rather than numbers alone.
6. JavaScript: a language for creating highly interactive browser-based applications, also used for developing AI systems. JavaScript is known for its flexibility and ease of use.
7. C++: a general-purpose language used for developing AI and ML models, as well as operating systems, system software, and embedded systems. C++ is known for its speed and efficiency.
• Here are some presentations that you might find useful:
1. Machine Learning in Cyber Security – a holistic view of machine learning in cybersecurity for better organizational readiness.
2. AI and ML in Cybersecurity – discusses the limitations of machine learning, issues of explainability, where deep learning should never be applied, and examples of how the blind application of algorithms can lead to wrong results.
• Please note that the information provided is current as of January 2024 and may be subject to change.
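To ground the ML discussion above, here is the kind of task those libraries automate, reduced to its smallest form: fitting y = w·x + b by gradient descent. This sketch is pure Python so it runs anywhere; in practice you would reach for NumPy or scikit-learn from the list above.

```python
# Fitting a line to data with gradient descent on mean squared error.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]      # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.02           # weights and learning rate
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges to roughly 2.0 and 1.0
```

Everything a framework like scikit-learn or PyTorch adds (vectorization, automatic differentiation, GPU execution) is layered on top of this basic loop.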
37. Comparing the Performance of Mojo, Python, and JavaScript
Comparing the performance of Mojo, Python, and JavaScript in the context of machine learning.
According to a Medium article, Python and Mojo are two popular programming languages that have been widely used in various applications, from web development to machine learning. While Python and Mojo share some similarities, they also have notable differences that set them apart. As a developer or programmer, it is essential to understand the fundamental differences between these languages so that you can choose the one that best suits your needs.
Another Medium article compares the performance of JavaScript and Python for machine learning. The article states that JavaScript’s raw computational performance is still much better than Python’s. However, the maturity of Python’s libraries, which often have underlying modules written in C, means that operations on large datasets can offer far more than sheer computational power. Even so, there is still a place for JavaScript in machine learning.
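The point about C-backed libraries can be demonstrated with nothing but the standard library: Python's built-in `sum()` runs its loop in C, so it typically beats the equivalent loop written in pure Python by a wide margin. The exact timings below depend on the machine, so only the ordering is meaningful.

```python
# Pure-Python loop vs. a C-implemented built-in doing the same work.
import timeit

data = list(range(100_000))

def py_loop_sum(values):
    total = 0
    for v in values:
        total += v
    return total

t_pure = timeit.timeit(lambda: py_loop_sum(data), number=50)
t_builtin = timeit.timeit(lambda: sum(data), number=50)
print(f"pure-Python loop: {t_pure:.3f}s, built-in sum: {t_builtin:.3f}s")
```

This is the same effect, in microcosm, that makes NumPy and TensorFlow fast despite Python's slow interpreter: the hot loop is not executed by Python at all.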
38. Programming Languages for Civil Engineering
• Civil and structural engineering are fields that require a lot of computational power. Learning to code can help engineers automate repetitive tasks, improve their workflow, and increase their productivity. According to The Computational Engineer, the following programming languages are commonly used in the civil and structural engineering industry:
1. Grasshopper: a visual programming language that can be easily adopted by civil and structural engineers. It is a plugin for the CAD and 3D-modelling software Rhinoceros. It has a low bar to entry but is powerful enough to manage most workflows, including Revit workflows.
2. Dynamo: a popular visual programming language for building and civil engineers. It is a plugin for Autodesk Revit and can be used to automate repetitive tasks and improve workflows.
3. BHoM: a data structure and toolset for building and architecture that can be used to create custom workflows and automate tasks.
4. C#: a general-purpose programming language that is widely used in the civil engineering industry to develop software applications and tools for civil engineering projects.
• These languages have been designed with civil engineering workflows in mind and offer a lower bar to entry for civil and structural engineers. If you are new to coding, Grasshopper is a great first language to learn, as it has an easy-to-adopt and easy-to-debug interface.
39. Never Ever Ending Life History of Programming Languages
• 18 New Programming Languages to Learn in 2024 | Built In