In this insightful talk, we'll embark on a journey from the origins of programming in 1883 and the conceptualization of AI in the 1950s, to the current explosion of AI applications reshaping our world. We'll unravel why AI has surged to prominence in the last decade, driven by unprecedented data generation and significant hardware advancements. With examples ranging from individual email filtering to complex supply chain optimizations, we'll explore AI's pervasive impact across various sectors including finance, manufacturing, healthcare, and media. The talk will address the challenges of AI implementation, such as the high cost of AI teams and the quest for universally applicable models, while highlighting the promising horizon of no-code AI platforms democratizing access. Furthermore, we'll delve into the ethical dimensions of AI, from biases to privacy concerns, and the pressing question of AI's potential to replace human roles. Lastly, we'll discuss the transformative potential of language models and generative AI, underscoring the importance of understanding and integrating AI into our lives and businesses for a future that's both scalable and sustainable.
2. MEET THE SPEAKER
Nezar El-Kady
Machine Learning Lead, Synapse Analytics (Egypt & UAE)
Pr. Computer Vision Engineer, ABM (Egypt & UAE)
Machine Learning Consultant, Dronodat (Germany)
3. Agenda
1. Intro to AI history
2. Why is AI booming now?
3. Everyday AI
4. Challenges of adopting AI
5. AI reshaping different businesses
6. Why use AI?
7. Will AI replace humans? Assist vs. Replace
5. AI History
AI is not a new technology
● 1943: McCulloch-Pitts first mathematical model (binary) of a biological neuron
● 1950: Alan Turing proposed the Turing test as a measure of intelligence
● 1950s & 1960s: Birth of AI as a field & ELIZA proposed for conversation mimicking
● 1970s & 1980s: Expert Systems (knowledge with if-then rules) like XCON, alongside backpropagation
● 1990s: Machine Learning (learning from data) and the internet (Deep Blue chess engine by IBM)
● 2000s & 2010s: Big Data & Deep Learning (Neural Networks), like the AlphaGo engine by Google
● 2020s: Large Language Models and GenAI (like ChatGPT & DALL-E)
6. AI Categorization
● Narrow AI: trained and dedicated to a specific task (surpasses human-level in this task)
● General AI: manages to perform a broad range of tasks by using humanlike cognitive capabilities
● Super AI: can innovate and outperform human intelligence generally
7. Data Roles Evolution
Data roles have evolved as follows (overview):
● Early days: Data collection and storage were handled by librarians, archivists, and clerks managing paper-based records.
● 1960s-1990s: IT specialists and database administrators emerged to manage new
data infrastructure.
● 2000s: With the growth of data, data warehousing and BI roles started to emerge, and data analysts began using tools like SQL for data exploration and reporting.
● 2010s-2020s: In the Big Data era, data engineers and data scientists started to take over, building and managing large-scale data pipelines and processing systems and combining statistical expertise, programming skills, and machine learning techniques to extract knowledge from vast datasets.
8. Why is AI Booming Now?
What are the sparks behind the revolution of AI in the last 15 years?
9. AI Booming Reasons
● Hardware Advancement: Powerful GPUs and TPUs for accelerating AI model training
● Big Data: The data explosion after the invention of the internet (150 zettabytes)
● Deep Learning (NNs): Improvements in algorithms that can learn patterns from data by backpropagating errors
● Open-Source and Education: A collaborative research culture and online education platforms helped the boom (democratization)
10. Everyday AI
AI is already used by individuals every day, but not yet at the level of whole business pipelines.
11. AI Everywhere
● Social Media
● Voice Assistants
● Recommendation Engines
● Spam Filtering
● Navigation
All images are AI-generated.
12. AI Serving Whole Pipelines (Example)
● Online Ordering: Chatbots can help customers find what they are searching for and answer their questions
● Marketing / Recommendations: AI algorithms can analyze customer data to personalize marketing messages, offers, and product recommendations
● Inventory Management: AI can optimize inventory levels by predicting demand fluctuations and identifying slow-moving or obsolete inventory
● Warehouse Organizing: AI helps organize the warehouse so that products enter and leave efficiently, making good use of the space
● Routing Optimization: AI can help optimize delivery routes for faster reachability
13. Large Businesses to Small Businesses
We all know and see the effect of AI on large businesses, as it serves millions or billions of people all over the world, but what about small businesses?
AI in a bakery shop
Why don’t we use AI in every business around us?
17. AI Roles in Some Businesses
Healthcare:
- Analyze medical imaging
- Early diagnosis of diseases
- Tailored medical treatments
Finance:
- OCR for data entry
- Making financial decisions
- Fraud detection
Manufacturing:
- Detecting defects (QC)
- Predictive maintenance
- Optimizing workers' movement flow
Media:
- Content recommendations
- Visual effects (VFX)
- Voice synthesis
18. Why Use AI?
Humans can do the same tasks, so why use AI?
19. AI Benefits/Drawbacks of Usage
Benefits: Scalability, Sustainability, Efficiency
Drawbacks: Bias, Accountability, Fake Content / Copyrights / Privacy
20. Will AI Replace Humans?
Since AI can do a lot with better efficiency, sustained over time, why doesn't it replace us in our jobs?
21. Will AI Replace Some Humans in Jobs?
AI is already transforming many industries, automating routine and predictable tasks, which can lead to job displacement. However, it also creates new opportunities and can augment human capabilities, making some jobs more efficient or spawning entirely new roles.
22. Assist vs. Replace
Some jobs less likely to be replaced by AI (so far)
● Tradespeople (electricians, plumbers, hairdressers, etc.)
● Nurse Practitioners
● Physical Therapists
● Firefighters
● Chief Executives and Managerial Roles
● Surgeons, Dentists, and Neurologists
● Directors of Religious Activities
23. Assist vs. Replace
Some jobs most likely to be replaced by AI (so far)
● Routine (repetitive) jobs like data entry, basic customer service, simple manufacturing processes, etc.
● Retail checkout clerks
● Proofreading
● Market research analysts
● Receptionists
● Assembly line / manufacturing workers
● Some media creators
24. CREDITS: This presentation template was created by Slidesgo, including
icons by Flaticon, and infographics & images by Freepik.
THANKS!
Do you have any questions?
nezarelkadyy@gmail.com
+20 01117442844
Nezar El-Kady (nezarelkadyy)
Please keep this slide for attribution.
Editor's Notes
ELIZA was an early natural language processing computer program created by Joseph Weizenbaum at the Massachusetts Institute of Technology (MIT) in 1966. It's best known for its simulation of a psychotherapist, using a script called DOCTOR. The program works by applying pattern matching and substitution methodologies, which provided users with an illusion of understanding, but in reality, it was merely manipulating the user's own words to generate responses.
How ELIZA Works
Input Processing: ELIZA takes a user's input (usually typed English sentences) and processes it.
Pattern Matching: The program scans for keywords or phrases from the input and matches them to pre-programmed rules in its script. These rules determine how the program will respond.
Response Construction: Once a pattern is identified, ELIZA uses a set of scripted responses that contain placeholders. It then substitutes these placeholders with phrases or structures drawn from the user's input to create a response.
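To make the pattern-matching and substitution idea concrete, here is a minimal Python sketch in the spirit of ELIZA. The keyword rules, the reflection table, and the respond function are illustrative assumptions for this note, not Weizenbaum's actual DOCTOR script:

```python
import re

# Toy keyword rules: (regex pattern, scripted response template).
# Illustrative only -- the real DOCTOR script had many more rules,
# keyword priorities, and memory of earlier inputs.
RULES = [
    (r"I need (.*)", "Why do you need {0}?"),
    (r"I am (.*)", "How long have you been {0}?"),
    (r"because (.*)", "Is that the real reason?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
]

# Swap first and second person so the user's own words can be echoed back.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(user_input: str) -> str:
    # 1) Pattern matching: scan the rules for a keyword or phrase match.
    for pattern, template in RULES:
        match = re.search(pattern, user_input, re.IGNORECASE)
        if match:
            # 2) Response construction: fill the template's placeholders
            #    with reflected fragments of the user's own input.
            return template.format(*(reflect(g) for g in match.groups()))
    # Fallback that keeps up the illusion of listening.
    return "Please tell me more."

print(respond("I need a break"))  # -> "Why do you need a break?"
```

The point the note makes still holds in the sketch: there is no understanding involved; the program only recognizes a surface pattern and substitutes the user's own words into a canned reply.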
How XCON Works
Input: The system receives an order specifying the customer's requirements for a computer system, including various peripheral devices and software specifications.
Knowledge Base: XCON’s knowledge base contains detailed information about all possible computer components and the rules for how they can be combined. This includes data on compatibility, optimal configuration practices, and hardware requirements.
Inference Engine: Using mainly forward chaining, the inference engine applies the rules from the knowledge base to the input specifications. It works through the configuration process step-by-step, ensuring all components are compatible and meet the customer's needs.
Output: The final output is a complete and validated configuration that specifies exactly how the system should be assembled, including what parts to use and how they should be connected.
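As a rough illustration of forward chaining over if-then rules, here is a minimal Python sketch of an XCON-style configuration loop. The component names and the three rules are invented for this note and are vastly simpler than XCON's real knowledge base of thousands of VAX configuration rules:

```python
# Each rule: (name, facts required to fire, facts it adds to the configuration).
# These facts and rules are made up for illustration.
RULES = [
    ("add_disk_controller", {"order:disk_drive"}, {"config:disk_controller"}),
    ("place_controller_in_cabinet", {"config:disk_controller"}, {"config:cabinet_slot_used"}),
    ("add_power_supply", {"config:cabinet_slot_used"}, {"config:extra_power_supply"}),
]

def forward_chain(facts: set) -> set:
    """Apply rules repeatedly until no new fact can be derived (forward chaining)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for name, conditions, additions in RULES:
            # Fire the rule if its conditions hold and it still adds something new.
            if conditions <= facts and not additions <= facts:
                facts |= additions
                changed = True
                print(f"fired rule: {name}")
    return facts

# Input: the customer's order. Output: the derived configuration facts.
final_config = forward_chain({"order:disk_drive"})
print(sorted(final_config))
```

Each pass fires any rule whose conditions are satisfied and adds its conclusions to the working set of facts; the loop stops when no rule yields anything new, and the accumulated facts stand in for the validated configuration.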
- Data: 150 zettabytes (21 zeros), and about 90% of this data was collected in the last 5 years (social media, for example).
Even our interviews are now data: our words are analyzed, our facial expressions are used for emotion recognition, and even our voice tones are analyzed for speaking-behaviour analysis.
We are creating data now.
- Hardware progress: GPUs with CUDA and parallel computing, which started in 2007 and opened the gate to making use of this large amount of data.