This document discusses artificial intelligence and machine learning in Africa. It provides a brief history of AI from Greek mythology to modern times. Key figures discussed include Charles Babbage, Ada Lovelace, Alan Turing, and researchers at the 1956 Dartmouth Conference. The document defines what AI is and is not, and classifies AI types. It also discusses machine learning, big data, data mining, and deep learning. It notes that bridging the gap between African and Western tech students requires interview skills, competitions, progressive work experience, impact knowledge, university education, and being timely with knowledge.
Techexpo 2017
3. DIGITAL SKILLS FOR AFRICA
EXPEDITION + TECH EXPO
THEME
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING: NEW FRONTIERS
IN AFRICAN TECHNOLOGY
SUB-THEME
BRIDGING THE GAP BETWEEN AN AVERAGE AFRICAN TECH STUDENT
AND A WESTERN TECH STUDENT
MEKULEYI MICHAEL (Mechanical Engineering)
NAGURA JONAH (Physics)
4. BRIEF HISTORY OF A.I
(7 core stages of development)
• Greek Mythology
According to the Greeks, Hephaestus (son of Zeus and Hera)
pioneered the invention of metallic automata, creating automated
drink trolleys and the Khryseoi tripods (a set of 20 wheeled devices)
that the gods used to move themselves in and out of Olympus
during ceremonies.
• Charles Babbage and Ada Lovelace
Babbage designed the Difference Engine (1822) and the Analytical
Engine (a design he never completed). His model of a computer was
not built in his lifetime; engineers later saw the usefulness of his
design, fashioned the computer after it, and named him the Father
of Computing. Ada Lovelace (not an Igbo lady, but the daughter of
Lord Byron) is regarded, with little dispute, as the first computer
programmer and debugger.
6. Professor Alan Turing
• Famous code breaker, computer scientist and mechanical
engineer, renowned for breaking the German 'Enigma' code and
shortening the Second World War by an estimated two years.
• Widely proclaimed the Father of Modern Computer Science,
Cryptography and Logic. Turing designed and built the 'Bombe'
machines.
• Turing defined the primitives for an ideal, logical computer.
• Turing, together with his PhD thesis tutor, devised 'The Turing
Test', arguably the most important test for strong AI to this day.
9. THE DARTMOUTH CONFERENCE
In the summer of 1956, scientists including John
McCarthy, Marvin Minsky, Claude Shannon and Nathaniel
Rochester held a conference on the conjecture
"that every…feature of intelligence can in principle be
so precisely described that a machine can be made
to simulate it."
However, funding for AI later dropped significantly because
key investors realised it would take much longer to
develop AI that could do what they wanted it to do. This
period was tagged the "AI winter".
10. AI in the 1980s
• In the early 1980s, AI research was revived by
the commercial success of expert systems, a
form of AI program that simulated the knowledge
and analytical skills of human experts. By 1985
the market for AI had reached over a billion
dollars.
• At the same time, Japan's fifth-generation
computer project inspired the U.S. and British
governments to restore funding for academic
research.
11. WHAT ARTIFICIAL INTELLIGENCE IS NOT
• Artificial Intelligence is not wide-scale programming to do tasks that
man cannot do.
• Artificial Intelligence is not robotics or automation (note that some
robots may be embedded with some artificial intelligence abilities).
• Artificial Intelligence is not a repetitive loop, nor is it a set of
machines that are programmed only to do tasks or perform certain
operations.
• Artificial Intelligence is not some complex machine or voice-only
interface (even though some aspects of machine learning support
such interfaces).
• Artificial Intelligence is not simply computers that can do 35 billion
operations at once (those are supercomputers!).
• Artificial Intelligence does not necessarily involve much hardware;
it could be entirely software on an interface.
• Artificial Intelligence is much broader than any of us can imagine.
13. WHAT IS ARTIFICIAL INTELLIGENCE
• In computer science, AI research is defined as the study of "intelligent
agents": any device that perceives its environment and takes actions that
maximise its chance of success at some goal (a small sketch of this loop
follows this list).
• Colloquially, the term "artificial intelligence" is applied when a machine
mimics "cognitive" functions that humans associate with other human minds,
such as "learning" and "problem solving". (Wikipedia)
• AI is characterised as a machine with some thinking faculty: a computer with
such neural connectivity that it can take decisions on its own.
• Computers were generally classified as having an IQ of 0.
• What question birthed AI?
Is it possible to understand the human brain so well that we could model
machines to think and reason like humans?
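To make the "intelligent agent" definition above a little more concrete, here is a minimal, non-authoritative Python sketch of a perceive-decide-act loop. The percepts, the action set and the scoring function are all hypothetical, invented purely for illustration.

def expected_success(percept, action):
    # Hypothetical scoring: the action closest to the percept is assumed
    # to give the best chance of success at the goal.
    return -abs(percept - action)

def choose_action(percept, actions):
    # The agent picks the action that maximises its expected success.
    return max(actions, key=lambda a: expected_success(percept, a))

# Toy run: percepts arrive from the environment and the agent acts on each.
for percept in [0.2, 0.7, 0.5]:
    action = choose_action(percept, actions=[0.0, 0.5, 1.0])
    print(f"percept={percept} -> action={action}")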
14. ARTIFICIAL INTELLIGENCE CLASSIFICATION
• Weak AI (narrow AI) – non-sentient machine intelligence,
typically focused on one narrow task, e.g. Siri.
• Strong AI – (hypothetical) sentient machine (with
consciousness and mind), e.g. nanobots.
• Artificial general intelligence (AGI) – (hypothetical)
machine with the ability to apply intelligence to any
problem, rather than just one specific problem, typically
meaning "at least as smart as a typical human".
• Superintelligence – (hypothetical) artificial intelligence far
surpassing that of the brightest and most gifted human
minds.
16. Machine Learning And Data Science
• Machine learning grew out of the quest for artificial intelligence.
• Machine learning is a field of computer science that gives
computers the ability to learn without being explicitly
programmed (you see the key part of cognitive display?). A short
sketch of this idea follows this list.
• Machine learning explores the study and construction of algorithms
that can learn from and make predictions on data.
• Such algorithms go beyond strictly static program instructions by
making data-driven predictions or decisions, through building a
model from sample inputs.
• The major fuel for machine learning is data, specifically big data.
• Data is like the delicious icing on the cake.
• While artificial intelligence is the new electricity, data is the new fuel.
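As a rough sketch of "learning without being explicitly programmed", the toy Python example below builds a model from sample inputs instead of hand-written rules and then makes a data-driven prediction. It assumes NumPy and scikit-learn are installed; the numbers are made up for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression

# Sample inputs (hours studied) and outputs (test scores) - toy, made-up data.
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([52, 58, 65, 71, 78])

# No rule relating hours to scores is written by hand;
# the model is learned from the sample data.
model = LinearRegression().fit(X, y)

# Data-driven prediction for an input the program was never explicitly told about.
print(model.predict([[6]]))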
18. WHAT IS BIG DATA
• Big data is a term applied to data sets whose
size or type is beyond the ability of traditional
relational databases to capture, manage, and
process with low latency.
• Up until 2005, humans had created about 130
exabytes of data (130,000,000,000 gigabytes).
• As at 2010 the number had risen to 1,200
exabytes (1,200,000,000,000 gigabytes).
• The last data census, taken in 2015, saw the number
rise to about 7,900 exabytes (7,900,000,000,000
gigabytes).
• Note: you can fit the average human genome in
about 1 gigabyte.
19. BIG DATA
• Remember in primary and secondary school when our
teachers used to say data is useless, meaningless, raw
and of no value, while information was what carried
value? Now data is one of the most expensive
commodities, with some data sets running into
tens of thousands of dollars.
20. DATA MINING AND MACHINE LEARNING
• Data mining and machine learning are two closely
related concepts.
• While machine learning seeks to make predictions based on
known properties learned from the training data,
data mining seeks to make new predictions and
postulations from unknown properties of the data
(see the short sketch below).
• Data mining focuses on the discovery of (previously)
unknown, hidden properties in the data (this is the
analysis step of knowledge discovery in databases).
Data mining uses many machine learning methods, but
with different goals.
• It is an umbrella subject covering data collation, data
preprocessing and data cleaning.
21. DATA MINING AND MACHINE LEARNING
• Data mining includes extraction, cleaning,
transformation, and storing of data into the data
warehouse.
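A rough, hypothetical sketch of the contrast drawn on the two slides above: machine learning predicts from known (labelled) properties of the training data, while data mining discovers previously unknown structure. It assumes NumPy and scikit-learn are installed and uses invented toy data.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans

X = np.array([[1.0, 1.1], [1.2, 0.9], [5.0, 5.2], [4.8, 5.1]])

# Machine learning: class labels (known properties) are supplied,
# and the model learns to predict them for new points.
labels = np.array([0, 0, 1, 1])
clf = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(clf.predict([[1.1, 1.0]]))

# Data-mining flavour: no labels are supplied; clustering discovers
# the previously unknown grouping hidden in the data.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)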
24. Bridging the Gap between African and Western
Tech Enthusiasts
• Interviews
• Competitions
• Progressive work experience
• Impact knowledge
• Universities and colleges
• Timeliness of knowledge