MACHINE LEARNING
OVERVIEW
[Diagram: Machine Learning shown as one subfield of Artificial Intelligence, alongside other fields such as Computer Vision and Expert Systems, among others]
“[Machine Learning is the] field of study that
gives computers the ability to learn without being
explicitly programmed.” – Arthur Samuel (1959)
Early Days
• 1950 – Alan Turing creates the “Turing Test”.
• 1952 – Arthur Samuel writes the first computer learning program, which
plays the game of checkers.
• 1957 – Frank Rosenblatt designs the first neural network for
computers, “the perceptron”.
• 1967 – The “nearest neighbor” algorithm is written.
Later Days
• 1979 – Students at Stanford University create the “Stanford Cart”.
• 1981 – Gerald Dejong introduces the concept of “Explanation Based
Learning” (EBL).
• 1985 – Terry Sejnowski creates NetTalk.
• 1990s – Machine learning begins to shift from knowledge-driven to
data-driven approaches.
• 1997 – IBM’s Deep Blue defeats Kasparov.
• 2006 – Geoffrey Hinton coins the term “deep learning”.
• 2010s – Big Data
“A computer program is said to learn from
experience E with respect to some class of tasks T
and performance measure P if its performance at
tasks in T, as measured by P, improves with
experience E.” – Tom Mitchell (1997)
Machine Learning
• Supervised Learning
• Semi-supervised Learning
• Unsupervised Learning
• Reinforcement Learning
Supervised Learning
Labeled data
Predictive
Regression
Classification
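The bullets above can be made concrete with a minimal sketch of supervised classification: a 1-nearest-neighbor classifier that predicts a label for a new point from labeled training data. The toy data and the function name are illustrative assumptions, not from the slides.

```python
# Supervised learning sketch: 1-nearest-neighbor classification.
# Training data is LABELED: each point comes with its class.

def nearest_neighbor_predict(train, query):
    """Return the label of the training point closest to `query`."""
    def dist2(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda pair: dist2(pair[0], query))
    return label

# Labeled data: (feature vector, label) pairs.
train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((4.8, 5.2), "B")]

print(nearest_neighbor_predict(train, (1.1, 0.9)))  # near the "A" points
print(nearest_neighbor_predict(train, (5.1, 4.9)))  # near the "B" points
```

Because every training point carries a label, the model can make a *predictive* call on unseen inputs, which is exactly what distinguishes supervised learning.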
Unsupervised Learning
Unlabeled data
Descriptive
Clustering
Association Analysis
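By contrast, a minimal sketch of unsupervised learning: k-means clustering groups unlabeled 1-D points with no target labels at all. The toy data, the choice of `k`, and the function name are illustrative assumptions.

```python
import random

# Unsupervised learning sketch: k-means on UNLABELED 1-D points.
def kmeans_1d(points, k, iters=20, seed=0):
    """Cluster 1-D points into k groups; return the sorted centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # start from k random points
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: abs(p - centroids[j]))
            clusters[i].append(p)
        # Update step: move each centroid to its cluster's mean.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

points = [1.0, 1.1, 0.9, 10.0, 10.2, 9.8]
print(kmeans_1d(points, k=2))  # two centroids, one near each group
```

The algorithm only *describes* structure already present in the data (two groups, near 1 and near 10); no labels are predicted, which is what makes it descriptive rather than predictive.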
Semi-supervised Learning
Nowadays
• 2006 – Geoffrey Hinton coins the term “deep learning”.
• 2010 – The Kinect can track 20 human features 30 times per second.
• 2010s – Big Data.
• 2016 – AlphaGo defeats Lee Sedol.
#48 Machine learning

Editor's Notes

  • #4 Machine learning is not used only in AI.
  • #9 1950 — Alan Turing creates the “Turing Test” to determine if a computer has real intelligence. To pass the test, a computer must be able to fool a human into believing it is also human. 1952 — Arthur Samuel wrote the first computer learning program. The program was the game of checkers, and the IBM computer improved at the game the more it played, studying which moves made up winning strategies and incorporating those moves into its program. 1957 — Frank Rosenblatt designed the first neural network for computers (the perceptron), which simulates the thought processes of the human brain. 1967 — The “nearest neighbor” algorithm was written, allowing computers to begin using very basic pattern recognition. This could be used to map a route for traveling salesmen, starting at a random city but ensuring they visit all cities during a short tour. 1990s — Work on machine learning shifts from a knowledge-driven approach to a data-driven approach. Scientists begin creating programs for computers to analyze large amounts of data and draw conclusions — or “learn” — from the results. 1997 — Deep Blue versus Garry Kasparov was a pair of six-game chess matches between world chess champion Garry Kasparov and an IBM supercomputer called Deep Blue. The first match was played in Philadelphia in 1996 and won by Kasparov. The second was played in New York City in 1997 and won by Deep Blue. The 1997 match was the first defeat of a reigning world chess champion by a computer under tournament conditions.
  • #10 1979 — Students at Stanford University invent the “Stanford Cart” which can navigate obstacles in a room on its own. 1981 — Gerald Dejong introduces the concept of Explanation Based Learning (EBL), in which a computer analyses training data and creates a general rule it can follow by discarding unimportant data. 1985 — Terry Sejnowski invents NetTalk, which learns to pronounce words the same way a baby does. 1990s — Work on machine learning shifts from a knowledge-driven approach to a data-driven approach. Scientists begin creating programs for computers to analyze large amounts of data and draw conclusions — or “learn” — from the results. 1997 — Deep Blue versus Garry Kasparov was a pair of six-game chess matches between world chess champion Garry Kasparov and an IBM supercomputer called Deep Blue. The first match was played in Philadelphia in 1996 and won by Kasparov. The second was played in New York City in 1997 and won by Deep Blue. The 1997 match was the first defeat of a reigning world chess champion by a computer under tournament conditions. 2006 — Geoffrey Hinton coins the term “deep learning” to explain new algorithms that let computers “see” and distinguish objects and text in images and videos. Big Data — Statistical AI is a centerpiece of Big Data analysis: as a result of the exponential growth in the amount of data that is available for scientific research, the sciences are on the brink of huge changes. That applies to all disciplines. In biology, for example, there will shortly be around 1 Exabyte of genomics data (10 to the power of 18 bytes) in the world. In 2024 the next generation of radio telescopes will produce in excess of 1 Exabyte per day. To deal with this data deluge, a new scientific discipline is taking shape. Big Data Science aims to develop new methods to store those substantial amounts, and to quickly find, analyze and validate complex patterns in Big Data.
  • #19 Sound wave language
  • #24 2006 — Geoffrey Hinton coins the term “deep learning” to explain new algorithms that let computers “see” and distinguish objects and text in images and videos. 2010s — Big Data. Statistical AI is a centerpiece of Big Data analysis: as a result of the exponential growth in the amount of data that is available for scientific research, the sciences are on the brink of huge changes. That applies to all disciplines. In biology, for example, there will shortly be around 1 Exabyte of genomics data (10 to the power of 18 bytes) in the world. In 2024 the next generation of radio telescopes will produce in excess of 1 Exabyte per day. To deal with this data deluge, a new scientific discipline is taking shape. Big Data Science aims to develop new methods to store those substantial amounts, and to quickly find, analyze and validate complex patterns in Big Data.