5. AlphaGo
Go has 1.74 × 10^172 possible board positions, more than the number of atoms in the known universe.
6.
Search: Search ranking; Speech recognition
Android: Keyboard & speech input
Play: App recommendations; Game developer experience
Gmail: Smart reply; Spam classification
Drive: Intelligence in Apps
Chrome: Search by image
Assistant: Smart connections across products
YouTube: Video recommendations; Better thumbnails
Maps: Parsing local search
Translate: Text, graphic and speech translation
Cardboard: Smart stitching
Photos: Photos search
Editor's Notes
The AI revolution is underway, and our industry is leaning into AI. The marketer’s job is changing, and many other companies are innovating and filling gaps with AI tools and technology. We have been talking about it, but it’s arrived: a simple look at Google Trends shows that searches for AI are at an all-time high (https://trends.google.com/trends/explore?date=all&geo=US&q=AI&hl=en)
This is the science behind how Google revolutionized, and continues to revolutionize, how natural language is processed and understood by machines.
Future AI horizons: Automation, Personalization
So let me step back for a moment and talk about what it is you need to be successful with machine learning.
First is high-quality datasets. Since ML learns by example, it’s pretty self-evident that you need good examples. I used to say you needed large quantities of data, but it’s important to stress that although ML does work best when given a lot of examples, the quality of those examples is equally important.
Think back to our example of learning your native language. Now imagine you grew up never seeing a dog: you never read a book with a dog in it, your family never said the word “dog,” and you never saw one on TV. Then you go outside and suddenly see a dog for the first time. What do you think are the chances you’d spontaneously say “that’s a dog!”? Pretty slim, right?
There’s actually a much deeper topic I could talk about: the importance of picking training data, setting aside data to test with, and so on. But for now I’ll leave it at this: you need to train your ML systems with data that is representative of what they will encounter in the real world. ML systems are the ultimate example of garbage in, garbage out.
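The “setting aside data to test with” idea can be sketched in a few lines of plain Python. The data and the 80/20 split ratio here are illustrative assumptions, not anything from the talk:

```python
import random

def train_test_split(examples, test_fraction=0.2, seed=42):
    """Shuffle examples and hold out a fraction for testing."""
    rng = random.Random(seed)
    shuffled = examples[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]  # (train, test)

# Illustrative labeled examples: (feature, label) pairs.
data = [(i, "dog" if i % 2 else "cat") for i in range(100)]
train, test = train_test_split(data)
print(len(train), len(test))  # 80 20
```

The point is that the test set is held out before any learning happens, so it stands in for the “real world” data the system will encounter later.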
Second, you need a lot of compute. Your brain is an amazingly compact and powerful electro-chemical computer. The average human brain weighs about 3 pounds and consumes something like 20 W of power. To replicate that, we need systems that are much bigger and much more powerful. In fact, it’s much more compute power than most companies have available in their datacenters. Realistically, the cloud is the only way most companies can tap into that kind of computing power.
Lastly, you need good tools and frameworks. Although I could describe the basic ML algorithms to you in a few minutes, they are quite tricky to implement well, so you really can’t afford to try to build them from scratch.
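To illustrate what a framework buys you, here is a minimal sketch using scikit-learn as a stand-in (the talk doesn’t name a specific framework; the toy AND-function task is purely illustrative). Training a model is a couple of lines instead of a from-scratch implementation:

```python
from sklearn.tree import DecisionTreeClassifier

# Toy task: learn the logical AND function from four labeled examples.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]

# The framework handles the learning algorithm; we just call fit().
clf = DecisionTreeClassifier().fit(X, y)
print(list(clf.predict([[1, 1], [0, 1]])))  # [1, 0]
```

The same “fit, then predict” pattern carries over to far larger models and datasets, which is exactly why frameworks matter.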
Some of you may have heard of AlphaGo, a system built by the DeepMind team within Google to play Go. If you’re not familiar with Go, it’s an ancient game that, despite having simple rules, is incredibly complex. In fact, I’d say that Go is to chess as chess is to tic-tac-toe. There are more possible positions in Go than there are atoms in the universe.
So unlike chess, which a computer can play by simple brute force, Go requires a different approach. A few years ago, the general consensus was that we were a decade away from having a computer beat the world’s best players. Then in 2016, AlphaGo beat Lee Sedol 4-1 in a challenge match.