How have robots learned to read? This talk walks through natural language processing, how BERT works, and how those learnings can be applied to content optimisation.
Include introduction slide - where did I start with this?
There's more - much more - you can do to base content ideation on actual insights rather than just keyword research
Average rank has gone down across most of the pages we’ve optimised (this isn’t a magic bullet; using NLP doesn’t guarantee results, just like the rest of SEO)
Sentiment analysis is a bit hokey
Explain the difference between an adjective and a noun
A machine learning technique aimed at understanding intent and the relationship between the words in a query.
The blueprint for every NLP experiment since.
Used in pretty much every English-language query, plus over 70 other languages, across Google Search.
Pre-trained on English-language Wikipedia and BooksCorpus - all the text it was trained on (its corpus) was kept in document order, not shuffled.
BERT is a pre-trained model - you can use it as a base for your own NLP needs (provided you know Python!)
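As a sketch of how low the barrier is: assuming the Hugging Face transformers library (not something the talk itself covers), you can load a pre-trained BERT and ask it to fill in a masked word in one short script.

```python
# Sketch: using pre-trained BERT as a base, via the Hugging Face
# transformers library (an assumption - any BERT checkpoint would do).
from transformers import pipeline

# Downloads the bert-base-uncased checkpoint on first run.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the masked token from the context on both sides.
for prediction in fill_mask("Robots have learned to [MASK] the web."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The same pre-trained base can then be fine-tuned on your own data for classification, question answering, and so on.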
BERT is pre-trained on two tasks: masked language modelling and next sentence prediction.
Token masking: BERT was trained to predict words that had been hidden ("masked") in a sentence, based on the surrounding text it was trained on.
Bidirectional: BERT was the first NLP model that was fully bidirectional, able to use the context of the words on both sides of a masked token.
BERT was made bidirectional by its token masking process - input tokens were masked at random, and the model had to predict what the masked tokens were.
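A minimal plain-Python sketch of that masking idea (the real recipe is fancier - it works on subword tokens and sometimes swaps in a random word or leaves the token unchanged - but the core is the same):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Toy version of BERT's token masking.

    Hides a random ~15% of tokens and records which words the model
    would have to predict back from the context on BOTH sides.
    """
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, token in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels[i] = token  # training target: recover this word
        else:
            masked.append(token)
    return masked, labels

sentence = "how have robots learned to read the web".split()
masked, labels = mask_tokens(sentence)
print(" ".join(masked))
```

Because the model sees the whole masked sentence at once, nothing forces it to read left-to-right only - that's what makes the training bidirectional.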
Next sentence prediction: the model would be given a first sentence, then have to guess whether the second sentence given was IsNext or NotNext. 50% of second sentences were random sentences from the corpus, while the other half were the sentence that actually followed the first one.
This is an important process for understanding the relationship between sentences and how they work together.
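The pairing step above can be sketched as a small data-builder in plain Python (a toy illustration, not BERT's actual implementation):

```python
import random

def make_nsp_pairs(sentences, seed=0):
    """Toy next-sentence-prediction data builder.

    For each sentence, half the time pair it with the sentence that
    actually follows it (IsNext); otherwise pair it with a random
    sentence from the corpus (NotNext).
    """
    rng = random.Random(seed)
    pairs = []
    for i in range(len(sentences) - 1):
        if rng.random() < 0.5:
            pairs.append((sentences[i], sentences[i + 1], "IsNext"))
        else:
            pairs.append((sentences[i], rng.choice(sentences), "NotNext"))
    return pairs

corpus = [
    "Robots have learned to read.",
    "They did it with masked language modelling.",
    "Next sentence prediction teaches sentence relationships.",
    "Keyword research alone is not enough.",
]
for first, second, label in make_nsp_pairs(corpus):
    print(label, "|", first, "->", second)
```

This is why the corpus had to stay in document order - shuffled text would have no real "next sentence" to learn from.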
Make keywords larger