Google says this is its biggest search algorithm change since RankBrain and Hummingbird around five years ago. BERT stands for Bidirectional Encoder Representations from Transformers, so it's no wonder they shortened it!
BERT is a large, deep-learning language model that helps Google's search algorithm better understand natural language.
Below we're going to look at an example of what's changed and why it's so important to cover your bases across your website: your news and blog pages, your core content, and even 'light-content' pages such as product pages.
Let's take the search term 'parking on a hill with no curb':
Here is an example of Google showing a more relevant featured snippet for the query "parking on a hill with no curb". In the past, a query like this would confuse Google's systems. Google said: "We placed too much importance on the word 'curb' and ignored the word 'no', not understanding how critical that word was to appropriately responding to this query. So we'd return results for parking on a hill with a curb." (Source: https://bit.ly/337SkjD)
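To make that concrete, here's a toy sketch (not Google's actual ranking code; the query, result snippets, and stopword list are all invented for illustration) of why a naive keyword matcher that throws away small words like 'no' can't tell the two kinds of result apart, while keeping the negation word breaks the tie:

```python
# Toy illustration: scoring results by keyword overlap with a query.
# A matcher that discards "no" as a stopword ties on both results;
# keeping negation words lets the "no curb" result win.

QUERY = "parking on a hill with no curb"

RESULTS = {
    "with_curb": "how to park on a hill with a curb",
    "no_curb": "how to park on a hill when there is no curb",
}

# A small, made-up stopword list that (like older systems) includes "no".
STOPWORDS = {"on", "a", "with", "no", "the", "when", "there", "is", "to", "how"}

def keyword_overlap(query, doc, keep_negation=False):
    """Count keywords shared by query and document after stopword removal.

    With keep_negation=True, 'no'/'not' survive stopword removal, so the
    query's negation can match (or fail to match) the document.
    """
    stop = STOPWORDS - {"no", "not"} if keep_negation else STOPWORDS
    query_words = {w for w in query.split() if w not in stop}
    doc_words = {w for w in doc.split() if w not in stop}
    return len(query_words & doc_words)

# Negation discarded: both results score 2 ({hill, curb}) - a tie,
# so the matcher has no reason to prefer the "no curb" page.
print(keyword_overlap(QUERY, RESULTS["with_curb"]))          # 2
print(keyword_overlap(QUERY, RESULTS["no_curb"]))            # 2

# Negation kept: the "no curb" result now scores 3 ({hill, no, curb})
# and wins, matching the intent of the query.
print(keyword_overlap(QUERY, RESULTS["no_curb"], keep_negation=True))  # 3
```

BERT goes far beyond word matching, of course; the point is simply that a word as small as "no" can flip which result best answers the query.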
Another way of looking at it is in terms of sales: you would never describe a feature of a service without following up with its benefit. 'We can design and create a website for you' is merely a feature; it gives the consumer no context. Add 'that will start to convert more visitors into clients' and you have both a feature and a benefit, which provides context. BERT acts similarly: it needs context around a query.
BERT is simply the next step in handling conversational queries, and it doesn't look like the evolution will stop there!
Are you BERT-ready? If you need some advice, contact us today: firstname.lastname@example.org