Google's BERT: Understanding Search Queries Through Context
November 14, 2019
In October 2019, Google rolled out an algorithm update called BERT, and they made an extraordinary claim about it:
With the latest advancements from our research team in the science of language understanding–made possible by machine learning–we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.
One of the biggest leaps forward in the history of search? Let’s break that down and find what it means for your search marketing.
Understanding Context
At the beginning of Stanley Kubrick’s surreal classic A Clockwork Orange, the narrator opens the story with language that’s recognizable – yet unknown:
“There was me, that is Alex, and my three droogs, that is Pete, Georgie and Dim and we sat in the Korva milkbar trying to make up our rassoodocks what to do with the evening.”
Author Anthony Burgess created words like “droogs” and “rassoodocks” to develop the dystopian reality of the story. On their own, they don’t make sense.
But because people have the ability to contextualize language based on the overall usage and situation, we can intuit the meaning of unknown words.
So you understand that a “droog” is a friend or companion because of the context (with a big assist here from the imagery). Likewise, you can work out that “rassoodocks” are minds – the narrator and his droogs are trying to make up their minds about the evening.
Up until BERT, the Google algorithm couldn’t make sense of unknown words like these. Now it’s closer to knowing what words denote by understanding their overall context – just like a person can.
Understanding Intent
The goal behind this technology is to understand the intent of a search, even when the language used in the query is imprecise or vague.
BERT, you probably assumed, is an acronym. It stands for Bidirectional Encoder Representations from Transformers.
That’s a mouthful, but the interesting part is the transformers: models that process each word in relation to all the other words in a sentence, rather than one by one in order. Because a transformer figures out a word’s meaning from how it’s used alongside every other word in the sentence, it can better understand the overall context of a query.
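To make that concrete, here is a minimal sketch of the idea behind self-attention, the mechanism transformers use. The word vectors below are made-up two-dimensional toys (real BERT uses hundreds of dimensions and learned weights); the point is only to show that each word’s new representation is a blend of every word in the sentence, in both directions at once, so the same word comes out differently in different sentences.

```python
import math

# Hypothetical toy embeddings -- NOT real BERT vectors, just an illustration.
vecs = {
    "can": [0.9, 0.1], "you": [0.2, 0.8], "get": [0.7, 0.3],
    "medicine": [0.1, 0.9], "for": [0.5, 0.5], "someone": [0.3, 0.7],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def contextualize(words):
    """One round of dot-product self-attention: every word attends to
    every word in the sentence (including itself), left AND right."""
    out = []
    for w in words:
        # Similarity of this word to every word in the sentence.
        weights = softmax([dot(vecs[w], vecs[o]) for o in words])
        # Its new vector is a weighted blend of ALL the words' vectors,
        # so it now carries information about its context.
        blended = [
            sum(wt * vecs[o][d] for wt, o in zip(weights, words))
            for d in range(2)
        ]
        out.append(blended)
    return out

sentence = ["can", "you", "get", "medicine", "for", "someone"]
ctx = contextualize(sentence)

# The representation of "for" depends on what surrounds it: the same
# word gets a different vector in a different sentence.
print(contextualize(["for", "someone"])[0])
print(contextualize(["for", "you"])[0])
```

Pre-BERT keyword matching treated each query term in isolation; this sketch shows why a contextualized model can tell that “for someone” changes the meaning of the whole query.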
Google provides the example query “can you get medicine for someone pharmacy” to show how this affects search results. Before BERT, the algorithm couldn’t contextualize what “for someone” meant in this phrase. Now it interprets the intent of the search more accurately.
Google also hopes that, in time, people will move away from what they call “keyword-ese”: abbreviated queries typed in because people think the search engine will understand them better. Instead, people will be able to ask Google to search for something as if they were speaking to a person.
What Does BERT Mean For My SEO Strategy?
The biggest shift you need to make with BERT is to think of your keyword targets more as concepts and less as individual sets of words.
For example, Marketing 360® currently ranks in position 2 for three keyword variations:
In the past, we would have considered optimizing different pages for “how to market” vs. “marketing ideas,” and we would have optimized each page for other semantically related terms as well.
With BERT, we see that Google understands that “how to market a chiropractic practice” and “chiropractic marketing ideas” are queries with essentially identical intent: seven of the top 10 results appear on both SERPs.
If you tend to make an overt effort to optimize for “keyword-ese” phrases, cease those efforts. Now you’re better off using natural language in your keyword targets.
Google will also make more of the semantic connections, so you don’t have to force phrase variations into your content.
Ask:
What are your clients’ questions?
How will they ask them with natural speech?
What’s the greater contextual meaning of the topic?
Answer those questions when you decide on topics to target for SEO, then create content that provides excellent value for people.
By the way, if you still don’t get what a droog is, ask Google. BERT knows.