Google is rolling out BERT, its biggest update to Search in nearly five years. BERT advances Google's natural language processing (NLP), helping the search engine better understand human language.
BERT stands for “Bidirectional Encoder Representations from Transformers.” Searchengineland.com says it “helps better understand the nuances and context of words in searches and better match those queries with more relevant results.”
In one example, Google said, with a search for "2019 brazil traveler to usa need a visa," the word "to" and its relationship to the other words in the query are important for understanding the meaning. Previously, Google wouldn't understand the importance of this connection and would return results about U.S. citizens traveling to Brazil. "With BERT, Search is able to grasp this nuance and know that the very common word 'to' actually matters a lot here, and we can provide a much more relevant result for this query," Google explained.
Source: "Welcome BERT: Google's latest search algorithm to better understand natural language"
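The "bidirectional" in BERT's name is the key to this example: older left-to-right models could only see the words before "to," while BERT's encoder sees the words on both sides. This toy sketch (plain Python, not Google's actual model) shows the difference in what context is available for that word:

```python
# Toy illustration only: contrasts the context visible to a
# left-to-right model vs. a bidirectional encoder like BERT's.

def unidirectional_context(tokens, i):
    """Only the words to the LEFT of position i, as in older left-to-right models."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """Words on BOTH sides of position i -- the idea behind BERT's encoder."""
    return tokens[:i] + tokens[i + 1:]

query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

print(unidirectional_context(query, i))  # ['2019', 'brazil', 'traveler']
print(bidirectional_context(query, i))   # also sees 'usa', 'need', 'a', 'visa'
```

With only the left-hand context, "to" is ambiguous about direction of travel; once "usa need a visa" is also visible, the traveler-to-USA reading becomes clear.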
Google shared several other example queries with the press:
"Parking on a hill with no curb": In the past, a query like this would confuse our systems. We placed too much importance on the word "curb" and ignored the word "no," not understanding how critical that word was to appropriately responding to this query. So we'd return results for parking on a hill with a curb!
Source: "Google Applies New BERT Model to Search Rankings, Affecting 1-in-10 Queries"
According to Search Engine Land, SEO experts likely don't need to change their current technical strategies; the emphasis remains on writing content that makes sense to humans. Encouraging natural language has been Google's goal for many years, and BERT is simply the latest step.
BERT is currently rolling out for English-language queries; support for other languages will be added later.