
Google BERT Update: Understanding Search Queries More Effectively


Google announced on October 25th, 2019, that it is rolling out a new update to its search algorithm, named BERT.

BERT is an acronym for Bidirectional Encoder Representations from Transformers.

The “transformer” is the neural network architecture that BERT is built on. It weighs how each word changes the meaning of the other words in a sentence or search query.

The “encoder representations” are the encoded meanings of words in context, including the subtle concepts and nuances in natural language that Google did not previously have the ability to interpret correctly.

“Bidirectional” means Google is looking at the words both before and after each word in the query. How does the meaning of the query change when certain words are present? That surrounding context is exactly what BERT is designed to capture.
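
To make “bidirectional” concrete, here is a minimal sketch using the open-source BERT model that Google released with its research, via the Hugging Face transformers library. The library and model name are illustration choices on my part; Google’s production search stack is not public.

```python
# A minimal sketch of bidirectional context, using the open-source
# bert-base-uncased model via the Hugging Face "transformers" library.
# (Illustration only; this is not Google's production search system.)
from transformers import pipeline

# A fill-mask pipeline asks BERT to predict a hidden word using the
# words on BOTH sides of it. A left-to-right model would see only
# "The" before the blank and could not tell these sentences apart.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentences = [
    "The [MASK] barked at the mailman all morning.",
    "The [MASK] purred on the windowsill all morning.",
]

for sentence in sentences:
    top = fill_mask(sentence)[0]  # highest-scoring prediction
    print(f"{sentence} -> {top['token_str']} ({top['score']:.2f})")

# Expect something like "dog" for the first sentence and "cat" for the
# second: the words AFTER the blank decide what the blank means.
```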

What Does the BERT Update Mean?

Google has gotten better at understanding search queries over the years, but sentence structure carries a lot of subtlety and nuance. In the past, certain queries would not return a set of search results that fit your search intent. Google’s own announcement gives the example query “2019 brazil traveler to usa need a visa”: before BERT, Google ignored the word “to” and returned results about U.S. citizens traveling to Brazil. BERT was introduced to return results that better fit what people are actually looking for.
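
As a rough illustration of why a small word like “to” matters, the sketch below compares how the open-source BERT model encodes two queries built from the same words. The model choice and the mean-pooling step are my own assumptions for the demo; Google has not published how its production system represents queries.

```python
# Rough sketch: the same words in a different order produce a different
# BERT representation. Model and pooling choices are illustrative only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    # Mean-pool the final hidden states into a single query vector.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    return hidden.mean(dim=1).squeeze(0)

a = embed("2019 brazil traveler to usa need a visa")
b = embed("2019 usa traveler to brazil need a visa")

# High but not 1.0: word order and direction words shift the encoding,
# which is what lets a model like BERT tell who is traveling where.
print(f"cosine similarity: {torch.cosine_similarity(a, b, dim=0):.3f}")
```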

Google has been testing BERT for several months, and initially announced this initiative in 2018.

The Google blog says about 10% of searches will be affected by BERT, meaning 90% of search queries will not be affected.


My Biggest Takeaways From This Announcement

  • The BERT Update is about understanding conversational search queries better. Instead of forcing you to type a series of keywords to get the information you want, BERT is supposed to help Google understand searches phrased in everyday language.
  • Most queries will not be affected, as Google already has user signals telling it that it understands 90% of searches reasonably well.
  • Featured snippets will also be affected by BERT.
  • Google is rolling this update out for English-language results in the United States. It will eventually be rolled out to other locations and languages.
  • Google is using Cloud TPUs, its custom machine learning hardware, to serve the BERT models, because this kind of natural language processing requires significant computing power.
  • There is nothing you can do to optimize your content specifically for BERT. This part of the algorithm is about understanding natural language.

If you want to read a technical document on how BERT works, check out this PDF from Google researchers Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova.


John Locke is an SEO consultant from Sacramento, CA. He helps manufacturing businesses rank higher through his web agency, Lockedown Design & SEO.
