Dear Doctor Digital. I hear BERT is now working for Google. Is this the end of Sesame Street?


Doctor Digital Says:

No, no, not that Bert – he is still back on Sesame Street with Ernie, co-ordinating skivvies and living his best life. Google’s newest algorithmic update is also called BERT, and this cleverly-named update helps Google better understand natural language, particularly in conversational search – which is how most of us phrase our searches. It also matters for the rise of voice search, which is going to overtake written searching as machines like Siri and Alexa become our new best friends.

“Hey Siri, what does BERT stand for?”

BERT stands for Bidirectional Encoder Representations from Transformers, and Google estimates it will impact around 10% of current search queries. It will also affect organic rankings and featured snippets, so this is no small change and is likely to impact you and your customers.

“Hey Alexa, what does it all mean?”

Google, in essence, has adjusted its algorithm to better handle natural language. Here’s the good news: there is absolutely no reason to worry about BERT. If you create natural copy – accessible, clear, simple language written to be conversational – then you have a big reason to celebrate. If you have been a regular reader of the Digital Ready website, you will know that for some time now Google has been moving towards preferencing language like this over reams of keyword-stuffed text.

The BERT update aims to do one thing and one thing only: make it easier for users to search Google naturally, and receive more relevant results for those searches. If you want to understand what BERT is really about, one word sums up the heart of this update: context. And context is so, so, so important in everything we do and say. BERT allows Google to understand a search query as a whole rather than as a string of separate words.

People often type long strings of words into Google when searching for something. Prior to BERT, Google’s AI normally interpreted each of these words individually. Now, Google does a better job understanding the words as they relate to each other.

Here’s a great example from Google’s official blog on BERT. Let’s say you are considering working as an esthetician but are worried about how long you might be on your feet.

You go to Google and type in “Do estheticians stand a lot at work.”

“Stand” can have a lot of meanings. As literate humans, we know that in the example’s context, the searcher means “stand on one’s feet.”

Before BERT, Google didn’t understand this. It matched the word “stand” with “stand-alone,” which obviously has nothing to do with what the searcher is looking for. Now, thanks to BERT, Google picks up the natural context of “stand” and the search results are much better.
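If you like to tinker, here is a tiny Python sketch of that idea. It is emphatically not how BERT actually works (BERT is a neural network, not a word counter) – it only shows why matching words one at a time can pick the wrong page, while keeping each word with its neighbours rescues the meaning of “stand.” The query, page texts and function names are all invented for the illustration.

```python
# Toy sketch only – NOT how BERT works. It illustrates why scoring
# words in isolation can mis-rank pages, while keeping each word with
# its neighbours recovers the right sense of "stand".
# The two page texts below are made up for this example.

query = "do estheticians stand a lot at work"

pages = {
    "job-demands": "estheticians stand on their feet for much of the day",
    "stand-alone": "a stand alone booth for estheticians to work at",
}

def words(text):
    return text.split()

def word_score(query, page):
    """Count shared words, each word taken in isolation (pre-BERT style)."""
    return len(set(words(query)) & set(words(page)))

def bigram_score(query, page):
    """Count shared word *pairs*, so 'stand' keeps its neighbours."""
    def bigrams(text):
        ws = words(text)
        return set(zip(ws, ws[1:]))
    return len(bigrams(query) & bigrams(page))

# Word-by-word matching prefers the irrelevant "stand alone" page...
print({name: word_score(query, text) for name, text in pages.items()})
# ...while pair-wise matching prefers the page about standing at work.
print({name: bigram_score(query, text) for name, text in pages.items()})
```

Counting shared word pairs is a crude stand-in for what BERT does with its neural attention machinery, but the moral is the same: words only mean something in the company they keep.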

For the word nerds, you might enjoy this little insight: BERT is helping Google better understand prepositions like “for” and “to” – arguably the most nuanced and confusing parts of English.

That’s pretty much it. If there’s one thing that you can take away from the BERT update it’s this:

Google WANTS you to write natural, user-friendly copy. It wants you to write with the reader, not Google, in mind.

Since writing content that shows up in search basically means matching your copy to the way people search, write naturally, especially when aiming for longer, more conversational keywords and phrases. Just keep writing in a way that makes it easy for your customers to find you and the information on your website, and BERT will have no trouble understanding your pages.