If I’ve learned one thing about working with Google Search in the past 20 years, it is that Google likes making changes to its algorithm. The most recent change, named BERT, is one of the most impactful updates to the algorithm in the past five years. Google rolled out this update yesterday (Friday, October 25, 2019).
What is BERT?
BERT stands for Bidirectional Encoder Representations from Transformers. Yeah, I can’t make that up.
When people like you or me arrive at Google.com to perform a search, we aren’t always quite sure about the best way to formulate a query. We might not know the right words to use, or how to spell them, because many times we come to Google looking to learn.
With the new BERT update, Google can now consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
Here is an example where BERT helps Google grasp the subtle nuances of language that computers don’t quite understand the way humans do. Take the query “can you get medicine for someone pharmacy,” one of the examples Google shared with the announcement.
With the BERT model, Google can far better understand that “for someone” is an important part of the query, whereas previously, Google missed that meaning and displayed more general results about filling prescriptions.
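To see why “bidirectional” matters, here is a toy sketch (purely illustrative, not Google’s actual implementation): a left-to-right model deciding what “for” means in that query can only see the words before it, while a bidirectional model also sees “someone” and “pharmacy” to the right.

```python
def context(tokens, i, window=2, bidirectional=True):
    """Return the context words visible for the token at index i.

    A left-to-right model sees only the preceding words; a
    bidirectional model (like BERT) sees words on both sides.
    This is a toy illustration of the idea, not how BERT works internally.
    """
    left = tokens[max(0, i - window):i]
    right = tokens[i + 1:i + 1 + window] if bidirectional else []
    return left + right

query = "can you get medicine for someone pharmacy".split()
i = query.index("for")

print(context(query, i, bidirectional=False))
# ['get', 'medicine'] — "for" looks like part of "medicine for ___"
print(context(query, i, bidirectional=True))
# ['get', 'medicine', 'someone', 'pharmacy'] — now "for someone" is visible
```

With only the left context, “for someone” never enters the picture; with both sides visible, the model has a chance to treat it as the point of the query.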
Should You Be Worried About This?
Not if you have been producing conversational, meaningful content that speaks to the end user. If you are simply writing content for its own sake and focusing only on keywords, you may run the risk of losing rankings in Google search results.
The truth of the matter is that only time will tell; none of us knows exactly how this will impact results until it plays out. But it stands to reason that Google is unlikely to penalize you if you are writing with search intent in mind and writing for the human on the other end of the screen.
The real question is, what do you think ERNIE would have to say about all of this?
If you are interested in learning more, please read the articles below: