OCTOBER 25, 2019
On October 25th, Google announced that it had begun rolling out its BERT (Bidirectional Encoder Representations from Transformers) algorithm. Google says BERT affects 10% of all queries and calls it the search engine’s “biggest leap forward in the past five years.”
The algorithm grew out of an open-source research project aimed at using neural networks to advance contextual understanding of content via natural language processing (NLP).
In simple terms, BERT is meant to help Google interpret a query by understanding the context of the phrasing used. Because BERT analyzes the entire phrase at once, it can interpret each word in light of all the other words around it. This stands in contrast to models that read language left-to-right, which tie a word’s interpretation only to the words that precede it.
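The bidirectional-versus-left-to-right distinction comes down to the attention mask inside the Transformer. A minimal toy sketch (not Google’s implementation; the query is the “brazil traveler” example Google cited in its announcement) shows which context each style of model can see for a given word:

```python
# Toy illustration of attention masks, not Google's actual code.
# In a left-to-right (causal) model, each token may only attend to tokens at
# or before its own position; in a BERT-style bidirectional encoder, every
# token attends to the whole phrase at once.

tokens = ["2019", "brazil", "traveler", "to", "usa", "need", "a", "visa"]
n = len(tokens)

# Causal mask: row i can see only positions 0..i (left-to-right reading).
causal_mask = [[j <= i for j in range(n)] for i in range(n)]

# Bidirectional mask: every position sees every other position.
bidirectional_mask = [[True] * n for _ in range(n)]

def visible(mask, i):
    """Return the tokens position i is allowed to use as context."""
    return [tokens[j] for j in range(n) if mask[i][j]]

i = tokens.index("to")
print("causal context for 'to':       ", visible(causal_mask, i))
# → ['2019', 'brazil', 'traveler', 'to']
print("bidirectional context for 'to':", visible(bidirectional_mask, i))
# → all eight tokens
```

With the causal mask, the preposition “to” is interpreted without ever seeing “usa”, so the direction of travel is invisible to the model; with the bidirectional mask, the full phrase is available and the word can be disambiguated by everything around it.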
Practically speaking, BERT helps Google better interpret prepositions within a search query and disambiguate words with multiple meanings by using their surrounding context.
Note: so far, BERT’s rollout has not produced large waves of ranking fluctuation.