With BERT, Google has rolled out its biggest algorithm change in five years. This massive update affects around one in ten search queries.
The BERT update tackles the interpretation of long-tail search queries, allowing Google to display search results that are much more relevant.
Google has achieved this by taking advantage of natural language processing (NLP), which enables it to better understand the context of search terms via semantics. Below is a little more about what to expect from Google's newest update.
BERT was rolled out on 24 October 2019. However, it has not been applied identically across all parts of the search algorithm, or in all countries.
The update is currently live on US search results, where it affects one in ten queries. In many other countries, BERT has not yet reached organic search, though Google plans to roll it out more widely over time.
Google's public search liaison, Danny Sullivan, has said that there is no firm timeline for BERT reaching other countries.
BERT stands for 'Bidirectional Encoder Representations from Transformers,' a reference to the family of neural-network models it is built on.
Natural language processing enables machine systems to interpret the nuances of human language more accurately, and BERT uses it to better understand the context of search queries.
This means Google can interpret individual words in conjunction with the words that surround them. This advancement in language processing is built on transformers, a neural-network architecture.
Transformers enable Google to analyse words not only in isolation, but also in relation to the other words in a sentence. This is especially useful for prepositions and for understanding where words are positioned within a sentence or search query.
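The core idea behind transformers, weighing each word against every other word in the sentence, can be sketched in a few lines. The following is a minimal, illustrative scaled dot-product self-attention step in plain NumPy; it is not Google's actual implementation, and the toy word vectors are made up, but it shows how each word's representation becomes a weighted mix of all the words in the query.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over token vectors X (n_tokens x d).

    Each output row is a weighted mix of ALL rows of X, so every token's
    representation depends on the other tokens around it.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)       # pairwise token similarities
    weights = softmax(scores, axis=-1)  # attention weights; each row sums to 1
    return weights @ X                  # context-mixed token vectors

# Toy "embeddings" for a 3-token query; the values are illustrative only.
X = np.array([[1.0, 0.0],   # e.g. "traveler"
              [0.0, 1.0],   # e.g. "to"
              [1.0, 1.0]])  # e.g. "usa"
out = self_attention(X)
print(out.shape)  # one context-aware vector per token
```

Because the attention weights are computed in both directions at once, a small word like 'to' ends up represented differently depending on its neighbours, which is the behaviour the article describes.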
Google says that out of all search queries, 15% of them are entirely new. In other words, they have been searched for the first time.
The phrasing of queries has also moved closer to natural human speech over time. People increasingly type full, question-like sentences, and the rise of voice search is changing how queries are made as well.
In fact, ComScore expects voice search to make up half of all searches within the next two years. Search queries themselves are also becoming longer.
Over 70% of current searches are classified as long-tail. Many people now search Google with fully formulated questions and expect answers very quickly.
The advent of BERT means this way of searching is much more effective.
Google's work on neural networks has been going on for some years now. With that work finally approaching its intended goal, it is hoped that search queries can be answered even more accurately and interpreted in the way humans expect.
The Hummingbird algorithm was introduced in 2013. It was designed to interpret whole search queries rather than just the individual words within them.
In 2015, Google added RankBrain to the algorithm, which helped process search queries with multiple possible meanings and enabled the handling of complex queries beyond long-tail search.
RankBrain also made first-time searches, dialogue and colloquialisms manageable.
The implementation of BERT targets long-tail queries, making it possible to interpret the context of long queries whether they are typed or spoken.
Because of this, BERT can interpret a group of words or a question more accurately. Google has gone further, stating that the word 'to,' and its relationship to other words, was previously not given enough weight.
As minor as this may seem, the word 'to' plays a huge role in how sentences are interpreted. One example is a query about someone from a particular country travelling elsewhere, where the word 'to' signals the direction they are travelling.
With BERT, Google now takes this into consideration and provides results that are more relevant and reflect the searcher's true intent.
As always, Google's focus on content remains. With the new BERT algorithm, content is even more important than before.
Marketers should still concentrate on making sure content is relevant to searches. Content marketing is, and always will be, about making content valuable and relevant for anyone carrying out a search.
Meeting these criteria will help you attract and keep your target audience.
BERT's new language model means search queries are interpreted more accurately. This gives those penning content a great opportunity to serve readers with copy that is more naturally written.
The right content answers the question the searcher is asking while providing relevance and value.
The improvements Google has made to its algorithms, especially with BERT, are very impressive.
However, they have acknowledged that the understanding of natural language will always be a challenge.
Marketers should continue to adapt their strategies and think about the automation and SEO tools they use so that their traffic and content remain visible.
As search engines develop and increase in complexity, keeping up with them is unlikely to be achieved using a definitive guide or list of tricks to optimise your site for the current algorithm.
The best thing to do is to stay abreast of updates and always keep your audience in mind, regardless of how you choose to move forward.
For more marketing insights, check us out.