An Overview of BERT

Throughout this blog post, I will be going over what BERT is and how it works, as well as explaining the benefits of BERT.

What is BERT?

BERT (Bidirectional Encoder Representations from Transformers) is a natural language processing framework developed by Google to improve its search engine’s understanding of search queries. Simply put, its aim is to help Google better understand the context behind what people type.

For instance, before BERT, someone searching ‘maths practice books for adults’ might have seen results for kids’ maths books instead. Post-BERT, Google should understand that the user is searching for adult maths books and no longer serve results for children’s books.

The reason why BERT is more of a framework than a model is that it offers a pre-trained basis for those working in machine learning, allowing them to refine it for whatever tasks they need, as sketched below.
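
To make that concrete, the snippet below is a minimal sketch of what that refining (fine-tuning) looks like in practice. It uses the open-source Hugging Face transformers library rather than anything Google-specific, and the two-label sentiment task, example sentences, and labels are all illustrative assumptions.

```python
# A minimal sketch of fine-tuning BERT for a two-label task using the
# open-source Hugging Face "transformers" library. The task, sentences,
# and labels are invented for illustration.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. positive / negative
)

# A toy batch: in reality you would loop over a full labelled dataset.
texts = ["a brilliant little book", "a waste of paper"]
labels = torch.tensor([1, 0])
inputs = tokenizer(texts, padding=True, return_tensors="pt")

# One training step on top of the pre-trained weights: this is the
# "refining" that the framework makes possible.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
```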

What is Natural Language Processing?

Natural language processing (NLP) is the branch of artificial intelligence that allows computers to understand how we naturally communicate. But don’t be confused: NLP itself is not new to search engines; BERT is simply a marked improvement in the field.

Examples of NLP include machine translation, predictive typing, and spam-finding tools, to name just a few; a toy version of the last one is sketched below.
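
By way of illustration, here is a tiny spam-finding tool built with the scikit-learn library, using a simple word-count classifier (far simpler than BERT). The training messages and labels are made up for demonstration.

```python
# A toy illustration of one NLP task from the list above: spam
# detection. Uses scikit-learn; the training messages are made up.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = ["win a free prize now", "claim your free cash",
            "lunch at noon tomorrow?", "minutes from today's meeting"]
labels = ["spam", "spam", "ham", "ham"]

# Count word occurrences, then fit a naive Bayes classifier on them.
classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(messages, labels)

print(classifier.predict(["free prize inside"]))  # -> ['spam']
```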

How BERT Works

Using its transformer encoder, BERT can look at data (a search query, page content, etc.) and learn the contextual relationships between words. The reason this is an improvement over previous methods is that BERT is bidirectional: text is not read left to right or right to left, but all at once by the transformer encoder, so technically it could be said that BERT is in fact non-directional. This method allows Google to learn the context of a word by analysing the words surrounding it, as the sketch below shows.
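
To see that surrounding-word context in action, here is a small sketch using the Hugging Face transformers fill-mask pipeline, which runs a pre-trained BERT model; the example sentence is our own, not one of Google’s.

```python
# A small demonstration of BERT's bidirectional context, using the
# Hugging Face "transformers" fill-mask pipeline with a pre-trained
# BERT model. The example sentence is invented for illustration.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the masked word from the words on BOTH sides of it,
# so "deposited" and "cheque" steer it towards the financial sense.
for prediction in fill_mask("She deposited the cheque at the [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Change the surrounding words (say, ‘She pitched the tent by the [MASK].’) and the top predictions shift accordingly, which is exactly the context-from-surrounding-words behaviour described above.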

Importance of BERT

BERT is important because it aims to ensure that the intent behind a user’s query is fully understood, so that the most relevant results can be returned. Because of this, people can find the information they’re looking for more quickly, saving time for everyone.

For example, the framework also affects which featured snippets are served when a person searches for an answer.

Should I Care About BERT?

Currently, BERT has only been applied to US SERPs and is said to affect around 10% of all queries. For anyone in marketing wondering whether any changes or preparation are needed: BERT is only being put in place to help Google understand people’s natural language in search queries and content, so there is no actual preparation or optimisation that can be done.

By comparison, around the time BERT was being rolled out in the US, many people in the SEO industry noted that its impact was not as significant as a core update or previous updates such as Penguin. That, however, is purely from an SEO perspective.

Thank you for reading this blog post. Hopefully, you are now more knowledgeable and up to date with BERT, and are excited for it to be rolled out in the UK. If you are looking for guidance and expertise in the marketing industry, Bright Design is here to help. Visit our contact page or call us today on 01604 806020.

Amir Al-Azzawe

I joined Bright Design at the start of 2019 as an SEO Executive to aid with the optimisation of our clients’ sites, making sure they rank as highly as possible. Prior to this role, I studied A Levels in Engineering, Computing, and IT at Silverstone while teaching myself to build desktop/iOS applications and doing some freelance web development work in my spare time.