What can BERT actually achieve without Ernie? We checked it out.
BERT is old news by now, but we still get questions about what was actually so "big" about it and what "we SEOs" should do differently. We read through various articles, watched a webinar, and summarized what the Google update was all about.
What is BERT and what does it do?
BERT is considered the largest update of the Google algorithm in five years. BERT stands for “Bidirectional Encoder Representations from Transformers” and is an artificial neural network for NLP (Natural Language Processing) or NLU (Natural Language Understanding). This means that with the help of BERT, Google wants to understand its users’ search queries better in order to be able to deliver more suitable results.
So far, BERT has only been used for an estimated 10% of search queries (a figure that applies to the English-speaking market). It is mainly used in the long-tail area, because that is where the greatest potential for improvement lies. Before its launch, BERT was trained on over three billion words; this foundation is why only a little additional text is needed to make it well-versed in specific niches. Best of all: as a neural network, BERT is able to keep learning on its own.
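The "bidirectional" in BERT's name is the key idea: the model reads the context on both sides of a word at once. As a heavily simplified toy sketch of our own (not Google's actual model, and no real neural network involved), the following shows why a masked word can often only be resolved by looking at words to its left *and* right:

```python
# Toy illustration of bidirectional context (NOT the real BERT model):
# the masked word is guessed from hint words that may appear on either
# side of the gap. Candidate words and hints are invented for this demo.
def guess_masked_word(tokens, mask_index, context_hints):
    """Pick the candidate whose hint words appear anywhere in the
    sentence, i.e. to the left OR right of the mask."""
    left = tokens[:mask_index]
    right = tokens[mask_index + 1:]
    context = set(left) | set(right)  # bidirectional context
    for candidate, hints in context_hints.items():
        if context & hints:
            return candidate
    return None

hints = {
    "money": {"deposited", "account"},
    "boat": {"rowed", "river"},
}

s1 = "she deposited the [MASK] in her account".split()
s2 = "he rowed the [MASK] down the river".split()

print(guess_masked_word(s1, s1.index("[MASK]"), hints))  # money
print(guess_masked_word(s2, s2.index("[MASK]"), hints))  # boat
```

A purely left-to-right model that has only seen "she deposited the" or "he rowed the" has far less to go on; using both directions is what lets the real BERT disambiguate so effectively.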
In its work, BERT combines several processing steps that previously required separate algorithms. Its transformers recognize, for example, coreferences between words, homographs and homophones.
A brief explanation of terms to better understand the following considerations
- Coreferences: relationships between words within a sentence. An example: "The woman falls over the cat because she is drunk." The "she" refers to the woman, not the cat.
- Homographs: words with identical spelling but different meanings. An example from the original German: "Bank" (bench) vs. "Bank" (financial institution).
- Homophones: words that sound the same but have different meanings (especially relevant for voice input, of course). A small example from the original German: "mahlen" (to grind) vs. "malen" (to paint).
- Co-occurrences: in the following, this mainly refers to word pairs that appear together and can be treated as a unit, for example idioms, but also word pairs that logically belong together, such as "if" and "then".
BERT makes correct distinctions by taking the context of words into account. Grammar, recurring co-occurrences of text units, and semantic relationships between words, for example, are analyzed and processed more precisely.
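To make the co-occurrence idea concrete, here is a minimal sketch of our own (a simple pair counter, nothing like Google's actual pipeline): counting which words repeatedly appear together in a corpus already surfaces units like "if … then":

```python
# Toy co-occurrence counter: counts unordered word pairs that appear
# in the same sentence. The mini "corpus" below is invented for the demo.
from collections import Counter
from itertools import combinations

def cooccurrences(sentences):
    """Count, for each unordered word pair, in how many sentences
    both words appear together."""
    counts = Counter()
    for sentence in sentences:
        words = sorted(set(sentence.lower().split()))
        for a, b in combinations(words, 2):
            counts[(a, b)] += 1
    return counts

corpus = [
    "if it rains then we stay home",
    "if you ask then you will learn",
    "we stay home on sundays",
]
counts = cooccurrences(corpus)
print(counts[("if", "then")])  # 2 -> "if" and "then" recur as a pair
```

Pairs with unusually high counts relative to their individual word frequencies are candidates for fixed expressions; that statistical signal is one ingredient in the richer context analysis described above.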
In addition, BERT can and will increasingly answer questions directed at Google itself. BERT can recognize entities such as people, places, brands or other topics and integrate them into a knowledge graph, where pieces of information from across the web relating to the same entity are linked; since BERT is multilingual, this also applies to information in different languages. This knowledge graph enables BERT to meet complex questions with suitable answers. Moreover, BERT is not limited to answering in keywords: it can also generate entire texts, since its transformers can determine plausible word and sentence sequences with the help of probability calculations. According to one study, BERT allegedly already gives more precise answers than humans.
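To illustrate the knowledge-graph idea in miniature (a heavily simplified sketch of our own; the stored relations are chosen for the demo, not taken from Google's actual graph), linking facts to entities lets a system answer questions with a simple lookup:

```python
# Minimal knowledge-graph sketch: entities as nodes, facts as labeled
# edges. The structure and relation names are invented for illustration.
knowledge_graph = {
    "BERT": {
        "is_a": "language model",
        "developed_by": "Google",
        "introduced_in": "2018",
    },
    "Google": {
        "is_a": "company",
        "headquartered_in": "Mountain View",
    },
}

def answer(entity, relation):
    """Look up a fact about an entity, or admit we don't know."""
    return knowledge_graph.get(entity, {}).get(relation, "unknown")

print(answer("BERT", "developed_by"))        # Google
print(answer("Google", "headquartered_in"))  # Mountain View
```

The real system is of course vastly larger and links information across languages, but the principle is the same: once mentions in different sources are resolved to the same entity, their facts can be combined to answer complex questions.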
What does this mean for SEO?
So far, the effects of the BERT update have hardly been noticeable. Due to the improved understanding of search queries, some pages do receive less traffic, but this is mostly traffic that was misdirected anyway; what remains is the more relevant traffic. Penalties in the form of ranking losses are not to be feared, at least with "good" content. So you should continue to rely on relevant, high-quality, target-group-oriented content, because optimizing specifically for BERT is not possible.
What you should still consider:
- Word co-occurrences are becoming more important; this can also produce algorithmic biases that one may be able to exploit for one's own purposes.
- Content that feeds Google's featured snippets will be rewarded
- FAQ pages must not degenerate into advertising copy
- Google is answering more and more questions itself
- There is no longer any SERP domination
Conclusion: The SEO world keeps moving, and so do we.