BERT, the new Google algorithm, has arrived

BERT has arrived: the new algorithm that interprets and contextualizes queries and the nuances behind every word searched on the web. But how does this new system work? What role does it play in everyday searches? What advantages and disadvantages does it offer? Let's find out.

What is BERT and what is it for?

BERT (an acronym for Bidirectional Encoder Representations from Transformers) is the new Google algorithm that uses Artificial Intelligence to process and interpret users' natural language, a field known as Natural Language Processing (NLP). BERT is able to understand even the most subtle nuances behind the meaning of each word.
As Pandu Nayak, Google Search's vice president, writes on Google's official blog, BERT is an important step forward in the history of Search, perhaps the biggest change of the last five years.

How does BERT work?

Artificial Intelligence attempts to imitate the functioning of the human brain. Likewise, BERT uses AI to interpret user searches and quickly surface useful information on the web.
Google's researchers ran several tests before achieving these results. They selected thousands of complete sentences, removed about 15% of the words, and put BERT to the test: the algorithm had to infer each missing word from the surrounding context.

Through this constant training, BERT learned to reconstruct the sentences as they were originally written, understanding even the smallest nuances behind the meaning of each word.
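The training procedure described above can be sketched in a few lines. This is a minimal, illustrative Python sketch (the function name and details are hypothetical, not Google's actual code): it hides roughly 15% of the words in a sentence and records which words the model would be trained to recover.

```python
import random

def mask_tokens(sentence, mask_rate=0.15, seed=0):
    """Hide ~15% of the words in a sentence, as in BERT's
    masked-word pretraining described above.

    Returns the masked sentence and the list of (position, original word)
    pairs the model would be trained to reconstruct from context.
    """
    rng = random.Random(seed)
    words = sentence.split()
    # Mask at least one word so every sentence yields a training signal.
    n_masked = max(1, round(len(words) * mask_rate))
    positions = sorted(rng.sample(range(len(words)), n_masked))
    targets = [(i, words[i]) for i in positions]
    for i in positions:
        words[i] = "[MASK]"
    return " ".join(words), targets

masked, targets = mask_tokens("the quick brown fox jumps over the lazy dog")
print(masked)   # the sentence with one word replaced by [MASK]
print(targets)  # the hidden word the model must predict
```

During real pretraining this masking is applied to token IDs over a huge corpus, and the model's only supervision signal is its ability to predict the hidden tokens, which is what forces it to learn context in both directions.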

Why is BERT important?

The introduction of the new Google algorithm is an important advance in the history of the web. In practice, BERT analyzes the words typed into the search bar and proposes results that are relevant to the user's actual query.
According to Google's researchers, this update affects about 10% of queries. Most likely, the change will alter the way you interact with the world's most popular search engine.
Pandu Nayak adds on the blog that, by applying BERT models to featured snippets and ranking, it will be possible to return results that better match the search performed. These advances cannot be attributed to the software alone, but also to the use of new hardware, namely Cloud TPUs.

How BERT works: practical example

Usually, to search for information on the web, you type a query. But who can say that the words used are actually the right ones? One can never be quite sure of the best way to formulate a question.

Therefore, Google's first task is to understand the language, even when it contains spelling or grammatical errors.
The problem arises with particularly complex or conversational queries. Some users prefer to type bare keywords, assuming the system will understand, when it would often be easier to formulate a full question.
To better understand how BERT works, Google's researchers reported a practical example on the blog. They typed into the search bar the query: "2019 brazil traveler to usa need a visa".

At this point, Google's algorithm, unlike the traditional ones, was able to recognize the importance of prepositions too. The old systems misinterpreted the phrase, returning results about US citizens traveling to Brazil rather than the other way around; only BERT was able to analyze and understand the direction of travel.
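Why did older systems get this wrong? Keyword-style matching typically discards function words such as "to", which in this query carries the direction of travel. The sketch below (hypothetical scoring, not Google's actual ranking code) shows that a bag-of-words match assigns the same score to both interpretations:

```python
def keyword_overlap(query, document):
    """Old-style matching: score by shared words,
    ignoring word order and function words ("stopwords")."""
    stopwords = {"to", "a", "the", "in", "for"}  # function words traditionally dropped
    q = set(query.lower().split()) - stopwords
    d = set(document.lower().split()) - stopwords
    return len(q & d)

query = "2019 brazil traveler to usa need a visa"
right = "visa requirements for brazil citizens traveling to usa"
wrong = "visa requirements for usa citizens traveling to brazil"

# Both documents share the same content words with the query
# ({visa, brazil, usa}), so dropping "to" loses who is traveling where.
print(keyword_overlap(query, right))  # 3
print(keyword_overlap(query, wrong))  # 3
```

A model like BERT, which reads the whole sentence in both directions, can instead use "to usa" to resolve the direction and prefer the first document.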
BERT has already been active in the United States for some time. It is currently used to interpret queries formulated in English only, but the Google team assures that the algorithm will soon arrive in other countries.

And what do you think? Is your site optimized? If you like, we can carry out an SEO audit and plan a meeting to evaluate the opportunities for developing and improving your online presence. Contact us.

