Google is bringing its research team's latest advancements in the science of language understanding, made possible by machine learning, to significantly improve how its search engine understands queries.
The company is applying BERT models to Google search.
Introduced last year, Bidirectional Encoder Representations from Transformers, or BERT, is a neural network-based technique for natural language processing (NLP) pre-training. This technology enables anyone to train their own question answering system. Google says that BERT models can consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
Some of the models that can be built with BERT are so complex that they push the limits of traditional hardware, so for the first time Google is also using its latest Cloud TPUs to serve search results and get users more relevant information quickly.
According to Pandu Nayak, vice president of Google Search, by applying BERT models to both ranking and featured snippets in Search, Google is able to do a much better job of helping users find useful information. In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and Google will bring this to more languages and locales over time.
"Particularly for longer, more conversational queries, or searches where prepositions like 'for' and 'to' matter a lot to the meaning, Search will be able to understand the context of the words in your query," Nayak said.
While Google is constantly tweaking its algorithm, BERT could affect as many as 10 percent of English-language searches, said Nayak.
“Most ranking changes the average person does not notice, other than the sneaking feeling that their searches were better,” said Nayak.
BERT, said Nayak, may be able to determine that a phrase such as “math practice books for adults” likely means the user wants to find math books that adults can use, because of the importance of the word “for.” A prior version of the search engine displayed a book result targeted for “young adults,” according to a demonstration he gave.
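The "math practice books for adults" example can be sketched as a toy comparison of what context each kind of model sees. This is a minimal, hypothetical illustration, not BERT itself: the two functions below simply show that a left-to-right model never sees the words after "for," while a bidirectional model does.

```python
# Toy sketch (not actual BERT): why bidirectional context matters.
# A traditional left-to-right language model only sees the words
# *before* a target word; a BERT-style model also sees the words after.

def left_to_right_context(tokens, i):
    """Context available to a left-to-right model at position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """Context available to a bidirectional (BERT-style) model at position i."""
    return tokens[:i] + tokens[i + 1:]

query = "math practice books for adults".split()
i = query.index("for")

print(left_to_right_context(query, i))   # ['math', 'practice', 'books']
print(bidirectional_context(query, i))   # ['math', 'practice', 'books', 'adults']
```

In the left-to-right case, the word "adults" never informs the interpretation of "for," which is consistent with the older search engine's mistake of surfacing a book aimed at "young adults."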
Google is rolling out the new algorithm to U.S. users in the coming weeks, the company said. It will later offer it to other countries.
Google search is 20 years old and is the dominant web search engine, holding about 90 percent of the market. It is not clear whether BERT would be used to improve advertising sales that are related to search terms.
Any changes to Google search have ripple effects across industries that depend on it for web traffic. Currently, Google's search results page shows more results from its own services for some queries, or pulls out blocks of text from websites into what it calls “featured snippets.”