Search

If there’s one thing I’ve learned over the 15 years working on Google Search, it’s that people’s curiosity is endless. We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before--so we’ve built ways to return results for queries we can’t anticipate.

When people like you or me come to Search, we aren’t always quite sure about the best way to formulate a query. We might not know the right words to use, or how to spell something, because oftentimes we come to Search looking to learn--we don’t necessarily have the knowledge to begin with.

At its core, Search is about understanding language. It’s our job to figure out what you’re searching for and surface helpful information from the web, no matter how you spell or combine the words in your query. While we’ve continued to improve our language understanding capabilities over the years, we sometimes still don’t quite get it right, particularly with complex or conversational queries. In fact, that’s one of the reasons why people often use “keyword-ese,” typing strings of words that they think we’ll understand, but aren’t actually how they’d naturally ask a question.

With the latest advancements from our research team in the science of language understanding--made possible by machine learning--we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.

Applying BERT models to Search

Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it--BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.

This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
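The core idea--every word attends to every other word, both before and after it--can be sketched with a toy scaled dot-product self-attention in plain Python. This is purely illustrative: real transformers use learned query/key/value projections and many attention heads, none of which appear here.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(vectors):
    """Each output is a weighted mix of ALL input vectors,
    so position i sees words both before and after it."""
    d = len(vectors[0])
    outputs = []
    for q in vectors:
        # similarity of this word's vector to every word's vector
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        weights = softmax(scores)
        # weighted sum over all positions: bidirectional context
        outputs.append([sum(w * v[j] for w, v in zip(weights, vectors))
                        for j in range(d)])
    return outputs

# toy 2-d "embeddings" for a 3-word sentence
sent = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = self_attention(sent)
```

Each row of `out` blends information from the whole sentence, which is what lets a model like BERT interpret a word such as “to” differently depending on everything around it.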

But it’s not just advancements in software that can make this possible: we needed new hardware too. Some of the models we can build with BERT are so complex that they push the limits of what we can do using traditional hardware, so for the first time we’re using the latest Cloud TPUs to serve search results and get you more relevant information quickly.

Cracking your queries

So that’s a lot of technical details, but what does it all mean for you? Well, by applying BERT models to both ranking and featured snippets in Search, we’re able to do a much better job helping you find useful information. In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.
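Mechanically, “applying a model to ranking” means scoring each (query, passage) pair and sorting by that score. The sketch below uses a trivial word-overlap score as a stand-in for the model--nothing here is BERT, and the scoring function is a deliberate simplification:

```python
def score(query, passage):
    # stand-in relevance score: fraction of query words found in the passage
    q = set(query.lower().split())
    p = set(passage.lower().split())
    return len(q & p) / len(q)

def rank(query, passages):
    # a ranking model scores each (query, passage) pair, then sorts by score
    return sorted(passages, key=lambda p: score(query, p), reverse=True)

ranked = rank("2019 brazil traveler to usa need a visa",
              ["brazil travelers need a visa to enter the usa",
               "best time of year to visit brazil"])
```

Swapping the overlap score for a learned model that reads the whole query in context is, at this level of abstraction, the shape of the change described above.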

Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.

To launch these improvements, we did a lot of testing to ensure that the changes actually are more helpful. Here are some examples from our evaluation process that demonstrate BERT’s ability to understand the intent behind your search.

Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn't understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.
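A small comparison shows why “keyword-ese” matching struggles with exactly this query: a bag-of-words view scores “brazil traveler to usa” and “usa traveler to brazil” as identical, while even a crude order-aware view (word bigrams here, standing in for the far richer contextual modeling BERT does) tells the two directions of travel apart.

```python
def bag_of_words(text):
    # word order is discarded: only which words occur matters
    return sorted(text.split())

def bigrams(text):
    # adjacent word pairs preserve a little of the word order
    words = text.split()
    return [(a, b) for a, b in zip(words, words[1:])]

q1 = "brazil traveler to usa"
q2 = "usa traveler to brazil"

# word counts alone cannot tell who is traveling where
print(bag_of_words(q1) == bag_of_words(q2))   # → True

# order-aware features distinguish the two meanings
print(bigrams(q1) == bigrams(q2))             # → False
```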