Bidirectional Encoder Representations from Transformers (BERT) understands natural language queries
Google Fellow and Vice President of Search Pandu Nayak said when announcing the BERT update in October 2019:
"We’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search."
What BERT is and does:
The BERT algorithm helps the Google search engine with natural-language processing tasks such as identifying parts of speech - importantly, understanding the significance of prepositions in a query (see below) - and answering questions.
In benchmark tests, BERT has matched or exceeded human performance on natural-language tasks such as question answering.
BERT analyzes search queries, not web pages - so it can benefit from whatever help you can give it in understanding what your pages are about, and how relevant they are to any given query.
The prepositions breakthrough:
Prior to the advent of BERT, Google had a lot of trouble understanding queries in which prepositions played key roles.
Example:
Prior to BERT, the queries "trip from Boston to Maine" and "trip to Boston from Maine" were effectively indistinguishable, so both produced search results relevant to trips in either direction - roughly half of them irrelevant to whichever query was actually asked. BERT can figure out what each of those queries means and return relevant results.
What you can do on-page to help BERT identify queries that are relevant to your pages:
- Use structured data markup (JSON-LD) to describe your content consistently, in terms BERT can understand (see the first sketch after this list).
- Put important keywords in a context consistent with the natural-language queries relevant to your page - especially when the subject of your page involves words that have multiple meanings (see the second sketch below).
Example: If you have a page about how to "fly" to where you are, use context to help BERT understand that your page is about air travel and not winged insects, wall-less tents or trouser openings.
- On pages with thin content, give BERT clues to meaning via internal links to more of your related content (see the third sketch below).
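For the first point, here is a minimal JSON-LD sketch for a hypothetical travel article (the headline, description and author are made up; the schema.org Article type and the script-tag placement follow the standard pattern Google documents for structured data):

```html
<!-- Hypothetical example: JSON-LD structured data, placed in the page's
     <head>, stating the page's topic explicitly in schema.org vocabulary. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Planning a Trip from Boston to Maine",
  "description": "Routes, timing, and costs for traveling from Boston to destinations in Maine.",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>
```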
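For the second point, here is a sketch of page copy for the "fly" example above (the business and location are hypothetical). The co-occurring terms do the disambiguating work:

```html
<!-- Hypothetical example: co-occurring terms (flight, airport, airlines,
     airfare) pin "fly" to air travel rather than insects or tents. -->
<h1>How to Fly to Our Island</h1>
<p>
  The fastest way to fly here is a direct flight into the regional
  airport. Several airlines offer daily departures, and round-trip
  airfare is usually cheapest when booked a few weeks in advance.
</p>
```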
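And for the thin-content point, a sketch of internal links to related pages (all URLs and anchor text here are hypothetical). Descriptive, natural-language anchor text gives BERT more to work with than a bare "click here":

```html
<!-- Hypothetical example: internal links with descriptive anchor text
     give a thin page topical context from related content on the site. -->
<p>
  For more detail, see our guides to
  <a href="/flights-to-the-island">finding flights to the island</a>,
  <a href="/boston-to-maine-routes">routes from Boston to Maine</a> and
  <a href="/maine-travel-costs">budgeting a Maine trip</a>.
</p>
```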
Yes, this makes more work for anyone trying to rank pages for Google organic search. But done right, it should bring you more traffic looking for what you're selling, and reduce server load from people looking for something else. Sounds good to me.