BERT

BERT (Bidirectional Encoder Representations from Transformers) is a natural language processing model that Google integrated into Search in 2019, affecting roughly 10% of queries at launch. BERT reads words in context, considering the words before and after each token simultaneously, which made Google dramatically better at understanding the nuances of natural-language queries, particularly prepositions and conversational phrasing. Unlike RankBrain, BERT is not something you can specifically "optimise for"; the best response is to write naturally for humans. The Transformer architecture underlying BERT also underpins many of the large language models used in AI search today.
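
To make the "reads words in context" point concrete, here is a minimal sketch of BERT's bidirectional contextual embeddings. It uses the Hugging Face transformers library and the publicly available bert-base-uncased checkpoint, neither of which the entry above prescribes, so treat both as illustrative assumptions. The same word ("bank") gets a different vector in each sentence because BERT attends to the words on both sides of it.

```python
# A sketch, assuming the Hugging Face transformers library and the
# bert-base-uncased checkpoint; the glossary entry itself names neither.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# The word "bank" appears in two sentences with different meanings.
sentences = ["He sat by the river bank.", "She deposited cash at the bank."]

embeddings = []
for text in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Locate the token position of "bank" and take its final hidden state.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index("bank")
    embeddings.append(outputs.last_hidden_state[0, idx])

# Because BERT considers words before and after each token, the two
# "bank" vectors differ; a static word embedding would be identical.
similarity = torch.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity.item():.3f}")
```

The printed similarity is well below 1.0, which is the whole point: context on both sides of a token changes its representation, and that is what let Google distinguish queries that earlier, word-by-word matching treated as equivalent.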