BERT (Google)

By Paul Brock · Updated on 24-04-2026
TL;DR

BERT (Bidirectional Encoder Representations from Transformers) is the NLP model Google introduced in 2018 and rolled into Search in 2019. It understands query context and nuance by analysing each word against the words on both sides of it.

BERT was Google's biggest algorithm update in five years when it rolled out to Search in 2019. Where earlier systems read a query left to right, one word at a time, BERT reads bidirectionally and so picks up the nuance that prepositions carry. 'Trip from Brazil to the US' and 'trip from the US to Brazil' contain identical keywords but now return different results, a distinction that older keyword matching missed.
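You can make that concrete with the open-source BERT checkpoint Google published. A minimal sketch, assuming the Hugging Face transformers and torch packages; bert-base-uncased is the public research model, not Google's production ranking system:

```python
# Minimal sketch: the same keywords in a different order produce a
# different BERT representation. Uses the public bert-base-uncased
# checkpoint; an illustration, not Google's ranking pipeline.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(query: str) -> torch.Tensor:
    """Mean-pool BERT's final hidden states into one vector per query."""
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, tokens, 768)
    return hidden.mean(dim=1).squeeze(0)

a = embed("trip from brazil to the us")
b = embed("trip from the us to brazil")

# Identical keywords, opposite direction: the similarity is high but
# below 1.0, so a downstream system can tell the two intents apart.
print(torch.cosine_similarity(a, b, dim=0).item())
```

A bag-of-keywords view scores the two queries identically; the contextual vectors do not, which is what lets the results diverge.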

Example

Pre-BERT, Google ignored the word 'without' in the query 'can I pick up medication without a prescription'. Post-BERT, it understood that 'without' flips the intent and surfaced pages about prescription requirements instead.
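That bidirectional reading is easy to see in BERT's training objective, masked-language modelling, where the model fills in a hidden word using context from both sides. A toy sketch using the public bert-base-uncased checkpoint via the Hugging Face pipeline (again an illustration of the model family, not Google's search stack):

```python
# Toy sketch: BERT predicts a masked word from context on BOTH sides.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Mask the preposition that decides the query's intent.
for pred in fill("can i pick up medication [MASK] a prescription"):
    # Each prediction carries the candidate token and its probability.
    print(f'{pred["token_str"]:>10}  {pred["score"]:.3f}')
```

Whichever candidate fills the blank, the model's representation of the whole query changes with it, which is precisely the signal that pre-BERT keyword matching threw away.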

Frequently asked questions

How do I optimise for BERT?

Not directly. BERT rewards natural, nuance-rich language: avoid keyword stuffing, write like an expert explaining the topic to a reader, and use logical connectors ('however', 'therefore', 'except'), since those are the function words BERT now understands.

Has BERT been replaced by MUM?

Google describes MUM as 1,000 times more powerful than BERT, but it is deployed selectively. BERT still runs alongside MUM and handles the bulk of everyday queries. Both belong to the family of AI systems, alongside RankBrain and neural matching, that Google uses to understand language.

Further reading

  • Our service: SEO

Need help with SEO or GEO?

We help Bitcoin, AI and fintech companies get found in Google and in AI search engines.

Book a call