Hello, My Name is: Bert

Two weeks ago, Google went live with their latest machine-learning algorithm update, Bidirectional Encoder Representations from Transformers (BERT). Following in the footsteps of its predecessors Penguin, Panda, Hummingbird and RankBrain, BERT has been implemented to better understand language and improve our search results. Touted as the most significant leap forward over the past half-decade, BERT is expected to impact 10% of all searches, with significant influence on Featured Snippets and long-tail search queries.

What Makes BERT So Special?

So, BERT aims to improve search results through a deeper understanding of language; isn’t that what we expected? In a word, yes. But what makes BERT so groundbreaking is that it’s the first deeply bidirectional language representation – combing through not only the words leading up to a given word but also the words that follow it. The model is also transformer-based, allowing it to apply the context behind each word used. This contextual understanding will improve Google’s ability to match answers to longer search queries that are natural and conversational in nature – a significant factor as speech-to-text becomes more and more widely used.
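The bidirectional idea can be sketched in a few lines of Python. This is a toy illustration, not Google’s actual model: the function names are made up, and the point is simply that an older left-to-right model only “sees” the words before a given word, while a bidirectional model like BERT attends to the words on both sides.

```python
def left_context(tokens, i):
    """What a left-to-right (unidirectional) model can see at position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """What a bidirectional model like BERT can attend to: both sides."""
    return tokens[:i] + tokens[i + 1:]

# An example long-tail, conversational query.
query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

print(left_context(query, i))           # only the words before "to"
print(bidirectional_context(query, i))  # every other word in the query
```

For a preposition like “to”, the words that follow it completely change its meaning – which is exactly the context a unidirectional model misses.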

Improving Upon Voice Search

In July of this year, SEO platform Chatmeter reported that Google Assistant devices deliver voice search results with a staggering 94% accuracy, beating out Alexa and Siri by 10% and 12%, respectively (Chatmeter 2019). With screenless web browsing estimated to represent 30% of all organic searches by 2020, it’s becoming more and more important that AI is able to appropriately match these queries with results. Additionally, 20% of all voice searches are triggered by just 25 keywords – searches including words like “how” or “what”, or adjectives like “best” or “easy”. With one of BERT’s primary directives being to better understand the more specific, long-tail queries we often see in voice search, it’s clear that optimizing organic search results for voice assistants is one of Google’s fastest-growing priorities.

How to Optimize for BERT

Whenever Google rolls out an update to the AI that controls the Search Network, the first reaction is: “how does this apply to me, and what should I do about it?” BERT itself is designed to get more granular in answering the questions users pose, matching results to more specific queries. While one might assume there are clear and direct action items for SEO professionals immediately following this unveiling, Google is adamant this isn’t the case, saying, “There’s nothing to optimize for with BERT, nor anything for anyone to be rethinking” (Danny Sullivan, Google). Rather, the fundamentals of rewarding great content remain the same. Although this presents a renewed opportunity to get more granular with one’s content, the approach – tailoring quality, relevant content to targeted readers – remains unchanged.


At first glance, some may be concerned that the recent BERT installment could equate to lost traffic – and they wouldn’t be wrong. As Google continues their quest to match search queries with the most relevant results, we’re likely to miss out on some of the ancillary, yet less relevant, clicks. While this might mean reduced traffic in the short term, quantity is only half of the equation as driving quality traffic is the real name of the game when it comes to SEO.