For more than a decade, Google has consistently invested in machine learning technologies through its research and development (R&D) divisions. One of the primary intended applications of machine learning for Google has always been Natural Language Processing (NLP).
Following years of research and refinement, and thousands of published papers, Google’s efforts paid off with two NLP frameworks: BERT, which rolled out to Search in 2019, and SMITH, introduced in a 2020 research paper.
What is NLP?
Natural Language Processing (NLP) is the ability of a particular software — Google’s search algorithms in this case — to capture the meaning of words from speech and text.
Natural language processing, in its many forms, has been around for more than 50 years and has mirrored the developments in both computing power and language analysis algorithms.
Google’s NLP algorithms were designed with the sole purpose of helping it better understand and process queries on its search engine as humans would. Elements of language like context, tone, phrasing and specificity can be better processed using NLP frameworks.
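To make the idea concrete, here is a toy sketch, in no way Google's actual implementation, of why context matters: a naive keyword matcher treats a word like "bank" identically everywhere, while even a crude look at surrounding words can separate its senses. The clue words and sentences below are invented for illustration.

```python
# Toy illustration of context-sensitive word interpretation.
# This is NOT Google's algorithm -- just a sketch of the idea that
# surrounding words change what a query term means.

CONTEXT_CLUES = {
    "bank": {
        "financial institution": {"account", "loan", "deposit", "interest"},
        "river edge": {"river", "water", "fishing", "shore"},
    }
}

def interpret(word: str, sentence: str) -> str:
    """Pick the most likely sense of `word` from the words around it."""
    tokens = set(sentence.lower().split())
    senses = CONTEXT_CLUES.get(word, {})
    best_sense, best_overlap = "unknown", 0
    for sense, clues in senses.items():
        overlap = len(tokens & clues)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(interpret("bank", "how to open a bank account and get a loan"))
# financial institution
print(interpret("bank", "fishing spots along the river bank"))
# river edge
```

Real contextual models like BERT learn these relationships from data rather than hand-written clue lists, but the payoff is the same: the query's meaning comes from the whole phrase, not from isolated words.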
The BERT and SMITH Effect
Chronologically, it might seem like SMITH is a wholesale replacement for its predecessor, BERT. In reality, SMITH is a refinement that shares the same fundamental underpinnings as BERT and improves on a few crucial aspects.
As early as 2011, Google was already collaborating with researchers on NLP. At the time, the search giant itself wasn’t sure whether its NLP studies would yield concrete benefits. They did: Google rolled BERT into Search in 2019, estimating that it would improve roughly 1 in 10 search queries. The algorithm was equipped to point search users toward answers and websites better suited to their queries.
BERT analyzes the meaning of individual words along with how they fit into the context of a sentence. SMITH takes this capability a step further: where BERT works best on sentences and short passages, SMITH is designed to understand entire documents, enabling the algorithm to process more information at once.
Both algorithms are good at isolating passage sections within the text that are relevant to search queries. However, SMITH can do it on a larger scale and in a fraction of the time. As I mentioned earlier, both algorithms are more similar than different.
While Google has stated that it doesn’t expect these algorithms to have a long-term effect on website rankings, many B2B businesses have nonetheless reaped benefits from adapting to these updates.
How Does Google’s NLP Affect Marketing?
B2B marketers need to understand and adapt to every update to Google’s search algorithms because every change has the potential to result in significant search and ranking implications. While some updates are more impactful than others, the changes in the results are noticeable enough for marketers to consider tweaking their methods.
Let's turn our attention towards the effect of Google’s NLP algorithms on B2B marketing and how you can optimize your website and content to stay ahead of the curve.
Focus on Long-Tail Keywords
Currently, the average conversion rate of long-tail keywords is 36%. However, before the updates, many online marketers shied away from incorporating these keywords because they could be taken out of context by Google’s algorithm.
The floodgates are now open, and there has never been a better time to optimize for long-tail keywords — especially since they account for more than 70% of searches. Some of the reasons they are preferred over single-word queries or search phrases are:
- They provide context to both the search query and your content
- Amazingly, they can help you rank high for single keywords
- A strong long-tail keyword ranking is key to building a robust marketing funnel
- They are better suited for voice searches of all kinds
The advantage of long-tail keywords has been bolstered by Google’s NLP updates, especially for searches that deviate from conventional typed queries (e.g., voice assistant searches). You can check out this keyword database to incorporate long-tail keywords into your content.
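As a rough illustration of how you might surface long-tail candidates from an existing keyword list, the sketch below simply separates phrases by word count (the sample keywords are invented, and three-plus words is a common rule of thumb, not an official threshold):

```python
# Rough sketch: separating long-tail phrases (3+ words) from head terms
# in a keyword list. The sample keywords below are invented for illustration.

keywords = [
    "crm",
    "crm software",
    "best crm software for small b2b teams",
    "email marketing",
    "how to automate b2b email follow-ups",
]

def is_long_tail(phrase: str, min_words: int = 3) -> bool:
    """Treat any phrase of `min_words` or more words as long-tail."""
    return len(phrase.split()) >= min_words

long_tail = [kw for kw in keywords if is_long_tail(kw)]
print(long_tail)
# ['best crm software for small b2b teams', 'how to automate b2b email follow-ups']
```

In practice you would also weigh search volume and intent, but even this crude cut shows how much more context a long-tail phrase carries than a head term.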
Prioritize User Intent
Starting with the Hummingbird update, Google has made it clear that user intent will be the primary driver when ranking websites in organic search. The launches of BERT and SMITH are among the first steps in this direction.
To gain an advantage over your competitors on the search engine results pages (SERPs), your content (and its accompanying keywords) must adequately interpret users’ needs and directly address what they’re looking for.
Provide Specific Answers
Google’s algorithm updates value content specificity almost as much as search intent. There isn’t any space for vague, deflective content as the search engine is constantly refined to provide users with the best possible search experience through concise query responses.
For instance, if you run a blog focused on online education and want to rank for the query “online course platforms”, your content should cover what someone would need to know about online course platforms: the different types of platforms available, the features of each one, and what to look for when shopping for a solution. One Hour Professor offers a good example, an extremely detailed guide that explains everything a prospective buyer would need to know, and, not surprisingly, it ranks very well on Google for this specific term.
When you write this type of content, Google will then pick out a few passages from your articles and show them on SERPs as part of its new passage-based indexing feature.
Pay Attention to Formatting
Website formatting remains one of the signals Google weighs when deciding whether your page deserves a high ranking based on on-page SEO. It dictates how images and textual content appear on your website, where various items sit on each page within your domain, and how you use elements like pillar pages, metadata, and sitemaps.
Google’s webmaster guidelines mark down websites for subversive tactics like thin content, HTML cloaking and keyword stuffing, among others.
Focus on Backlink Quality
The importance of backlinks to an off-page SEO strategy can hardly be overstated. However, with the new updates, Google is giving more consideration to the quality of these links from other websites, not just the quantity.
Therefore, focus on sourcing links from authoritative, relevant websites with a strong track record in the topics for which you’re looking to rank. Accumulating low-quality backlinks might actually harm you in the long run by placing you within Google’s “flagging” crosshairs.
Much like any other online business out there, B2B brands also market to an internet audience. Therefore, any overarching changes on Google’s search engine should also fall within your SEO purview.
BERT and SMITH are the latest from Google’s stable and, if met with proactive website optimizations, they could improve the strength of your B2B marketing efforts and deliver positive results.
Take the time to consider and optimize for each of the aforementioned areas and you should see an increase in your conversion metrics.
Nick Chernets is the founder of DataForSEO, a leading provider of SEO data for the marketing technology industry. With an API-led approach to data delivery, DataForSEO is enabling hundreds of software businesses to enhance their products with reliable, accurate, and fresh data.