When Does Google RankBrain Algorithm Influence a Query Result

RankBrain uses a set of databases about people, places, and things (also called entities) as the foundation for the algorithm and its machine-learning processes.

Query words are then decomposed into word vectors, mathematical representations that give each word an "address"; words with similar meanings point in similar "directions".

When Google processes an unknown query, it uses those mathematical relationships to find a better match and return related results.
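To make the "word vector" idea concrete, here is a minimal, purely illustrative Python sketch. The toy three-dimensional vectors and the vocabulary are invented for this example; real systems learn embeddings with hundreds of dimensions from huge text corpora.

    # Illustrative only: toy word vectors standing in for learned embeddings.
    import math

    word_vectors = {
        "consumer": [0.9, 0.1, 0.3],
        "predator": [0.8, 0.2, 0.4],   # similar meaning, so a similar "direction"
        "label":    [0.1, 0.9, 0.2],
        "guitar":   [0.1, 0.2, 0.9],   # unrelated, so it points elsewhere
    }

    def cosine_similarity(a, b):
        """Higher values mean the two words point in a similar direction."""
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm

    for word, vec in word_vectors.items():
        if word != "consumer":
            print(word, round(cosine_similarity(word_vectors["consumer"], vec), 2))

Words with related meanings score close to 1.0, which is how an engine can relate an unfamiliar query term to terms it already understands.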

Over time, Google refines results based on user interaction and machine learning to improve the match between users’ search intent and search results returned by Google.

It is important to note that stop words such as "and" or "the", which search engines used to throw away, are now included in RankBrain's analysis. RankBrain is also designed to better understand queries in order to return the best results, especially queries with negative intent, for example those that use words like "not" or "without".

Short of actually being able to read a website’s content, RankBrain is able to identify the context of keywords in a webpage or website.

By doing this it is able to provide search results based on a user’s ‘true’ intent (as opposed to blindly matching those websites that just contain the words that you typed).

It interprets your language and queries – whether you use formal or colloquial terms – then relates them to other similar searches based on previous intent and results. This will then give you the closest results to what you meant by your query.

Borrowing Google's own example, the query "what is the label of a consumer at the highest level of a food chain?" sounds like gibberish to anyone but the user.

With RankBrain, however, Google can make a guess as to what these unfamiliar words mean.

This then allows Google to interpret the query that matched the user’s intent, providing results that detail, in this instance, where a consumer fits in the food chain.

When the Google RankBrain Algorithm Influences a Query Result

RankBrain processes queries in all languages and in all countries.

RankBrain is most involved when a query is unique or previously unseen.

For example, prior to Google RankBrain’s announcement, I wrote an article about something that I observed during my own research on Google.

It started when I was looking for information on water rights in Nevada during the drought in California (we share a river with them). When I searched for water rights in Clark County or Las Vegas, Google returned plenty of information on the topic. However, when I searched for the water rights of Mesquite, NV (a city about 90 km to the north), I got the local water authority and nothing else connected to water usage rights. Instead, I was shown results about mesquite trees, mesquite wood, mesquite barbecue chips, and so on.

At the time I did not know what this behavior was called, only that it existed. Now, thanks to Google's announcement, we know it was RankBrain.

Why? Because Google did not know how the entity "mesquite" (a thing or a place) related to "water rights", it returned a "kitchen sink" of results.

The idea of the "best option" is that, over time, Google will work out which option best suits this query.

If you've been using search for a long time, you may remember when Google would show results for words other than the ones you actually typed, substituting what it assumed you meant. That behavior was a predecessor of RankBrain.

Read more Clever ways to optimize your website for RankBrain


The effect of the Google RankBrain algorithm on your website

The most important thing for the RankBrain algorithm is content that is created for users and provides real information. Therefore, you should review your content carefully and focus on information rather than inflating your text with empty sentences.

The RankBrain algorithm tries to understand your content better. For this reason, it is important to help Google make sense of the content on your website.

Instead of concentrating on a single keyword, provide users with comprehensive content by citing different sources. In addition, you can repeat your keywords a few times in your text to rank higher on Google.

First Assumption – User Behavior is Shifting and will Shift even Further

Since search results are becoming better and better in terms of relevance to the user, a top-3 spot (whether that spot is a local listing, a knowledge graph listing, or a standard result) will get more clicks from users than ever, so long as it is an organic listing and not a paid one, since paid listings carry an obvious yellow button-like "Ad" label.

The top 3 spots will eat up the clicks of the rest of the SERP listings. That's because people are increasingly satisfied with the results in the top 3 spots, so we have a natural tendency to check only those three, almost certain that we'll be satisfied with what we find.

Second Assumption – Competition is Going to get Tighter

Search took a leap up the competitive ladder. What RankBrain really did is prioritize meaningful results. "Strings to Things" isn't all that friendly to those playing the search game primarily for traffic. All the articles that are less comprehensive than their 10x counterparts will drop in rankings. Only the best will get ranked; all the other mediocre results will start to fall off.

Search is a zero-sum game. It’s always been. And just when we thought the game wasn’t getting any harder, it bites us back in the ass.

Third Assumption – Machine Learning will Crush Spam and Black Hat Practices

We can conclude that this is the beginning of many years of machine learning being incorporated into Google (because it worked better than they thought it would). Considering that Google's direction has been to fight spam and close up loopholes (we've felt the effect of this quite strongly since 2010), it's a surprise that RankBrain is not an algorithm that targets black hat tactics.

That being said, I think that the next machine-learning algorithm Google will launch after RankBrain would deal strongly with fighting spam and loopholes. If RankBrain worked better than they expected, I’m quite sure that they will use the positive result in shutting down spam and black hat problems.

Fourth Assumption – You can affect RankBrain

Google feeds RankBrain data "offline", meaning it does not learn from the live web in real time. Google feeds it only whatever data it considers good enough. So coining terms that spread, such as "Growth Hacking", "Inbound Marketing" or "Link Earning", could actually signal that you are an authority on that term and concept.

If this is fed to RankBrain and it recognizes you as the source of the term, that could turn out to be a positive signal for your site and all that is related to it. It’s not easy, but it’s definitely something that I assume could affect the algorithm as it is fed.

Read more What is Google RankBrain and How Does it Work?

What is Google RankBrain and How Does it Work?

RankBrain is an artificial intelligence (AI) program used to help process Google search queries. It is designed as a machine learning system that embeds vast amounts of written language into mathematical entities (vectors) that a computer can understand.

RankBrain is unique because, when it sees an unfamiliar word or phrase, it uses AI to guess at terms with a meaning similar to what the user typed, and then filters the results accordingly.


Google has taken this exceptional initiative to give its users only the most useful results for their search queries. Google's goal is to offer searchers the most appropriate and relevant content. According to Google, RankBrain offers different results in different countries for the same query. This is because measurements differ in each country, despite the similar names.

Does Google RankBrain Work?

Is the AI doing a good job? So far, it seems like it. After all, Google promoted it from merely processing parts of the unknown key phrases to using it for all search queries.

And why wouldn’t they? After all, RankBrain appears to be doing a better job at improving search results than the Google engineers themselves. In fact, when it was pitted against a number of engineers to find the best page for a search query, the AI outperformed the humans by 10 percent.

 

Quite impressive, isn’t it?

The cool thing about RankBrain is that, in contrast to global changes to the search algorithm, it can improve search results on a per-keyword basis. This makes the SERPs more precise than before and allows for more granular improvements.

Also, ironically, even though RankBrain is a machine learning algorithm, it actually increases the influence of human users on search results. That’s because the AI can use direct feedback from how users interact with your content to judge its quality. For that reason, you need to focus less on pleasing the machines (read algorithm) and more on actually swaying people to click on your stuff.
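Google has not published how RankBrain weighs user interaction, so the following Python sketch is purely illustrative: it re-orders a result set by blending a relevance score with an observed click-through rate. The field names and the 70/30 weighting are assumptions, not anything Google has confirmed.

    # Purely illustrative: re-rank results by blending relevance with engagement.
    # The weights and fields are invented; Google has not disclosed its signals.
    results = [
        {"url": "site-a.example", "relevance": 0.92, "ctr": 0.05},
        {"url": "site-b.example", "relevance": 0.85, "ctr": 0.21},
        {"url": "site-c.example", "relevance": 0.80, "ctr": 0.12},
    ]

    def blended_score(result, w_relevance=0.7, w_engagement=0.3):
        # Results that users click more often get nudged upward.
        return w_relevance * result["relevance"] + w_engagement * result["ctr"]

    for r in sorted(results, key=blended_score, reverse=True):
        print(r["url"], round(blended_score(r), 3))

The point is simply that engagement can feed back into ordering; the real system is far more sophisticated.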

RankBrain is a machine-learning artificial intelligence system that transforms words into mathematical representations. While this is the technical definition of RankBrain, in practice Google uses the algorithm to improve its search results. With RankBrain, Google learns what users want and what they're looking for, and aims to deliver better results to them. By taking variables such as search location and the keywords entered into account, Google aims to learn exactly what the user is looking for and how to reach the right conclusion.

According to Google sources, the RankBrain algorithm is considered the third most important factor in deciding which results should be shown for any search. RankBrain also affects less than 25 per cent of searches worldwide.

Read more When Does Google RankBrain Algorithm Influence a Query Result


How will Google Possum Algorithm change local SEO

The latest algorithm update, dubbed the Possum algorithm by SEO experts, affects local search and the appearance of business listings. It affects the ranking of the local 3-pack and local finder results.

Many local businesses with an office address outside the main city found it difficult to rank in local searches. The Possum update boosts the local rankings of these businesses. This is good news, because they can now get listed for keywords that include the city name even though their physical address is outside the city.

Google filters out business results that share the same domain name or phone number, showing only one or two of them. The Possum update filters even more such businesses. A business may now fail to show up in the search results if it is located in the same building as a similar listing, or if it has the same owner, even when the business names are different.

The Google Possum update made local search result filtering more sophisticated. There is no need to worry about local rankings dropping, as the update is not designed to hurt them; rankings will actually rise organically for competitive keywords.

Separation between local and organic search

Local and organic search are drifting apart with the latest update. For example, in the past if the URL you were linking to in your GMB listing was filtered organically, it would have had a negative impact. Since the update, this no longer seems to be the case.

Many local businesses will likely see positive results from this, while businesses without a local market might face some competition for rankings.

Location and affiliation

Google is now filtering based on address and affiliation. For businesses that fall outside city limits, ranking for keywords that include the city name used to be difficult. Since Possum, these businesses have seen a huge spike in rankings.

Conversely, this may cause rankings to drop for clients with one main GMB listing and several affiliated GMB listings.

For example, we have a client who owns a clinic with a primary location, but has separate GMB listings for individual doctors. Google isn’t necessarily removing the listing or enforcing a penalty, but is simply picking the most relevant listing and filtering out the others that are too similar – in some cases, Google is suspected to be going as far as to find an affiliation by owner, even if addresses, phone numbers, websites and Google accounts are separate.

These listings haven't disappeared; they can still be viewed by zooming in further on the local finder.

The location of the user

If you're searching from a physical location other than that of the business, you're likely to see completely different results. As a general rule under Possum: the further the distance, the lower the ranking. This is unsurprising, as many local businesses are looking to optimize for "near me" searches, which doubled from 2014 to 2015.
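The update doesn't document exactly how distance is factored in, but a minimal sketch of "the further the distance, the lower the ranking" might look like the Python snippet below. The haversine distance is standard; the exponential decay and its 10 km constant are my own assumptions for illustration.

    # Illustrative only: score a listing lower the further it is from the searcher.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def proximity_score(base_score, searcher, business, decay_km=10.0):
        # Hypothetical exponential decay: nearby listings keep most of their score.
        distance = haversine_km(*searcher, *business)
        return base_score * math.exp(-distance / decay_km)

    searcher = (36.17, -115.14)                               # searching from Las Vegas
    print(proximity_score(1.0, searcher, (36.17, -115.20)))   # a nearby business
    print(proximity_score(1.0, searcher, (36.80, -114.07)))   # a business much further away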

Read more How does Google Possum Algorithm change search results


How does Google Possum Algorithm change search results

Here are the changes the search engine giant has introduced through its Possum update:

  1. Ranking Advantage For Businesses Beyond The City Limits

Previously, it was a major pain for businesses located outside the city limits to appear in the Local Pack search results. A lot of local establishments that serve major cities but have their brick-and-mortar locations outside the city perimeter found it enormously tough to get listed in local searches. In spite of putting all their effort into highly optimized listings, they were stuck in "ranking purgatory".

Possum has fixed the issue by making the geographical proximity of a business a less important factor in getting a top ranking. Now, a business that is not within the physical limits of a city but close to it can see a huge spike in rankings, because the algorithm runs a proximity test before preparing the listings.

  2. Filter Enhancement Based On Address And Affiliation

In local search, there are tons of businesses with multiple listings, which leads to the same listing appearing repeatedly in search results. Until the Possum update, Google detected such duplicate listings based on the phone numbers provided by the businesses or their domain names.

Now, Google has enhanced its local filters by adding two new dimensions, "address" and "affiliation". It essentially means that if the physical addresses of multiple listings in the same category are similar, only the "best" and "most relevant" one will be shown in search results, whereas the others will be filtered out.

However, Possum is not penalizing the other listings. Rather, it is simply working like an organic search filter to push the other listings lower in the SERPs and ensure a better user experience. (A small, purely illustrative sketch of this kind of filtering appears after this list.)

  3. Listings Made Sensitive To The Locations Of Searchers

The geographical locations of the searchers were never so crucial for ranking in local search results before the launch of Possum. Earlier, it was only the keywords or the search terms which influenced search results heavily, but now, as the latest algorithm has changed, the physical location of the searcher matters the most.

Google has decided to offer an outstanding search experience to the mobile community with the help of this location-sensitivity. It is now using IP addresses of the searchers to provide tailored search results and hence, it has become of utmost importance to set the right location in order to trigger the most accurate listings.

  4. Search Results Affected By Slight Keyword Variations

In the past, two searches with almost identical terms used to yield the same local results. That has changed, as Possum has made Google sensitive to even the slightest variations in search terms.

As the local listings have become extremely sensitive to keyword changes, the results are also varying widely for two slightly different search terms. So, make sure that you test and evaluate your Local SEO efforts by using multiple iterations of a particular key phrase and pick the one with the maximum search volume.

  5. Local Search Filter Precedence Over Organic Search Filter

Before the arrival of Possum, Google's local search filter was dependent on its organic search filter. As a result, a lot of businesses linked to websites that were filtered out of organic results were also filtered out of local search results.

The change in algorithm separates the local search filter from the organic search filter so that each can operate independently. Hence, businesses that rank poorly in organic SERPs can now achieve good rankings in local search results.
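As noted under point 2 above, Possum keeps only the "best" of several listings that share an address or owner. Here is a small, hypothetical Python sketch of that kind of deduplication; the field names, the grouping key, and the "best" criterion (review count) are assumptions made purely for illustration.

    # Hypothetical sketch of Possum-style filtering: group listings that share an
    # address and owner, then surface only the strongest listing from each group.
    from collections import defaultdict

    listings = [
        {"name": "Smith Dental",       "address": "100 Main St", "owner": "J. Smith", "reviews": 120},
        {"name": "Dr. Jones at Smith", "address": "100 Main St", "owner": "J. Smith", "reviews": 15},
        {"name": "Valley Plumbing",    "address": "8 Oak Ave",   "owner": "A. Lee",   "reviews": 40},
    ]

    groups = defaultdict(list)
    for listing in listings:
        key = (listing["address"].lower(), listing["owner"].lower())   # affiliation key
        groups[key].append(listing)

    # Keep the "best" listing per group; the rest are filtered, not penalized.
    shown = [max(group, key=lambda l: l["reviews"]) for group in groups.values()]
    for listing in shown:
        print(listing["name"])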

Read more How will Google Possum Algorithm change local SEO


How to protect your website from the Google Fred Algorithm

Did you notice a substantial drop in organic traffic on your site? If so, what can you do to recover from Fred, and how do you protect yourself from future content, backlink, and advertising display algorithm updates? The following solutions will have a positive impact on your website's SEO.

Create More Engaging Content:

When creating new content, make sure it's relevant and of the highest quality. Don't skimp on resources for quality; hire only the best content producers, especially those handling the copywriting.

Content should be written for humans, not search engines. Google's algorithms get smarter every day, and the latest Fred update is just a continuation of Google's path toward having nothing but valuable content across the web, which includes good writing flow, spelling, and grammar.

Above all, content should be free of grammatical errors, fact-checked, and able to educate and entertain your readers. Remember to chunk content into smaller sections with subtitles (H tags, for you SEOs), and steer far clear of "black hat" SEO techniques like keyword stuffing. This advice has been the norm for many years now, but we will continue to issue warnings here until we see compliance. Resistance is futile.

There also used to be a case for shorter content due to modern online attention spans, but based on studies like Searchmetrics' Ranking Factors & Rank Correlations (searchmetrics.com), Google rewards websites with longer-form content. The days of 300-word blog posts are gone for those who want success.

Update Outdated Content:

Most of the websites affected by the Fred update had outdated content that raised concerns about irrelevancy, something Google obviously frowns upon.

Update your content, paying close attention to anything that no longer provides value, such as products that are no longer available or events that have passed. Along the way, optimize each piece of content with updated target keywords and calls to action. Make sure all of your lead forms are current, and that outbound links are not directing clicks to dead webpages.

If you must delete content (like a temporary sales page or past job postings), don't simply hit the delete button or unpublish it. The best SEO practice is to redirect it to another page through what SEOs call a 301 redirect. Point old product pages to the newest product, or old job postings to your careers page.
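How you implement a 301 depends on your server or CMS, but as a simple illustration, here is a minimal Flask (Python) sketch that permanently redirects a retired product page to its replacement. The URL paths are placeholders; on Apache or nginx you would normally do this in the server configuration instead.

    # Minimal illustration of a 301 (permanent) redirect using Flask.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-product")
    def old_product():
        # 301 tells search engines the move is permanent, so link equity follows.
        return redirect("/new-product", code=301)

    if __name__ == "__main__":
        app.run()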

Check those Back-links:

Back-links are simply links from other websites to a page or post on your website. The problem is that some of the websites linking back to you are of poor quality or, even worse, unknowingly display porn or other unwanted links deployed by malware or black-hat spammers. This is where it's imperative to have an SEO perform some back-link analysis and resolve any bad back-links. We suggest starting with webmaster tools (Search Console), but also using third-party tools such as Moz to identify and formally disavow bad links.
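Once you have identified bad linking domains (with Search Console, Moz, or similar), the disavow step comes down to uploading a plain-text file to Google's disavow tool. Below is a small Python sketch that writes such a file from a list of domains; the domains are placeholders, and lines starting with '#' are treated as comments by the tool.

    # Sketch: build a disavow.txt from a list of low-quality linking domains.
    # The domains are placeholders; review every link carefully before disavowing.
    bad_domains = ["spammy-links.example", "malware-widgets.example"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Domains disavowed after a backlink audit\n")
        for domain in bad_domains:
            f.write(f"domain:{domain}\n")

    print(open("disavow.txt").read())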

Less Intrusive Advertising and Affiliates:

From our studies, the Google Fred algorithm update had the largest impact on content-heavy websites. Many of these types of websites, such as a news blog or product review platform, create income through advertising and affiliate marketing.

The problem is that some of these ads (you know, those annoying pop-ups that are impossible to close without losing the page you're viewing) are too intrusive. The same goes for some affiliate ads, such as the ones within the text you are reading.

The Fred algorithm penalized those sites with the most intrusive ads, which take away from user experience. Again, user experience is the main focus for Google, and always will be.

Read more Something important about Google algorithm update


How does the Google Fred Algorithm affect your website?

After Fred rolled out, it was clear that it was a significant core ranking update tied to quality. If you’ve read my previous posts about Google’s major core ranking updates, you will see many mentions of aggressive monetization, advertising overload, etc. Well, Fred seemed to pump up the volume when it comes to aggressive monetization. Many sites that were aggressively monetizing content at the expense of users got hit hard.

With that quick intro out of the way, let's hop into specific examples of impact from the Fred update.

Example 1 – Getting smoked.
A major hit based on UX barriers, aggressive monetization, and low-quality user experience.

The next two examples will cover a site stuck in the gray area of Google's quality algorithms and a site that saw a major surge. First, though, it's time to enter the dark side of Fred. Many sites got hammered by the 3/7 update, with some losing over 50% of their Google organic traffic overnight (and some losing up to 90%).

The next site I’m going to cover has dealt with Panda and major core ranking updates in the past. It’s a relatively large site, in a competitive niche, and is heavily advertising-based from a monetization standpoint.

When Fred rolled through, the site lost over 60% of its Google organic traffic overnight. And with Google representing a majority of its traffic, the site’s revenue was clearly hit hard. It’s also worth noting that Google organic traffic has dropped even further since 3/7 (and is down approximately 70% since Fred rolled out.)

Drop from Fred update.

UX Barriers Galore
If you’ve read my previous posts about Google’s major core ranking updates focused on quality, then you’ve seen many examples of what I call “low quality user engagement”. Well, this site had many problems leading up to the 3/7 update. For example, there were large video ads smack in the middle of the content, deceptive ads that looked like download buttons (which users could mistakenly click thinking they were real download buttons), flash elements that don’t load and just display a shell, and more.

And expanding on the deception point from above, there were links in the main content that looked like internal links, but instead, those links whisked users off to third party advertiser sites. Like I’ve said a thousand times before, “hell hath no fury like a user scorned”. And it looks like Fred had their back.

To add insult to injury, the site isn’t even mobile-friendly. When testing the site on a mobile device, the content is hard to see, it’s hard to navigate around the site, etc. Needless to say, this isn’t helping matters. I can only imagine the horrible signals users are sending Google about the site after visiting from the search results.

This is why it’s so important to analyze your site through the lens of these algorithm updates. Understand all “low quality user engagement” problems riddling your site, weed them out, and improve quality significantly. That’s exactly what Google’s John Mueller has been saying when asked about these updates (including Fred).

Example 2 – Tired of the gray area. Better late than never.
The next example I’m going to cover is a large-scale site driving a lot of traffic from Google organic historically. The site has been in the gray area of Google’s quality algorithms for a long time and has seen major drops and surges over time.

The company finally got tired of sitting in the gray area (which is maddening), and decided to conduct a full-blown quality audit to identify, and then fix, “low quality user engagement” problems. They began working on this in the fall of 2016 and a number of the changes did not hit the site until early 2017. In addition, there are still many changes that need to be implemented. Basically, they are in the beginning stages of fixing many quality problems.

As you can see below, the site has previously dealt with Google’s major core ranking updates focused on quality. Here’s a big hit during Phantom 2 in May of 2015, and then recovery during the November 2015 update:
Impact from Phantom 2 in May of 2015

When a site is in the gray area of Google's quality algorithms, it can see impact with each major update. For example, it might drop by 20% one update, only to surge 30% during the next. But then it might get smoked by another update, and then regain some of the losses during the next. I've always said that the gray area is a maddening place to live.

When Fred rolled out on 3/7/17, the site dropped by approximately 150K sessions per day from Google organic. Now, the site drives a lot of traffic so that wasn’t a massive drop for them. But it also wasn’t negligible. In addition, the site increased during the early January update, so the company was expecting more upward movement, and not less.

Here's a close-up view of the drop (screenshot not shown here).

Initial drop after Fred rolls out.

It can be frustrating for site owners working hard on improving quality to see a downturn, but you need to be realistic. Even though the site has a lot of high quality content, it also has a ton of quality issues. I’ve sent numerous deliverables through to this company containing problems to address and fix. Some of those changes have been rolled out, while others have not. Currently, there are still many things to fix from a quality perspective.

Example 3 – Soaring With Fred
This final example is an amazing case study. It's a site I've helped extensively over the years from a quality standpoint, since it had been impacted by Panda and Phantom in the past. It has certainly seen its share of ups and downs from major algorithm updates.

When Fred rolled out, it didn’t take long for the site owner to reach out to me in a state of Google excitement.

“Seeing big gains today. Traffic levels I haven’t seen in a long time… Is this going to last??”

Needless to say, I immediately dug in.

What I saw was a thing of beauty. It was a surge so strong that it would make any SEO smile. The site literally jumped 125% overnight (and this isn’t a small site with a minimal amount of traffic). Google organic surged by 110K sessions per day and has remained at that level ever since. Here are screenshots from Google Analytics and Google Search Console:

Surging after Fred rolls out.

Clicks and impressions surge after Fred rolls out.

Reviewing the audits I performed for the site revealed many low quality content problems and “low quality user engagement” barriers. And that included aggressive monetization. So the remediation plan covered an anti-Panda and anti-Phantom strategy.

Read more Something important about Google algorithm update


How Often Does Google Update Its Algorithm?

Our first peek into this data came in spring of 2010, when Google’s Matt Cutts revealed that “on average, [Google] tends to roll out 350–400 things per year.” It wasn’t an exact number, but given that SEOs at the time (and to this day) were tracking at most dozens of algorithm changes, the idea of roughly one change per day was eye-opening.

In fall of 2011, Eric Schmidt was called to testify before Congress, and revealed our first precise update count and an even more shocking scope of testing and changes:

“To give you a sense of the scale of the changes that Google considers, in 2010 we conducted 13,311 precision evaluations to see whether proposed algorithm changes improved the quality of its search results, 8,157 side-by-side experiments where it presented two sets of search results to a panel of human testers and had the evaluators rank which set of results was better, and 2,800 click evaluations to see how a small sample of real-life Google users responded to the change. Ultimately, the process resulted in 516 changes that were determined to be useful to users based on the data and, therefore, were made to Google’s algorithm.”
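To put those numbers in perspective, the arithmetic behind that funnel is simple: only a small fraction of evaluated ideas became live changes, yet the launch rate still worked out to more than one change per day. A quick sketch using the figures from the testimony:

    # Figures for 2010, from Eric Schmidt's 2011 testimony.
    precision_evaluations = 13311
    launched_changes = 516

    print(f"Launch rate: {launched_changes / precision_evaluations:.1%}")   # roughly 3.9% of evaluated ideas
    print(f"Changes per day: {launched_changes / 365:.2f}")                 # roughly 1.4 per day in 2010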

Later, Google would reveal similar data in an online feature called “How Search Works.” Unfortunately, some of the earlier years are only available via the Internet Archive, but here’s a screenshot from 2012:

Note that Google uses "launches" and "improvements" somewhat interchangeably. This diagram provided a fascinating peek into Google's process, and also revealed a startling jump from 13,311 precision evaluations (changes that were shown to human evaluators) to 118,812 in just two years.

Is the Google algorithm heating up?

Since MozCast has kept the same keyword set since almost the beginning of data collection, we're able to make some long-term comparisons. The graph below represents five years of temperatures (chart not shown here). Note that the system was originally tuned (in early 2012) to an average temperature of 70°F; the redder the bar, the hotter the temperature.

You'll notice that the temperature ranges aren't fixed; instead, I've split the scale into eight roughly equal buckets (i.e. they represent the same number of days). This gives us a little more sensitivity in the more common ranges.

The trend is pretty clear. The latter half of this 5-year timeframe has clearly been hotter than the first half. While the warming trend is evident, it's not a steady increase over time like Google's update counts might suggest. Instead, we see a stark shift in the fall of 2016 and a very hot summer of 2017. More recently, we've actually seen signs of cooling. Below are the means and medians for each year (note that 2014 and 2019 are partial years):

  • 2019 – 83.7° / 82.0°
  • 2018 – 89.9° / 88.0°
  • 2017 – 94.0° / 93.7°
  • 2016 – 75.1° / 73.7°
  • 2015 – 62.9° / 60.3°
  • 2014 – 65.8° / 65.9°

Note that search engine rankings are naturally noisy, and our error measurements tend to be large (making day-to-day changes hard to interpret). The difference from 2015 to 2017, however, is clearly significant.

Are there really 9 updates per day?

No, there are only 8.86 – feel better? Ok, that’s probably not what you meant. Even back in 2009, Matt Cutts said something pretty interesting that seems to have been lost in the mists of time…

“We might batch [algorithm changes] up and go to a meeting once a week where we talk about 8 or 10 or 12 or 6 different things that we would want to launch, but then after those get approved … those will roll out as we can get them into production.”

In 2016, I did a study of algorithm flux that demonstrated a weekly pattern evident during clearer episodes of ranking changes. From a software engineering standpoint, this just makes sense — updates have to be approved and tend to be rolled out in batches. So, while measuring a daily average may help illustrate the rate of change, it probably has very little basis in the reality of how Google handles algorithm updates.
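For what it's worth, the arithmetic implied by the 8.86-per-day figure also lines up with the batching picture; a quick sketch (the weekly and annual counts below are simply what 8.86 per day implies, not separately reported numbers):

    # What an average of 8.86 changes per day implies about batch sizes.
    updates_per_day = 8.86

    print(f"Per week: {updates_per_day * 7:.0f}")     # ~62 changes per weekly batch
    print(f"Per year: {updates_per_day * 365:.0f}")   # ~3,234 launches per year

Compare that with the "8 or 10 or 12" items Cutts described per weekly meeting back in 2009, and the acceleration is hard to miss.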

Do all of these algo updates matter?

Some changes are small. Many improvements are likely not even things we in the SEO industry would consider “algorithm updates” — they could be new features, for example, or UI changes.

As SERP verticals and features evolve, and new elements are added, there are also more moving parts subject to being fixed and improved. Local SEO, for example, has clearly seen an accelerated rate of change over the past 2-3 years. So, we’d naturally expect the overall rate of change to increase.

A lot of this is also in the eye of the beholder. Let’s say Google makes an update to how they handle misspelled words in Korean. For most of us in the United States, that change isn’t going to be actionable. If you’re a Korean brand trying to rank for a commonly misspelled, high-volume term, this change could be huge. Some changes also are vertical-specific, representing radical change for one industry and little or no impact outside that niche.

On the other hand, you’ll hear comments in the industry along the lines of “There are 3,000 changes per year; stop worrying about it!” To me that’s like saying “The weather changes every day; stop worrying about it!” Yes, not every weather report is interesting, but I still want to know when it’s going to snow or if there’s a tornado coming my way. Recognizing that most updates won’t affect you is fine, but it’s a fallacy to stretch that into saying that no updates matter or that SEOs shouldn’t care about algorithm changes.

Ultimately, I believe it helps to know when major changes happen, if only to understand whether rankings shifted due to something we did or something Google did. It's also clear that the rate of change has accelerated, no matter how you measure it, and there's no evidence to suggest that Google is slowing down.

Read more: How Often Does Google Update Its Algorithm?

_______________________________________________________________________________

Please contact us for seo service packages at TDHSEO.COM.

TDHSEO Team

Email: tdhseo@gmail.com
Skype: tdhseo
https://www.facebook.com/tdhseo

Thank you!