The effect of the Google RankBrain Algorithm on your website

The most important thing for the RankBrain algorithm is content that is created for users and provides real information. Focus on substance, and resist inflating your text with empty sentences.

The RankBrain algorithm tries to understand your content, so it is worth helping Google along: structure the content on your website so that it is easy to interpret.

Instead of concentrating on a single keyword, give users comprehensive content that draws on different sources. You can also repeat your keywords a few times in the text to rank higher on Google.

First Assumption – User Behavior is Shifting and will Shift even Further

Since search results keep getting better in terms of relevance to the user, the top 3 spots – whether a local listing, a knowledge graph listing, or a standard organic result – attract more clicks than ever, so long as they are organic and not paid (paid listings carry an obvious yellow 'Ad' label).

The top 3 spots will eat up the clicks of the rest of the SERP listings. People are so consistently satisfied by the top 3 results that we have a natural tendency to check only those, almost certain we'll find what we need.

Second Assumption – Competition is Going to get Tighter

Search took a leap up the competitive ladder. What RankBrain really did is prioritize meaningful results. "Strings to things" isn't all that friendly to those playing the search game primarily for traffic. Articles that are less comprehensive than their 10x counterparts will drop in rankings. Only the best will get ranked; all the mediocre results will start to fall off.

Search is a zero-sum game. It always has been. And just when we thought the game couldn't get any harder, it bites us in the ass.

Third Assumption – Machine Learning will Crush Spam and Black Hat Practices

We can conclude that this is the beginning of many years of machine learning being incorporated into Google (because it worked better than they thought it would). Considering Google's direction has been to fight spam and close loopholes (we've felt the effect of this quite strongly since 2010), it's a surprise that RankBrain is not an algorithm targeting black-hat tactics.

That being said, I think the next machine-learning algorithm Google launches after RankBrain will deal strongly with fighting spam and loopholes. If RankBrain worked better than they expected, I'm quite sure they will apply that positive result to shutting down spam and black-hat schemes.

Fourth Assumption – You can affect RankBrain

Google feeds RankBrain data 'offline', meaning it does not learn from the live web in real time. Whatever Google deems good enough to feed RankBrain is what it learns from. So coining terms that spread, such as 'Growth Hacking', 'Inbound Marketing', or 'Link Earning', could actually signal that you are an authority on that term and concept.

If this is fed to RankBrain and it recognizes you as the source of the term, that could turn out to be a positive signal for your site and everything related to it. It's not easy, but it's definitely something I assume could affect the algorithm as it is trained.

Read more What is Google RankBrain and How Does it Work?



What is Google RankBrain and How Does it Work?

RankBrain is an artificial intelligence (AI) program used to help process Google search queries. It is designed as a machine learning system, so it can embed vast amounts of written language into mathematical entities (vectors) that a computer can understand.

RankBrain is unique because, when it sees an unfamiliar word or phrase, it uses AI to infer terms with a similar meaning to what the user typed, and then filters the results accordingly.
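To make the idea of vectors concrete, here is a minimal, hypothetical Python sketch of matching by meaning rather than by exact words. The phrases, the three-dimensional vectors, and the numbers are all invented for illustration; Google's real embeddings are learned from vast amounts of text and are far more sophisticated.

```python
import math

# Hypothetical "meaning" vectors for a few known queries (values invented).
known_queries = {
    "cheap flights": [0.90, 0.10, 0.20],
    "low cost air tickets": [0.85, 0.15, 0.25],
    "hotel booking": [0.20, 0.90, 0.10],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 means closer in meaning."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def closest_known_query(vector):
    """Map an unfamiliar query's vector to the most similar known query."""
    return max(known_queries, key=lambda q: cosine_similarity(known_queries[q], vector))

# An unseen phrase whose (made-up) vector lands near "cheap flights".
print(closest_known_query([0.88, 0.12, 0.22]))  # -> cheap flights
```

Even this toy version shows the payoff: a query the system has never seen can still be routed to results for a query it understands well.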


Google has taken this initiative to give its users only the most useful results for their search queries. Google's goal is to offer searchers the most appropriate and relevant content. According to Google, RankBrain can offer different results in different countries for the same query, because measurements in each country differ despite their similar names.

Does Google RankBrain Work?

Is the AI doing a good job? So far, it seems like it. After all, Google promoted it from processing only a share of unfamiliar key phrases to handling all search queries.

And why wouldn’t they? After all, RankBrain appears to be doing a better job at improving search results than the Google engineers themselves. In fact, when it was pitted against a number of engineers to find the best page for a search query, the AI outperformed the humans by 10 percent.


Quite impressive, isn’t it?

The cool thing about RankBrain is that, in contrast to global changes to the search algorithm, it can improve search results on a per-keyword basis. This makes the SERPs more precise than before and allows for more granular improvements.

Also, ironically, even though RankBrain is a machine learning algorithm, it actually increases the influence of human users on search results. That's because the AI can use direct feedback from how users interact with your content to judge its quality. For that reason, you need to focus less on pleasing the machines (read: the algorithm) and more on actually swaying people to click on your stuff.

RankBrain is a machine learning AI system that transforms words into mathematical representations. While this is the technical definition, in practice Google uses the algorithm to improve its search results. With RankBrain, Google learns what users want and what they're looking for, so it can deliver better results. By taking into account variables such as search location and the keywords entered, Google aims to learn exactly what the user is looking for and how they are trying to reach an answer.

According to Google sources, the RankBrain algorithm is considered the third most important factor in deciding which results are shown for any search. At launch, RankBrain affected less than 25 percent of searches worldwide.

Read more When Does Google RankBrain Algorithm Influence a Query Result



How will Google Possum Algorithm change local SEO

The latest algorithm update, dubbed the Possum algorithm by SEO experts, affects local search and the appearance of business listings. It affects the rankings of both the local 3-pack and local finder results.

Many local businesses with an office address outside the main city have found it difficult to rank in local searches. The Possum update boosts the local rankings of these businesses: they can now get listed for keywords that include the city name even though their physical address sits outside the city.

Google filters out results from businesses that share the same domain name or phone number, showing only one or two of them. The latest Possum update filters even more of these businesses. A business may now fail to show up in the search results if it is located in the same building as a similar business, or if the owner is the same, even when the business names are different.

The Google Possum update made local search result filtering more sophisticated. There is no need to worry about local rankings dropping, as the update is not designed to hurt them; rankings for competitive keywords should actually rise organically.

Separation between local and organic search

Local and organic search are drifting apart with the latest update. For example, in the past, if the URL you linked to in your GMB listing was filtered organically, that would have had a negative impact on your local ranking. Since the update, this no longer seems to be the case.

Many local businesses will likely see positive results from this, while businesses without a local market might face some competition for rankings.

Location and affiliation

Google is now filtering based on address and affiliation. For businesses that fall outside city limits, ranking for keywords that include the city name used to be difficult. Since Possum, these businesses have seen a huge spike in rankings.

Conversely, this may cause rankings to drop for clients with one main GMB listing and several affiliated GMB listings.

For example, we have a client who owns a clinic with a primary location, but has separate GMB listings for individual doctors. Google isn’t necessarily removing the listing or enforcing a penalty, but is simply picking the most relevant listing and filtering out the others that are too similar – in some cases, Google is suspected to be going as far as to find an affiliation by owner, even if addresses, phone numbers, websites and Google accounts are separate.

These listings haven't disappeared, and can be viewed by zooming in further on the local finder.

The location of the user

If you're searching from a physical location different from that of the business, you're likely to see a completely different result. As a general rule of Possum: the further the distance, the lower the ranking. This is unsurprising, as many local businesses are looking to optimize for "near me" searches, which doubled from 2014 to 2015.
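To illustrate that rule of thumb, here is a small Python sketch that scores a business by its distance from the searcher. The exponential decay and the 10 km constant are assumptions made up for this example; Google has never published how Possum weights proximity.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # Earth's mean radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def proximity_score(searcher, business, decay_km=10.0):
    """Score in (0, 1] that falls off as the business gets further away (assumed formula)."""
    distance = haversine_km(*searcher, *business)
    return math.exp(-distance / decay_km)

searcher = (40.7580, -73.9855)                          # Times Square
print(proximity_score(searcher, (40.7614, -73.9776)))   # nearby business: high score
print(proximity_score(searcher, (40.6413, -74.0781)))   # distant business: much lower
```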

Read more How does Google Possum Algorithm change search results



How does Google Possum Algorithm change search results

Here are the changes the search engine giant has introduced through its Possum update:

  1. Ranking Advantage For Businesses Beyond The City Limits

Previously, it was a major pain for businesses located outside the city limits to appear in the local pack search results. A lot of establishments serving major cities from brick-and-mortar locations outside the perimeter found it enormously tough to get listed in local searches. Despite putting all their effort into highly optimized listings, they were stuck in 'ranking purgatory'.

Possum has fixed the issue by making the geographical proximity of a business a less important factor to get top ranking. Now, a business that is not within the physical limit of a city but close to it can experience a huge spike in ranking as the algorithm has been running a proximity test prior to preparing the listing.

  2. Filter Enhancement Based On Address And Affiliation

In local search, there are tons of businesses with multiple listings, which led to the same listing appearing repeatedly in search results. Until the Possum update, Google detected such duplicates based on the phone numbers provided by the businesses or their domain names.

Now, Google has enhanced its local filters by adding two new dimensions: 'address' and 'affiliation'. It essentially means that if the physical addresses of multiple listings in the same category are similar, only the 'best' and 'most relevant' one will be shown in search results, while the others are filtered out.
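As a rough illustration of that filtering logic, the Python sketch below groups listings that share an address and owner and keeps only the most relevant one. The fields and relevance scores are hypothetical; Google's actual filter criteria are not public.

```python
from collections import defaultdict

listings = [
    {"name": "Smith Dental", "address": "12 Main St", "owner": "J. Smith", "relevance": 0.9},
    {"name": "Main St Dentists", "address": "12 Main St", "owner": "J. Smith", "relevance": 0.7},
    {"name": "Bay Plumbing", "address": "4 Dock Rd", "owner": "A. Lee", "relevance": 0.8},
]

def filter_duplicates(listings):
    groups = defaultdict(list)
    for listing in listings:
        # Listings sharing the same address and owner fall into one group.
        groups[(listing["address"], listing["owner"])].append(listing)
    # Show only the most relevant listing per group; the rest are filtered out.
    return [max(group, key=lambda l: l["relevance"]) for group in groups.values()]

for shown in filter_duplicates(listings):
    print(shown["name"])  # Smith Dental, Bay Plumbing
```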

However, Possum does not penalize the other listings. Rather, it simply works like an organic search filter, pushing the other listings lower in the SERPs to ensure a better user experience.

  3. Listings Made Sensitive To The Locations Of Searchers

The geographical location of the searcher was never crucial for ranking in local search results before the launch of Possum. Earlier, it was mainly the keywords or search terms that influenced results; now, with the latest algorithm change, the physical location of the searcher matters most.

Google has decided to offer an outstanding search experience to the mobile community with the help of this location sensitivity. It now uses searchers' IP addresses to provide tailored search results, so it has become extremely important to set the right location in order to trigger the most accurate listings.

  4. Search Results Affected By Slight Keyword Variations

In the past, two searches with almost identical terms used to yield the same local results. That has changed: with Possum, Google has turned finicky about even the slightest variations in search terms.

As local listings have become extremely sensitive to keyword changes, results now vary widely between two slightly different search terms. So test and evaluate your local SEO efforts using multiple iterations of a particular key phrase, and pick the one with the maximum search volume, as in the small sketch below.
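In code, that comparison is trivial once you have volume data. A tiny sketch, assuming made-up monthly volumes that would normally come from a keyword research tool:

```python
# Hypothetical monthly search volumes for variants of one key phrase.
variants = {
    "pizzeria manhattan": 9900,
    "best pizzeria in manhattan": 4400,
    "manhattan pizza places": 2900,
}

# Pick the variant with the highest volume as the primary target phrase.
best = max(variants, key=variants.get)
print(f"Target phrase: {best} ({variants[best]} searches/month)")
```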

  5. Local Search Filter Precedence Over Organic Search Filter

Before the arrival of Possum, Google's local search filter depended on its organic search filter. As a result, many businesses linked to websites that were filtered out of the organic results were also dropped from local search results.

The algorithm change has separated the local search filter from the organic search filter so that each functions independently. Hence, businesses whose sites are filtered in the organic SERPs can now still achieve good rankings in local search results.

Read more How will Google Possum Algorithm change local SEO



What can you do when Google Fred Penalty hits your website

As we’ve seen, Google won’t tell you exactly what’s in the Fred algorithm. That’s why some SEO professionals had to make some inferences.

However, Gary Illyes did offer an important piece of advice about how to avoid getting hit with a Fred penalty.

This past April, at the AMA session during the SMX West conference, Illyes said that the answer to Fred is in the webmaster quality guidelines.

So here’s what you can do to stay in the good graces of Google:

  • Provide high-quality content – In your content marketing efforts, make sure that your articles are top-notch. If you have to, hire a professional writer to get the best quality. Also, strive for long-form content, as it tends to cover subjects more exhaustively.

Image: The Ultimate Guide to 10x Content – average content length (Google Fred Penalty)

  • Avoid “forced” backlinks – If you’re in the habit of buying backlinks from people who own their own private blog networks (PBNs), stop that practice immediately. Even “honest” guest-posting can get you into trouble if you’re forcing unnatural backlinks from low-quality sites. Instead, strive to produce quality content (see above) and let Google’s algorithm push it to the top of the SERPs.
  • Avoid excessive ads – The Fred update also hit sites with an abundance of ads. Although it's unclear how many ads is "too many" for Fred, you can use common sense. If you think people will find your site annoying because it has a couple of pop-ups, a video ad in the corner, ads in the middle of the content, and banner ads all over the page, you can be fairly certain that you have "too many" ads; a rough way to sanity-check this is sketched after the image below.

Image: Avoid creating pages with too many ads or you will be penalized by Fred (Google Fred Penalty)
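Since Fred's real thresholds were never published, any test has to be a judgment call. Here is one possible common-sense heuristic in Python; the two-ads-per-500-words ceiling is purely an assumption for illustration:

```python
def too_many_ads(ad_count: int, content_words: int, max_ads_per_500_words: int = 2) -> bool:
    """Return True when ad density exceeds a rough comfort threshold (assumed, not Google's)."""
    allowed = max(1, content_words // 500) * max_ads_per_500_words
    return ad_count > allowed

print(too_many_ads(ad_count=3, content_words=2000))  # False: long article, a few ads
print(too_many_ads(ad_count=6, content_words=400))   # True: short page stuffed with ads
```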

If some of your content lost rank because of Fred, you might not be able to fully recover it. However, going forward, you can make sure that your future content ranks well. Do that by producing quality articles, avoiding backlink spam, and showing only a few ads on your site.

In addition, make sure your website is focused and does not try to cover general topics.

Read more How to protect your website from the Google Fred Algorithm



How to protect your website from the Google Fred Algorithm

Did you notice a substantial drop in organic traffic on your site? If so, what can you do to recover from Fred, and how do you protect yourself from future content, back-link, and ad-display algorithm updates? Below are several solutions that will have a positive impact on your website's SEO.

Create More Engaging Content:

When creating new content, make sure it's relevant and of the highest quality. Don't skimp on resources for quality; hire only the best content producers, especially for the copywriting.

Write for humans, not search engines. Google's algorithms get smarter every day, and the latest Fred update is just a continuation of Google's push toward nothing but valuable content across the web, which includes good writing flow, spelling, and grammar.

Above all, content should be free of grammatical errors, fact-checked, and able to educate and entertain your readers. Remember to chunk content into smaller sections with subtitles (H tags, for you SEOs), and stay far away from "black hat" SEO techniques like keyword stuffing; this has been the norm for many years now, but we will continue to issue warnings here until we see compliance. Resistance is futile.

There also used to be a case for shorter content due to modern online attention spans, but based on studies like Searchmetrics' Ranking Factors & Rank Correlations, Google rewards websites with longer-form content. The days of 300-word blog posts are gone for those who want success.

Update Outdated Content:

Most of the websites affected by the Fred update had outdated content that raised concerns about irrelevancy, something Google obviously frowns upon.

Update your content, paying close attention to anything that no longer provides value, such as products that are no longer available or events that have passed. Along the way, optimize each piece of content with updated target keywords and calls to action. Make sure all of your lead forms are up to date, and that outbound links are not directing clicks to dead web pages.

If you must delete content, like a temporary sales page or past job postings, don't simply hit the delete button or unpublish it. The best SEO practice is to redirect it to another page through what SEOs call a 301 redirect. Point old product pages to the newest product, and old job postings to your careers page.
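How you issue a 301 depends on your stack (web server config, CMS plugin, or application code). As one minimal sketch of the idea, here it is in a small Flask app; the URL paths are hypothetical examples:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Map retired pages to their closest living replacements.
REDIRECTS = {
    "/products/widget-2015": "/products/widget",
    "/jobs/summer-intern-2016": "/careers",
}

@app.route("/<path:old_path>")
def legacy(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # 301 tells search engines the move is permanent and passes link equity on.
        return redirect(target, code=301)
    return "Not found", 404
```

In practice you would usually configure this at the web server or CMS level rather than in a catch-all route, but the mechanics are the same: old URL in, permanent redirect out.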

Check those Back-links:

Back-links are simply links from other websites to a page or post on your website. The problem is that some of the sites linking back to you are of poor quality, or worse, unknowingly display porn or other unwanted links deployed by malware or black-hat spammers. This is where it's imperative to have an SEO perform some back-link analysis and resolve any bad back-links. We suggest starting with webmaster tools (Search Console), but also using third-party tools such as Moz to identify and formally disavow bad links.
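The output of such an analysis ultimately becomes a disavow file. A hedged sketch of generating one in Python; the domains and spam scores are invented, and in reality they would come from a Search Console export or a tool like Moz:

```python
# Invented audit results; real ones come from back-link analysis tools.
backlinks = [
    {"domain": "respected-blog.com", "spam_score": 2},
    {"domain": "free-links-4u.biz", "spam_score": 88},
    {"domain": "casino-spam-links.xyz", "spam_score": 95},
]

SPAM_THRESHOLD = 60  # arbitrary cut-off for this illustration

with open("disavow.txt", "w") as f:
    f.write("# Domains disavowed after manual back-link review\n")
    for link in backlinks:
        if link["spam_score"] >= SPAM_THRESHOLD:
            # "domain:" lines are the format Google's disavow tool accepts.
            f.write(f"domain:{link['domain']}\n")
```

Always review a disavow file by hand before uploading it; disavowing a good domain throws away real link equity.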

Less Intrusive Advertising and Affiliates:

From our studies, the Google Fred algorithm update had the largest impact on content-heavy websites. Many of these types of websites, such as a news blog or product review platform, create income through advertising and affiliate marketing.

The problem is that some of these ads – you know, those annoying pop-ups that are impossible to close without losing the page you're viewing – are too intrusive. The same goes for some affiliate ads, such as the ones within the text you are reading.

The Fred algorithm penalized those sites with the most intrusive ads, which take away from user experience. Again, user experience is the main focus for Google, and always will be.

Read more Something important about Google algorithm update



Smart tips to recover your website from the Fred Algorithm update

As Google has not released any official statement regarding this update, nor given any hint about the types of links it targets, there are certain steps you can follow if your rankings have been hit by the update.

  • It is imperative to build a natural anchor-text cloud, including business and generic keywords, naked URLs, and your brand name; a quick way to check your anchor profile is sketched after this list.
  • You may have seen a number of sites ranking with sitewide dofollow links, but be cautious of such a link profile; it risks going down at any time.
  • Make sure each backlink is related to its context.
  • Avoid using automated tools or bots to create backlinks, for example, mass blog-commenting bots.
  • Always remember that the quality of links matters more than their total number.
  • If a link can be claimed easily by anyone, chances are high that it has little value.
  • Avoid paid links, link pyramids, and link exchanges.
  • You can use affiliate links, but they should not be the sole purpose of the page. Add them only where they fit your content, and avoid overdoing them.
  • Focus on users instead of keyword-stuffing strategies.
  • Including other content forms, such as GIFs and videos, should be encouraged.
  • Just because you have a chance to claim a particular link doesn't mean you have to go for it. Almost every website chases such links, and search engines don't value them. Go for high-quality links instead.
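Here is a minimal sketch of the anchor-profile check mentioned in the first point. The categories and the 30% exact-match ceiling are rule-of-thumb assumptions, not published Google thresholds:

```python
from collections import Counter

def categorise(anchor: str, brand: str, money_terms: set) -> str:
    a = anchor.lower()
    if a.startswith(("http://", "https://", "www.")):
        return "naked URL"
    if brand.lower() in a:
        return "brand"
    if any(term in a for term in money_terms):
        return "exact/partial match"
    return "generic"

# Invented anchor texts for illustration.
anchors = ["TDHSEO", "click here", "https://tdhseo.com", "seo services",
           "read this", "seo services hanoi", "tdhseo.com"]
counts = Counter(categorise(a, "tdhseo", {"seo service"}) for a in anchors)

total = sum(counts.values())
for category, n in counts.items():
    print(f"{category}: {n / total:.0%}")
if counts["exact/partial match"] / total > 0.30:  # assumed ceiling
    print("Warning: anchor profile looks over-optimised")
```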

Here are some important bullets to consider:

  • Perform a crawl analysis of your site and audit heavily through the lens of Google's core ranking updates focused on quality; a minimal starting-point script is sketched after this list.
  • Identify major problems (including UX, content, and aggressive advertising), and fix them as quickly as you can. But don’t rush the changes… make sure they are implemented flawlessly. Don’t inject more problems onto your site.
  • Read the Quality Rater Guidelines (QRG) thoroughly – and then read it again. Like I’ve said many times, it’s packed with amazing information directly from Google detailing what should be rated low and high quality. I’ve seen a serious connection between what I’m seeing in the field and what’s contained in the QRG.
  • Don’t put band-aids on the situation. Fix as many of the problems as you can. Band-aids will keep you in the gray area of Google’s quality algorithms and you might not see much upward movement during subsequent updates.
  • Work hard on publishing killer content over the long-term, building links naturally, building your brand, and improving user engagement. Again, Google needs to see significant improvement over the long term, not just over a few weeks.
  • Periodically audit your site through the lens of “quality”. Don’t make changes and never revisit the situation. That’s how problems creep in. And when enough problems creep in, sites can get hammered. Don’t let that happen.
  • You’ll need to wait for the next major core ranking update focused on quality in order to see movement. There’s not a set timeframe for those updates to roll out, although the past few have been almost exactly one month apart (1/6, 2/7, 3/7, and then 4/4). Google needs to refresh its quality algorithms, and when it does, you have a chance of recovering (or at least starting to recover). Keep your head down, work hard, and keep enhancing quality. If you do, you absolutely can recover. Don’t give up.
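For the crawl-analysis step at the top of this list, a dedicated crawler is the right tool, but even a few lines of Python can surface obvious thin-content pages. A starting-point sketch, with example.com URLs and a 300-word threshold as assumptions:

```python
import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/", "https://example.com/blog/post-1"]  # your URL list

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Count the visible words on the page and grab its title.
    words = len(soup.get_text(" ", strip=True).split())
    title = soup.title.string if soup.title else "(no title)"
    if words < 300:  # rough thin-content threshold, tune to your niche
        print(f"THIN: {url} ({words} words) - {title}")
```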

Read more Something important about Google algorithm update



How does the Google Fred Algorithm affect your website?

After Fred rolled out, it was clear that it was a significant core ranking update tied to quality. If you’ve read my previous posts about Google’s major core ranking updates, you will see many mentions of aggressive monetization, advertising overload, etc. Well, Fred seemed to pump up the volume when it comes to aggressive monetization. Many sites that were aggressively monetizing content at the expense of users got hit hard.

With that quick intro out of the way, let's hop into specific examples of impact from the Fred update.

Example 1 – Getting smoked.
A major hit based on UX barriers, aggressive monetization, and low-quality user experience.
Let's start with the dark side of Fred. Many sites got hammered by the 3/7 update, with some losing over 50% of their Google organic traffic overnight (and some losing up to 90%).

The next site I’m going to cover has dealt with Panda and major core ranking updates in the past. It’s a relatively large site, in a competitive niche, and is heavily advertising-based from a monetization standpoint.

When Fred rolled through, the site lost over 60% of its Google organic traffic overnight. And with Google representing a majority of its traffic, the site’s revenue was clearly hit hard. It’s also worth noting that Google organic traffic has dropped even further since 3/7 (and is down approximately 70% since Fred rolled out.)

Drop from Fred update.

UX Barriers Galore
If you’ve read my previous posts about Google’s major core ranking updates focused on quality, then you’ve seen many examples of what I call “low quality user engagement”. Well, this site had many problems leading up to the 3/7 update. For example, there were large video ads smack in the middle of the content, deceptive ads that looked like download buttons (which users could mistakenly click thinking they were real download buttons), flash elements that don’t load and just display a shell, and more.

And expanding on the deception point from above, there were links in the main content that looked like internal links, but instead, those links whisked users off to third party advertiser sites. Like I’ve said a thousand times before, “hell hath no fury like a user scorned”. And it looks like Fred had their back.

To add insult to injury, the site isn’t even mobile-friendly. When testing the site on a mobile device, the content is hard to see, it’s hard to navigate around the site, etc. Needless to say, this isn’t helping matters. I can only imagine the horrible signals users are sending Google about the site after visiting from the search results.

This is why it’s so important to analyze your site through the lens of these algorithm updates. Understand all “low quality user engagement” problems riddling your site, weed them out, and improve quality significantly. That’s exactly what Google’s John Mueller has been saying when asked about these updates (including Fred).

Example 2 – Tired of the gray area. Better late than never.
The next example I’m going to cover is a large-scale site driving a lot of traffic from Google organic historically. The site has been in the gray area of Google’s quality algorithms for a long time and has seen major drops and surges over time.

The company finally got tired of sitting in the gray area (which is maddening), and decided to conduct a full-blown quality audit to identify, and then fix, “low quality user engagement” problems. They began working on this in the fall of 2016 and a number of the changes did not hit the site until early 2017. In addition, there are still many changes that need to be implemented. Basically, they are in the beginning stages of fixing many quality problems.

As you can see below, the site has previously dealt with Google’s major core ranking updates focused on quality. Here’s a big hit during Phantom 2 in May of 2015, and then recovery during the November 2015 update:
Impact from Phantom 2 in May of 2015

When a site is in the gray area of Google's quality algorithms, it can see impact with each major update. For example, it might drop by 20% one update, only to surge 30% during the next. But then it might get smoked by another update, and then regain some of the losses during the one after that. I've always said that the gray area is a maddening place to live.

When Fred rolled out on 3/7/17, the site dropped by approximately 150K sessions per day from Google organic. Now, the site drives a lot of traffic so that wasn’t a massive drop for them. But it also wasn’t negligible. In addition, the site increased during the early January update, so the company was expecting more upward movement, and not less.

Here's a close-up view of the drop:

Initial drop after Fred rolls out.

It can be frustrating for site owners working hard on improving quality to see a downturn, but you need to be realistic. Even though the site has a lot of high quality content, it also has a ton of quality issues. I’ve sent numerous deliverables through to this company containing problems to address and fix. Some of those changes have been rolled out, while others have not. Currently, there are still many things to fix from a quality perspective.

Example 3 – Soaring With Fred
This final example covers an amazing case study. It's a site I've helped extensively over the years from a quality standpoint, since it had been impacted by Panda and Phantom in the past. It has surely seen its share of ups and downs based on major algorithm updates.

When Fred rolled out, it didn’t take long for the site owner to reach out to me in a state of Google excitement.

“Seeing big gains today. Traffic levels I haven’t seen in a long time… Is this going to last??”

Needless to say, I immediately dug in.

What I saw was a thing of beauty. It was a surge so strong that it would make any SEO smile. The site literally jumped 125% overnight (and this isn’t a small site with a minimal amount of traffic). Google organic surged by 110K sessions per day and has remained at that level ever since. Here are screenshots from Google Analytics and Google Search Console:

Surging after Fred rolls out.

Clicks and impressions surge after Fred rolls out.

Reviewing the audits I performed for the site revealed many low quality content problems and “low quality user engagement” barriers. And that included aggressive monetization. So the remediation plan covered an anti-Panda and anti-Phantom strategy.

Read more Something important about Google algorithm update



Something important about Google algorithm update


From time to time, Google changes its search algorithms. These updates stem from Google’s commitment to improve your experience with their search engine, to promote the most relevant content, and to penalize websites that attempt to game their system.

A Brief History of Google Algorithm Updates—and What They Mean for Your Website’s Content

Panda Update

Starting in 2011 and continuing into 2014, Google released a series of search engine algorithm changes, all of which are part of what's called the Panda update.

The differences between the individual tweaks are pretty nuanced. However, the changes all had the same goal: to decrease the ranking of thin, poorly researched, less useful content in Google search results.

What does this mean for you? The Panda update is essentially a content filter, weeding out bad, spammy content and promoting in-depth content. In order to stay on top of Google search results, you must consistently produce high-value, well-researched web content that answers your audience's questions. That content must be concise, free of grammatical errors, and, with rare exceptions, never duplicated. The sources it uses must also be credible.

Penguin Update

In 2012, Google released the Penguin update to address the widespread overuse of keywords and links. Prior to this update, sites could boost their search result ranking by building large numbers of links from other websites regardless of quality. They could also manipulate the search engine through keyword stuffing, i.e. adding so many keywords that the text sounds unnatural and provides no real value to the reader.

How can this update influence how you produce content for your website? The Penguin update rewards natural link building to credible, relevant sources as well as the strategic placement of relevant keywords. If you use great sources in your in-depth content, you will see your Google search results improve. If you use appropriate keywords and do not overuse those keywords, you will see your Google search results improve even more.
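Keyword stuffing is easy to detect even with a crude script. A simple density check in that spirit; the 3% ceiling is a common rule of thumb used here for illustration, not a figure Google has published:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the text's words accounted for by the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return (hits * len(keyword.split())) / max(len(words), 1)

text = ("Our plumbing service offers fast plumbing repairs. "
        "Call our plumbing team for plumbing emergencies today.")
density = keyword_density(text, "plumbing")
print(f"Keyword density: {density:.1%}")
if density > 0.03:  # assumed rule-of-thumb ceiling
    print("Likely keyword stuffing - rewrite for natural language")
```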

One more important consideration: pay attention to which sites are linking to yours. Because of the Penguin update, your site can be penalized if spammy sites build links to it. Monitor the sites that link to yours and, when they do not appear credible, use Google’s Disavow Backlinks tool to sever the link.

Hummingbird Update

In 2013, Google engineers recognized that many search queries contained conversational phrases and questions. To capture what users intended to search for, rather than only matching specific keywords, they released the Hummingbird update.

Due to Hummingbird's focus on conversational language, it rewards long-tail keywords that match the user's intended search. For example, in the post-Hummingbird era, content with the keyword "how to knit a sweater" would perform better than content with the keyword "sweater knitting."

As a result, how-to articles and other tutorial-style web content have become more prevalent and users have been able to locate informative content more easily. Unlike the Panda and Penguin updates, the Hummingbird update does not focus on penalization or utilize specific methods to decrease search rankings.

Pigeon Update

In 2014, Google effectively combined search queries with Google Maps to offer more localized search results.

How did this update change search queries? Users can now type in "what is the best pizzeria in Manhattan," "best pizzeria Manhattan," or, if they're currently in Manhattan, just "pizzeria," and the search engine will list local pizzerias in Manhattan. The results also include each restaurant's location embedded on Google Maps, along with its address, operating hours, and phone number.

The Pigeon update, unlike other algorithm updates, doesn’t require you to change how you produce content for your site or how you incorporate keywords. The update also doesn’t penalize your site.

Fred Update

In March 2017, Google dropped its latest algorithm change, the Fred Update, without releasing any information about the update’s intended goals.

A week after the update, SEO consultant Barry Schwartz analyzed which sites benefited from Fred and which got hit hard. He identified that revenue-driven sites with significant amounts of ad placement lost between 50 and 90% of their traffic from Google searches.

How does this influence your site? Be cautious about the number of ads you feature on each page of your website, and make sure that your content focuses on providing value to your readership instead of simply generating revenue for your business.

Read more How Often Does Google Update Its Algorithm?


Please contact us for SEO service packages at TDHSEO.COM.


Skype: tdhseo

Thank you!