Category Archives: WEBSITE MANAGEMENT


What is the target of the Google Penguin Algorithm?

In September 2016, Google’s latest algorithm update indiscriminately affected online brands’ web traffic, again. Without a well-planned SEO strategy, brands have had a hell of a time ranking in search engines and effectively promoting themselves online. Let’s explore why.

Unlike its 3.0 predecessor, this update is in reality the next generation of the Penguin algorithm. For starters, the 4.0 release saw Penguin become part of Google’s core algorithm. As a result of this, Google said that it would no longer be announcing any further updates to Penguin.

The second piece of news was equally momentous, if not more so. With the rollout of Penguin 4.0, Google announced that henceforth Penguin would be live, reevaluating sites in real time as they are re-indexed. This is huge, because it means that if your site is impacted by Penguin, you no longer need to wait months or even years for the next Penguin update in order to bounce back. What’s more, Google later revealed that the new algorithm does not issue a penalty per se, but rather devalues the spammy links. The devaluation of links is more the lack of a positive than the attribution of a negative. Think of it like this: links are good, and they increase your ranking when done properly. Should they be “spammy links,” however, Google will simply ignore them, and your page will not get the added value of those links. This is in contradistinction to the modus operandi of previous versions of Penguin, which actually demoted the ranking of a page that contained spammy links.

Google’s Penguin hunts for spammy backlinks

Gaining and maintaining organic backlinks is one of SEO’s primary jobs. So what does that mean?

Organic backlinks are unpurchased, unsolicited links pointing to a website from other sites, social media apps, and blogs. Organic, or natural, links are one of the top three ranking factors that Google Search takes into consideration. Unnatural or spammy backlinks are links purchased in bulk to artificially boost traffic to a specific website. Before 2012, the more unnatural backlinks your SEO specialist could buy, the more authority and traffic your website would have. SEO guys and gals who purchase backlinks today are considered Black Hat SEO specialists–also known as spammers.

The initial Penguin algorithm update, released in April 2012, made purchasing backlinks a violation of Google’s Webmaster Guidelines and, for the first time ever, punished it with a ranking penalty. If your brand isn’t monstrous but your site has thousands of backlinks, you should check with your webmaster. Chances are many of those backlinks are spammy AF. Pre-2012, it was not uncommon for a local mom-and-pop shop to have tens of thousands of backlinks from all over the world. That’s a red flag: the Penguin update will hit you with a penalty, and there goes your site’s traffic.

The evolution of Google Penguin Algorithm

With the release of Penguin 4.0, the algorithm has in a sense completed the evolutionary cycle. It has certainly come a long way from its original construct, skimming for link spam on the homepage. In fact, even the sort of tense relationship between the algorithm and the SEO community has in many ways been healed as Penguin completed its evolution.

Penguin Evolution

Legitimate sites hit with a Penguin penalty no longer have to wait (in the case of the previous update, for years) to recover. As a result, you can make the case that the most interesting and dynamic aspect of Penguin’s progression has not been technological but sociological – in its most modern form the algorithm balances technological need with community consensus.

Read more How Does Google Penguin Algorithm Work

_______________________________________________________________________________

Please contact us for SEO service packages at TDHSEO.COM.

TDHSEO Team

Email: tdhseo@gmail.com
Skype: tdhseo
https://www.facebook.com/tdhseo

Thank you!



Something important about the Pirate Algorithm Update

Google first released the Pirate algorithm in August 2012 and rolled out a second major update to the algorithm in October 2014. The algorithm focuses on tackling the huge problem of online copyright infringement. Websites that have been reported for copyright infringement and have received large numbers of web page removal notices are penalised, making it less of a challenge for Google users to find quality, legitimate information.

This algorithm has been designed around data received from website users and owners, who have reported websites for using content that falls into the category of infringement. However, it is important to consider that while some web pages may be demoted for strong complaint signals, Google cannot remove said content unless a valid copyright removal request is received from the owner of the copyrighted material or the holder of the rights.

Google recently made a change to their algorithm designed to demote major sites hosting pirated content, but the algorithm has a secondary side effect – bringing more visibility to smaller torrent sites that had been previously buried in the search results due to the more popular sites ranking so well.

The new algo takes the number of legitimate DMCAs filed against a site into account when ranking sites, with more DMCAs resulting in a lower ranking.  However, smaller sites hosting pirated content that weren’t at the top of the search results were not impacted as much as other more popular sites, as many companies focus only on the top ranking pirate sites when filing DMCAs.  This change could mean that companies will have to not only focus on the top ranking sites, but also the ones multiple pages deep in the search results to prevent them from showing up at the top later.
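To make the counting idea concrete, here is a toy Python sketch of a DMCA-count-based demotion. The function names and the weight are our own invention for illustration; Google’s actual formula is not public.

```python
def demotion_factor(dmca_count, weight=0.01):
    """Toy model: each valid DMCA notice shaves a little off a
    site's ranking score. NOT Google's real formula."""
    return 1.0 / (1.0 + weight * dmca_count)

def adjusted_score(base_score, dmca_count):
    """Apply the demotion factor to a base relevance score."""
    return base_score * demotion_factor(dmca_count)

# A heavily reported site ends up scored below a cleaner one,
# even if its base relevance score was higher.
heavily_reported = adjusted_score(100, 500)  # ~16.7
lightly_reported = adjusted_score(90, 3)     # ~87.4
```

The hypothetical numbers only illustrate the point in the paragraph above: more legitimate DMCAs push a site down, so a popular pirate site can sink below smaller ones that attracted fewer notices.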

Google will notify a website owner of a content infringement via a DMCA take-down notice, which is sent to the website owner’s Webmaster Tools account. A DMCA notice will include details of the infringing URL(s) or, in some cases, a whole website.

Google provides a full transparency list of websites that have been reported for content infringements, which can be viewed here: Transparency Report

All websites that are reported to Google will be documented, allowing said data to be analyzed and used to configure the Pirate algorithm. A website with a high number of removal reports will usually be demoted on all searches across the Google search engine.

Google has kept making changes to its search engine algorithms to demote the most egregious pirate sites. One of the most notable changes is an improved effort to make such sites less visible in search results, meaning they will not appear in the initial search pages. Since 2012, Google has been running a down-ranking system, but copyright industry groups such as the RIAA and MPAA have reported that it lacks effectiveness.

Just last week, Google announced an improved version that aims to address this criticism. With the updated version of its ‘How Google Fights Piracy’ report – originally introduced in 2013 in response to claims by film and music copyright holders – Google gives an overview of all its efforts to fight piracy and highlights the responsibility of copyright holders to make content legally available. The 26-page report delineates the following anti-piracy principles of Google:

  • Defense against abuse
  • Generation of better and more legitimate alternatives to keep piracy at bay
  • Provision of transparency
  • Funds monitoring, as Google believes that the most effective way to fight against online pirates is to reduce their money supply while prohibiting rogue sites from its ad and payment services
  • Guarantee of effectiveness, efficiency, and scalability

Because this filter is regularly updated, formerly affected sites can escape it once they have rectified their mistakes or made improvements. At the same time, the filter is also capable of catching new sites that managed to escape before, as well as releasing ‘falsely caught’ sites.

The update works like other updates such as Penguin: it processes all sites to catch any that appear to be in violation. Once caught, a site is stuck with a downgrade until it receives fewer or no complaints and can get back into the race. However, since its introduction the filter had never been rerun, which means real pirate sites, along with new violators from those two years, escaped the punishment they deserved. This is perhaps what finally pushed Google to update its Pirate filter after two years!

Read more Who is affected by the Google Pirate Algorithm




Who is affected by the Google Pirate Algorithm

When Google’s Pirate update – created to penalize pirate sites – was released back in 2012, everybody stopped. No one was really sure what to do. How severe was the punishment for those running pirate sites? Were there really any lessons to be learned by other SEOs and marketers?

Every time Google rolls out a brand new algorithm update, it seems that quite a few people in the SEO world go crazy. But it reminds me of being a student in high school all over again. Every time there’s a new big project for a class, or a surprise test announced, everybody loses their cool. But in reality, we know that Google is doing all this for the best, just like those teachers in high school. All they want is for us to do things better: create better content, use fewer keywords, create higher quality links, and so forth.

The Sites At Risk

According to Google’s transparency report, most of the sites at the top of the copyright notice pile are file sharing sites. This list includes cyberlockers with files for download, search sites widely used to find infringing material, BitTorrent sites and community sites for swapping files. There are very few blogs, legitimate forums or other non-piracy oriented sites on the list.

While this means that legitimate sites that don’t specialize in pirated content aren’t likely to get bit, it also means spam blogs, plagiarist sites, and nefarious content farms are not on the list either. However, those are typically addressed and filtered out by other methods.

In short, the sites most at risk are the ones that are in the crosshairs of the major copyright holders as they are the ones sending off the most DMCA notices and racking up the most “points” against the domains they’re dealing with.

Still, this isn’t necessarily a guarantee that more notices equals greater penalty. Google also tracks how many of the total URLs have been reported and all of the sites at the top of the list have had less than 5% of their URLs involved, most less than 1%.

Depending on how Google approaches this penalty, it may be possible for a site with fewer URLs involved but a higher percentage to receive a stiffer penalty.
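A percentage-based view like this can be sketched in a few lines of Python. The site sizes and notice counts below are hypothetical, purely to illustrate the contrast the paragraph describes:

```python
def reported_fraction(reported_urls, total_urls):
    """Fraction of a site's indexed URLs named in removal notices."""
    return reported_urls / total_urls

# Hypothetical sites: a huge file-sharing site with many notices but a
# low fraction, vs. a small site with fewer notices but a high fraction.
big_site = reported_fraction(40_000, 2_000_000)  # 2% of URLs reported
small_site = reported_fraction(800, 4_000)       # 20% of URLs reported

# Under a percentage-based penalty the small site looks worse, despite
# receiving far fewer notices in absolute terms.
```

Whether Google weights raw counts, percentages, or some blend of both is not public; the sketch only shows why the two measures can point in opposite directions.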

Overall, it’s safe to say that Google has removed at least 300 million illegal download URLs over the past several years. While this seems like an astronomical number, it has yet to satisfy either the MPAA or the RIAA.

However, reports documenting the Pirate update rolled out in the week of October 20 indicate that heavier penalties are hitting torrent websites.

A week has gone by since Google rolled out the latest Pirate update, and some massive torrent players have reportedly walked the plank. The Pirate Bay, one of the main torrent websites for illegal media downloads, lost roughly 48% of its Google visibility, according to an early analysis by Search Metrics. However, TorrentFreak reports that this major drop doesn’t faze The Pirate Bay, claiming that it does not receive much traffic from Google. The Pirate Bay depends far more on direct traffic from people looking for torrents; for the most part, its target market already knows it exists.

Other hard-hit torrent websites in Search Metrics’ analysis include free-tv-video.me, move4k.to, mp3skull.com, myfreemp3.cc, and kickass.to. There are a total of 30 websites included in the list, with keywords ranging from picture downloads to watching movies for free. As of now, these thirty websites have essentially fallen off Google’s radar. This all sounds great, and if these reports are accurate, Google is on the right track.

Read more Something important about Pirate Algorithm Update




Important knowledge about the Google Hummingbird Algorithm

Google Hummingbird was an updated algorithm released by the search engine in 2013. The goal of Hummingbird was to help Google better understand semantic search. Rather than matching words in a query to words on websites, the Google algorithm now strives to understand the meaning behind the words so that it can provide results based upon the searcher’s intent. This helps to improve the quality of the results because users can be matched to pages that might better answer their query, even if websites use slightly different language to describe the topic at hand.

Since 1998 Google had used the same core algorithm, upgrading it iteratively every year. With Hummingbird, Google replaced its entire algorithm with a new version.

Google Hummingbird is actually a project that seeks to improve the Google search engine experience for users by going beyond keyword focus and instead taking into account the context and all of the content in the entire search phrase to provide more of a natural-language, or conversational, approach to search queries.

Unlike two similar projects from Google, Google Panda and Penguin, which both serve as updates for Google’s existing search algorithm engine, Google Hummingbird introduces a completely new search algorithm that is estimated to affect more than 90% of all Google searches.

Google wants to process “real” speech patterns

Having the best platform for processing conversational queries is an important part of that, and that’s where Hummingbird fits in, though it’s just the beginning of a long process.

Think of Google’s Hummingbird algorithm as a two-year-old child. So far it’s learned a few very basic concepts.

These concepts represent building blocks, and it is now possible to teach it even more concepts going forward. It appears that a lot of this learning is derived from the rich array of information that Google has on all search queries done on the web, including the query sequences.

For example, consider the following query sequence, starting with the user asking “give me some pictures of the transamerica building”.

The user looks at those results, and then decides to ask the next question, “how tall is it”.

Note that the latter query recognizes the word “it” as referring to the Transamerica Building because that was identified in the prior query. This is part of the sophistication of natural language queries.
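A heavily simplified sketch of this kind of pronoun resolution, assuming only a single remembered entity from the prior query, might look like:

```python
def resolve_followup(query, last_entity):
    """Naive coreference: substitute the entity identified in the
    previous query for a bare pronoun in the follow-up query.
    Real systems track many entities and much richer context."""
    tokens = query.split()
    resolved = [last_entity if t.lower() == "it" else t for t in tokens]
    return " ".join(resolved)

resolve_followup("how tall is it", "the Transamerica Building")
# -> "how tall is the Transamerica Building"
```

This toy version only handles one pronoun and one remembered entity; the point is simply that the follow-up query becomes answerable once context from the prior query is carried forward.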

Another example is the notion of comparison queries. Consider the query “pomegranate vs cranberry juice”.

The Knowledge Graph

These examples involve Google’s Knowledge Graph, where natural language search benefits from the ability to pull real-time answers to queries that understand the specific context of the query.

Note that the Knowledge Graph has accepted some forms of conversational queries for a while, but a big part of Hummingbird was about expanding this capability to the rest of Google search.

I have seen people argue about whether or not Hummingbird was just a front end translator for search queries, or whether it is really about understanding more complex types of user intent.

The practical examples we have now may behave more like the former, but make no mistake that Google wants to be able to do the latter as well.

The mind reading algorithm

Google wants to understand what is on your mind before it’s even on your mind.

Consider Google Now as ultimately being part of this mix. Imagine being able to have Google address search queries like these:

  1. Where do I find someone that can install my surround sound system?
  2. What year did the Sox lose that one-game playoff?
  3. What are the predictions for the price of gas next summer?
  4. What time is my dinner on Tuesday night, where is it, and how do I get there?

No, these queries will not work right now, but it gives you some idea of where this is all headed.

These all require quite a bit of semantic analysis, as well as pulling in additional information including your personal context.

The 4th question I added was to show that Google is not likely to care if the search is happening across web sites, in your address book, or both. Not all of this is Hummingbird, per se, but it is all part of the larger landscape.

To give you an idea on how long this has taken to build, Google’s Amit Singhal first filed a patent called Search queries improved based on query semantic information in March of 2003. In short, development of this technology has taken a very long time, and is a very big deal.

Read more The influence of Google Hummingbird Algorithm on SEO




What exactly is the Google Pigeon Algorithm?

Launched: July 2014

It aims to affect the ranking of local listings in a search. This algorithm will also affect Google Maps along with Google search. It currently is working in US English results and soon will be working with other languages. It is aimed to improve results using the searcher’s location.

It gives preference to local websites and has a dramatic effect on local businesses. However, it got mixed responses, with critics saying that some rankings will decrease as a result of this algorithm. It uses the location and distance of a user as its key signal. This way, local directory listings get preference over other websites.

It was a significant change to how Google ranked and ordered the local search results both for local queries in Google search, and within Google Maps.

Indeed, some are referring to Pigeon as the biggest Google update to the local search results since the Venice update in 2012. Google said the latest update would make local search more closely mimic traditional organic rankings.

Early reports showed consistent feedback that specific queries and sectors had been impacted, like real estate, and that directories were now being favored in the results above local businesses (possibly due to the authority of a directory site like Yelp over a local business’s site).

So what exactly is “Pigeon”?

  • The main reason for this update is to provide local search results that are more relevant and accurate.
  • Pigeon isn’t trying to clean up the SERPs from low-quality content like its predecessors did.
  • The foundation of the change is within the local search ranking algorithm.
  • Currently, for all we know, this update only affects US English results.

Read more How could Google Pigeon affect websites and SEO




What to know about Google Mobile Friendly Algorithm Update

One of the biggest worries about “mobilegeddon” was that Google was going to start ranking mobile-friendly websites higher on their search results pages, leaving non-mobile-friendly websites behind in the dust. This was expected to be particularly damaging to smaller businesses that didn’t have the resources to create a responsive website design or a separate mobile version of their website.

What Is Mobile Friendly

A web page is tagged eligible for ranking as mobile friendly if it meets the following criteria, as detected in real time by Googlebot.

  • Avoids software that is not common on mobile devices
  • Utilises text that can be read without zooming
  • Sizes content to the screen so users don’t have to zoom or scroll horizontally
  • Spaces links far enough apart so that the chosen one can be tapped easily
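As a rough illustration, one related signal – a responsive viewport meta tag – can be approximated in a few lines of Python. This regex is only a toy heuristic of our own; Googlebot’s real detection covers far more (software compatibility, font sizes, tap-target spacing):

```python
import re

def looks_mobile_friendly(html):
    """Crude check for one mobile-friendly signal: a viewport meta
    tag declaring width=device-width. NOT Googlebot's actual test."""
    pattern = r'<meta[^>]+name=["\']viewport["\'][^>]+width=device-width'
    return re.search(pattern, html, re.IGNORECASE) is not None

responsive = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
legacy = '<head><title>Desktop-only page</title></head>'
```

A page passing this toy check merely declares a responsive viewport; the criteria in the list above still have to be met for Google to tag it mobile friendly.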

This isn’t exactly what the Google mobile-friendly algorithm update does. Yes, the algorithm does rank mobile-friendly websites higher now – but only for mobile searches, which – once you think about it – is completely logical.

Websites that don’t target mobile users – for example, websites that publish long, in-depth research papers – aren’t going to be punished. Their ranking will remain the same for searches done via desktop or laptop. However, their ranking will drop for searches done on mobile devices. This won’t affect websites whose audiences aren’t searching for them on mobile devices.

What Happens If Your Site Is Not Mobile Friendly?

If your site is not mobile friendly and is difficult to use on handheld devices, Google will penalise it and rank it lower in its mobile search results. To give publishers even more of an incentive to offer mobile friendly pages, Google announced that in May 2016 it would increase the importance of having mobile pages, and that sites that are not mobile friendly would rank even lower than before.

Google stated that when it first introduced this as a ranking criterion last year, the premise was to give mobile users a better search experience. Most publishers and site owners have taken heed of this notification and developed their sites accordingly.

It is without doubt that mobile searches are now a high proportion of traffic on any site and will only increase in volume. It will only be a matter of time before mobile use overtakes any other type of search and it will be imperative for the success of any site that it is mobile friendly.

If your website is not optimised for mobile devices it will no longer appear in the top Google search results for mobile, meaning low visibility among mobile users and a reduction in overall website traffic. With 25% of all searches coming from mobile devices, this loss in traffic could seriously limit the opportunities you have to reach and engage your potential customers. To ensure you’re still seen, consider optimising your website with a responsive design solution. Responsive design allows you to build or retrofit one core web solution, which then adapts based on the device being used, whether that is desktop, mobile or tablet. It is the ultimate in usability, enabling you to deliver content from one source to users on any device. With mobile browsing only set to increase, and Google now taking action to accommodate this shift, businesses would be wise to follow suit.

How the update impacts search

The update will impact search results in two key ways:

1. More mobile-friendly results will display in search.
This means that sites optimised for mobile devices will be prioritised in search results. This is a significant change, and will affect mobile searches worldwide, and across all languages.
2. More relevant app content will display in search results. 
Google have already started to consider content from indexed apps as a ranking factor via the introduction of App Indexing. This means signed-in users who have a particular app installed will receive relevant app content more prominently in search.

Previous updates have pushed towards mobile compliance by encouraging sites to be correctly configured and accessible via modern devices. As over half of all search is already conducted via mobile, and with the continued rise of smartwatches and tablets, mobile compliance has never been more important.

Read more How to test Mobile Friendly Compatibility for your website




When Does the Google RankBrain Algorithm Influence a Query Result?

RankBrain uses a set of databases of people, places, and things (also called entities) to inform the algorithm and its automatic learning processes.

These words (queries) are then decomposed into word vectors using a mathematical formula to give these words an “address”. Similar words have similar “directions”.

When Google processes an unknown query, those math relationships are used to better match the query and return multiple related results.
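The “similar words have similar addresses” idea is usually measured with cosine similarity between vectors. Here is a minimal sketch with made-up three-dimensional vectors; real systems learn hundreds of dimensions from text:

```python
import math

def cosine(u, v):
    """Cosine similarity: 1.0 when two vectors point the same way,
    near 0 when they are unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 3-d "addresses" for words, invented for illustration.
vec = {
    "boat":   [0.9, 0.1, 0.0],
    "ship":   [0.85, 0.15, 0.05],
    "carrot": [0.0, 0.2, 0.95],
}

# Words with similar meanings point in similar directions.
assert cosine(vec["boat"], vec["ship"]) > cosine(vec["boat"], vec["carrot"])
```

This is how an unknown query term can be related to familiar ones: its vector simply lands near the vectors of words that mean something similar.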

Over time, Google refines results based on user interaction and machine learning to improve the match between users’ search intent and search results returned by Google.

It is important to note that stop words such as “and” or “the”, which search engines used to throw away, are included in the RankBrain analysis. RankBrain is also designed to help understand queries with negation, such as queries that use words like “without” or “not”, in order to return the best results.

Short of actually being able to read a website’s content, RankBrain is able to identify the context of keywords in a webpage or website.

By doing this it is able to provide search results based on a user’s ‘true’ intent (as opposed to blindly matching those websites that just contain the words that you typed).

It interprets your language and queries – whether you use formal or colloquial terms – then relates them to other similar searches based on previous intent and results. This will then give you the closest results to what you meant by your query.

Borrowing Google’s own example, the query “what is the label of a consumer at the highest level of a food chain?” sounds like gibberish to anyone but the user.

With RankBrain, however, Google can make a guess as to what these unfamiliar words mean.

This then allows Google to interpret the query that matched the user’s intent, providing results that detail, in this instance, where a consumer fits in the food chain.

When the Google RankBrain Algorithm Influences a Query Result

RankBrain addresses queries in all languages and in all countries.

RankBrain is most involved when the query is unique or previously unseen.

For example, prior to Google RankBrain’s announcement, I wrote an article about something that I observed during my own research on Google.

It started when I was looking for information on water rights in Nevada during the drought in California. (We share a river with them.) When I searched for water rights in Clark County or Las Vegas, Google had plenty of information on the topic. However, when I searched for the water rights of Mesquite, NV (a city about 90 km north), I got the water authority and nothing connected to water usage rights. Instead I got results about mesquite trees, mesquite wood, mesquite barbecue chips, and so on.

At that time I did not know what this behavior was called, only that it existed. We know now, thanks to the RankBrain announcement.

Why? Since Google did not know how the “thing or place” mesquite related to “water rights”, it sent back a “kitchen sink” of results.

The idea behind this “kitchen sink” of best guesses is that, over time, Google will find out which option best suits the query.

If you’ve been searching for a long time, you may remember doing a search and having Google show you results for slightly different words than the ones you typed. That behavior was the predecessor of RankBrain.

Read more Clever ways to optimize your website for RankBrain




The effect of the Google RankBrain Algorithm on your website

The most important thing for the RankBrain algorithm is content that is created for users and provides real information. Therefore, you should pay closer attention to your content and focus on information rather than inflating your text with empty sentences.

The RankBrain algorithm wants to better understand your content. For this reason, it is important to help Google when reviewing content on your website.

Instead of concentrating on a single keyword, provide users with comprehensive content, citing different sources. In addition, you can repeat your keywords several times in your text to rank higher on Google.

First Assumption – User Behavior is Shifting and will Shift even Further

Since search results are becoming better and better in terms of relevance to the user, a top-3 spot – whether that spot is a local listing, a knowledge graph listing, etc. – will get more clicks from users now than ever, so long as it’s an organic listing and not a paid one (paid listings carry an obvious yellow button-like label that says ‘Ad’).

The top 3 spots will eat up the clicks of the rest of the SERP listings. That’s because people are getting so satisfied with the results in the top 3 spots that we have a natural tendency to check out only the top 3, almost sure that we’ll be satisfied with the results.

Second Assumption – Competition is Going to get Tighter

Search took a leap up the competitive ladder. What RankBrain really did is prioritize meaningful results. “Strings to things” isn’t all that friendly to those playing the search game primarily for traffic. Articles that are less comprehensive than their 10x counterparts will drop in rankings. Only the best will get ranked – all the other mediocre results will start to fall off.

Search is a zero-sum game. It’s always been. And just when we thought the game wasn’t getting any harder, it bites us back in the ass.

Third Assumption – Machine Learning will Crush Spam and Black Hat Practices

We can conclude that this is the beginning of numerous years of machine learning being incorporated into Google (because it worked better than they thought it would). Considering the direction of Google has been to fight spam and close up loopholes (we’ve felt the effect of this quite strongly since 2010), it’s a surprise that RankBrain is not an algorithm that targets black hat tactics.

That being said, I think that the next machine-learning algorithm Google will launch after RankBrain would deal strongly with fighting spam and loopholes. If RankBrain worked better than they expected, I’m quite sure that they will use the positive result in shutting down spam and black hat problems.

Fourth Assumption – You can affect RankBrain

Google feeds RankBrain ‘offline’ data, meaning it does not learn from the live web in real time. Google feeds it only the data Google thinks is good enough. So coining terms that spread, such as ‘Growth Hacking’ or ‘Inbound Marketing’ or ‘Link Earning’, could actually signal that you are an authority on that term and concept.

If this is fed to RankBrain and it recognizes you as the source of the term, that could turn out to be a positive signal for your site and all that is related to it. It’s not easy, but it’s definitely something that I assume could affect the algorithm as it is fed.

Read more What is Google RankBrain and How Does it Work?



  • -

What is Google RankBrain and How Does it Work?

RankBrain is an artificial intelligence (AI) program used to help process Google search queries. It is designed as a machine-learning system so that it can embed vast amounts of written language into mathematical entities (vectors) that the computer can understand.

RankBrain is unique because, when it sees an unfamiliar word or phrase, it uses AI to find known terms with a similar meaning to what the user has typed. It then filters the results accordingly.
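The vector idea can be illustrated with a toy sketch. This is not Google's actual system: the vectors, the example terms, and the `closest_known_term` helper are all invented for illustration, and real embeddings are high-dimensional and proprietary. The point is only that an unfamiliar query's vector can be matched to the closest known term by comparing directions (cosine similarity):

```python
import math

# Toy word vectors (hypothetical values for illustration only --
# Google's real embeddings are high-dimensional and proprietary).
vectors = {
    "sneakers":      [0.90, 0.10, 0.30],
    "running shoes": [0.85, 0.15, 0.35],
    "laptop":        [0.10, 0.90, 0.20],
}

def cosine_similarity(a, b):
    # 1.0 means the vectors point in the same direction (similar meaning).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def closest_known_term(query_vec, known):
    # Pick the known term whose vector is most similar to the query's.
    return max(known, key=lambda term: cosine_similarity(query_vec, known[term]))

# An unfamiliar query whose (made-up) vector lands near "sneakers":
unknown_query_vec = [0.88, 0.12, 0.32]
print(closest_known_term(unknown_query_vec, vectors))  # -> sneakers
```

Under this sketch, a query Google has never seen before can still be answered with results for the semantically closest query it already understands.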


Google has taken this exceptional initiative to give its users only the most useful results for their search queries. Google’s goal is to offer searchers the most appropriate and relevant content. According to Google, RankBrain offers different results in different countries for the same query. This is because the measurements in each country are different, despite the similar names.

Does Google RankBrain Work?

Is the AI doing a good job? So far, it seems so. After all, Google promoted it from processing only a portion of unfamiliar key phrases to being used on all search queries.

And why wouldn’t they? After all, RankBrain appears to be doing a better job at improving search results than the Google engineers themselves. In fact, when it was pitted against a number of engineers to find the best page for a search query, the AI outperformed the humans by 10 percent.

 

Quite impressive, isn’t it?

The cool thing about RankBrain is that, in contrast to global changes to the search algorithm, it can improve search results on a per-keyword basis. This makes the SERPs more precise than before and allows for more granular improvements.

Also, ironically, even though RankBrain is a machine learning algorithm, it actually increases the influence of human users on search results. That’s because the AI can use direct feedback from how users interact with your content to judge its quality. For that reason, you need to focus less on pleasing the machines (read algorithm) and more on actually swaying people to click on your stuff.

RankBrain is a machine-learning artificial intelligence system that transforms words into mathematical representations. While this is the technical definition, in practice Google uses the algorithm to improve its search results. With RankBrain, Google learns what users want and what they’re looking for, in order to deliver better results. By taking variables such as the search location and the keywords entered into account, Google aims to learn exactly what the user is looking for and how they try to reach a conclusion.

According to Google sources, the RankBrain algorithm is considered the third most important factor in evaluating which results should be shown for any search. Early on, RankBrain affected less than 25 per cent of searches worldwide.

Read more When Does Google RankBrain Algorithm Influence a Query Result



  • -

How will Google Possum Algorithm change local SEO

The latest algorithm update, dubbed the Possum algorithm by SEO experts, affects local search and the appearance of business listings. The rankings of the local 3-pack and local finder results are affected by the algorithm.

Many local businesses that have an office address outside the main city found it difficult to rank in local searches. The Possum update boosts the local rankings of these businesses. This is good news for them, as they can now rank for keywords that include the city name for their services, even though their physical address is outside the city.

Google filters search results from businesses that share the same domain name or phone number, showing only one or two of them. The Possum update filters even more such businesses: a business may now fail to show up in the search results if it is located in the same building as a competitor, or if it shares an owner with another listing, even when the business names are different.
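The filtering described above can be sketched as a simple deduplication pass. This is a hypothetical illustration, not Google's implementation: the field names, relevance scores, and the choice of phone number plus address as the grouping key are all assumptions made for the example.

```python
# Hypothetical Possum-style filtering: listings that share a phone number
# and address collapse into one group, and only the highest-relevance
# listing in each group is shown. All data here is invented.
listings = [
    {"name": "Smith Dental",       "phone": "555-0100", "address": "1 Main St", "relevance": 0.9},
    {"name": "Smith Orthodontics", "phone": "555-0100", "address": "1 Main St", "relevance": 0.7},
    {"name": "Valley Bakery",      "phone": "555-0199", "address": "9 Oak Ave", "relevance": 0.8},
]

def filter_duplicates(listings):
    best = {}
    for listing in listings:
        # Listings sharing contact details are treated as one business.
        key = (listing["phone"], listing["address"])
        if key not in best or listing["relevance"] > best[key]["relevance"]:
            best[key] = listing
    return list(best.values())

for shown in filter_duplicates(listings):
    print(shown["name"])  # Smith Dental and Valley Bakery survive the filter
```

In this sketch, "Smith Orthodontics" is filtered out not because it is penalized, but because a more relevant listing with the same contact details already occupies its slot, which mirrors how Possum is described as picking the most relevant listing and hiding the rest.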

The Google Possum update made local search result filtering more sophisticated. For most businesses there is no need to worry about a drop in local rankings, as the update is not designed to hurt them; rankings for competitive keywords may actually rise organically.

Separation between local and organic search

Local and organic search are drifting apart with the latest update. For example, in the past if the URL you were linking to in your GMB listing was filtered organically, it would have had a negative impact. Since the update, this no longer seems to be the case.

Many local businesses will likely see positive results from this, while businesses without a local market might face some competition for rankings.

Location and affiliation

Google is now filtering based on address and affiliation. For businesses that fall outside city limits, ranking for keywords that include that city name used to be difficult. Since Possum, these businesses have seen a huge spike in rankings.

Conversely, this may cause rankings to drop for clients with one main GMB listing and several affiliated GMB listings.

For example, we have a client who owns a clinic with a primary location, but has separate GMB listings for individual doctors. Google isn’t necessarily removing the listing or enforcing a penalty, but is simply picking the most relevant listing and filtering out the others that are too similar – in some cases, Google is suspected to be going as far as to find an affiliation by owner, even if addresses, phone numbers, websites and Google accounts are separate.

These listings haven’t disappeared, and can be viewed by zooming in further on the local finder.

The location of the user

If you’re searching from a different physical location other than that of the business, you’re likely going to encounter a completely different result. As a general rule of Possum: the further the distance, the lower the ranking. This is unsurprising as many local businesses are looking to optimize for “near me” searches, which doubled from 2014 to 2015.
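The rule of thumb "the further the distance, the lower the ranking" can be sketched as a simple decay function. The shape of the curve and the decay constant here are invented for illustration; Google's actual distance weighting is not public.

```python
import math

# Hypothetical distance-decay score: the same business scores lower the
# further away the searcher is. The decay constant is an assumption.
def local_score(base_relevance, distance_km, decay_per_km=0.1):
    return base_relevance * math.exp(-decay_per_km * distance_km)

nearby  = local_score(1.0, distance_km=2)   # searcher 2 km from the business
faraway = local_score(1.0, distance_km=40)  # searcher 40 km from the business
print(round(nearby, 2), round(faraway, 2))  # -> 0.82 0.02
```

Whatever the real function looks like, the observable behaviour is the same: a listing that ranks well for a searcher standing next door can be nearly invisible to a searcher a city away, which is exactly why "near me" optimization matters.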

Read more How does Google Possum Algorithm change search results
