Author Archives: Shane Long


Dofollow vs Nofollow backlinks – Which is better for SEO

The main difference between a Dofollow and a Nofollow link is that the former passes PageRank signals (often called "link juice" or link equity) and the latter does not. In other words, the difference lies in how the Google algorithm treats each link. The links function the same way for visitors, but Google weighs them differently depending on how you use the rel attribute.

For Dofollow backlinks

A website's PageRank is essentially its value in Google's eyes. PageRank signals are points that tell Google, along with other search engines, that the page being linked to is valuable or contains valuable information. Pages with a higher PageRank generally appear higher up on search engine results pages.

When a highly ranked site links to a lower-ranked one with a Dofollow link, it passes along its "link juice" and boosts the lower-ranked website's value in the search engine's eyes. The lower-ranked site gains PageRank points from the higher-quality Dofollow link.

These points are why a Dofollow link is better than a Nofollow link when you are building links back to your website. But while Dofollow links may be crucial for SEO purposes, Nofollow links are essential for websites too.

For Nofollow backlinks

Do NoFollow Links Really Offer NO SEO Value?

Well, not exactly. It's a bit complicated, so let's dig in. Google wants you to think there's no value, but many leading minds in the SEO field have shared their own insights based on their own experiments and experience.

Matt Cutts (the former head of Google's webspam team) stated that their engine takes "nofollow" literally and does not "follow" the link at all. But studies suggest that Google does follow the link; it simply does not index the linked-to page unless that page was already in Google's index for other reasons (such as other, DoFollow links pointing to it). This is an important piece of the equation worth noting.

Another thing we know in the modern, post-Penguin era of SEO is that link diversity is key. That means diversity in the websites linking to you, diversity in the anchor text used for those links, and diversity in the types of links in your overall inbound link portfolio. Think about it: wouldn't it look a little suspicious if ALL your site's inbound links were DoFollow? That isn't natural, since the large majority of links from social media sites, for example, are NoFollow, and many websites mark their outbound links NoFollow by default. The point is that a healthy mix of NoFollow links among your treasured DoFollow links is a good thing, and they shouldn't be removed unless you're confident they're low-quality links that could harm your site.
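As a back-of-the-envelope check, you can measure the NoFollow share of your own inbound link data. This is a minimal sketch; the link list and its `nofollow` flags are hypothetical example data, not real backlink-tool output:

```python
# Sketch: measure what fraction of an inbound link profile is NoFollow.
def nofollow_share(links):
    """links: list of dicts, each with a boolean 'nofollow' flag."""
    if not links:
        return 0.0
    return sum(1 for link in links if link["nofollow"]) / len(links)

# Hypothetical inbound link profile, for illustration only.
profile = [
    {"source": "https://example.com/a", "nofollow": False},
    {"source": "https://example.com/b", "nofollow": True},
    {"source": "https://example.com/c", "nofollow": False},
    {"source": "https://example.com/d", "nofollow": True},
]
print(f"NoFollow share: {nofollow_share(profile):.0%}")  # NoFollow share: 50%
```

A profile sitting at 0% NoFollow is exactly the suspicious extreme described above.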

It's also true that NoFollow links are likely to help you attract DoFollow links, which, based on the fine print in Cutts' definition of NoFollow links, could increase the value of the NoFollow links pointing to the same web page.

Obviously, this is a good thing if we get a link from a great site that's authoritative and relevant to the page receiving the link traffic. But suppose that page carries several, or a dozen, links that are automatically DoFollow: does it lose all of the ranking love it gets from Google? Most likely not, but the value will be diluted over time. No webmaster or SEO pro maintaining a site wants to lose Google love, and you can hardly blame them for wanting to keep as much link juice as possible.


The Do-Follow Links Argument

The proponents of the "do-follow only" camp have a pretty logical argument backing them up. In terms of SEO value ("link juice", to use the nomenclature), Google should theoretically only count do-follow links. Any no-follow link carries a direct, clear instruction telling search engines to ignore it. Think of the rel="nofollow" attribute as one of those detour signs you occasionally see: the street is still there, you're just being told to pass it by.

Although a NoFollow link is, by definition, flagged not to be followed by search algorithms as a contributing factor for PageRank, it's important to look at the indirect SEO benefits and additional value these links offer. Stop obsessing over attracting only DoFollow links and instead focus your efforts on creating quality content that will naturally attract a mix of NoFollow and well-deserved DoFollow links.

Do Nofollow links carry any benefit?

As stated by Google, Nofollow links don't pass any PageRank or ranking benefit to the target site; using Nofollow drops the target from Google's graph of the web. Your SEO will therefore be inefficient if your backlink profile consists only of Nofollow links. Handily, there are tools available to check whether a backlink is Nofollow. Otherwise, you can figure it out manually: right-click in your browser, click "View page source", and search for the rel="nofollow" attribute.
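That manual view-source check can also be scripted. Here is a minimal sketch using only Python's standard-library HTML parser; the page snippet fed to it is made up for illustration:

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Collects (href, is_nofollow) for every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several space-separated tokens, e.g. "nofollow ugc".
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel))

# Hypothetical page source, for illustration only.
page = ('<a href="https://example.com/a" rel="nofollow">ad</a> '
        '<a href="https://example.com/b">editorial</a>')
finder = NofollowFinder()
finder.feed(page)
for href, nofollow in finder.links:
    print(href, "Nofollow" if nofollow else "Dofollow")
```

To audit a live page you would fetch its HTML first and feed it in the same way.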

But are Nofollow links entirely useless? Used in the right way, they offer numerous benefits. When multiple sites in your industry link back to yours with your target keyword(s) as their anchor text, your ranking may climb sharply, as documented by Adam White. If the links are from related sites, especially those with high domain authority, Google will very likely give you credit for the anchor text used.

It is important to bear in mind that, as a site owner, you should care more about human interaction than about whether the search engine spider crawls your backlink. In other words, it's more important that people actually click a link pointing to your website, regardless of whether it's Dofollow or Nofollow. A click is a guaranteed visit, whereas a higher SERP position only guarantees higher visibility. To a site visitor, a Nofollow link looks the same as any other; if it is appealing, they will click on it. So while Nofollow links don't deliver link juice, they certainly deliver referral traffic, especially when placed strategically. And traffic opens possibilities: conversions from leads, sales, calls, or simply valuable demographic data and analytics.

Furthermore, think one step ahead and realise that Nofollow links can lead to Dofollow links. The key is spreading awareness of your site by driving human clicks, and a powerful avenue is authoritative recommendation. For example, imagine you publish a press release about your new website. The press release link may be Nofollowed, but if a popular media personality sees your post and likes it enough, they may link to it on their blog. That Dofollow link will boost not only your visibility in the index but your visibility to humans too.

Related post: Tips for link building to speed up keyword ranking


Please contact us for SEO service packages at TDHSEO.COM.


Skype: tdhseo

Facebook Messenger: tdhseocom

Telegram: + 084  07790 89915

Whatsapp: + 084  07790 89915

Thank you!


Onpage optimization techniques you should focus on

SEO isn't specific to a particular type of content or website; it's a set of practices aimed at ranking higher in search engines. It requires a fair amount of technical knowledge, including an understanding of redirects, HTML, and web server technology.

A successful SEO campaign still starts with some important on-page SEO factors that you must optimize before you do anything else.

So, what are the most important techniques you need to focus on?

Fulfill Search Queries

– Create/optimize the existing content to the audience’s needs. This helps to gain SERP visibility, attract more qualified traffic and build trust.
– Make sure to have an H1 tag on every website page. Most CMSs automatically wrap your title in an <h1> tag, but some do not. The H1 tag is a must on any page because it helps search engines understand what your page is about.

Optimize the URL slug:

– Keep it as short as possible (four words at most); this makes it easy for users to understand and remember, and it also improves your CTR.
– Try to include the keyword in the URL as well; it will definitely help with the on-page optimization.
– If your page has already been published for a while, do not change the URL, especially if it's already ranking in the SERPs or if other pages already link to it. Changing it amounts to a URL migration, which is best avoided in most cases.

Optimize the Page URL

Have an instance of your focus keyword in the URL, without using any special character, symbol, commas, etc.

Use hyphens (instead of underscore) to separate different strings. This makes the URL clean and easier for the user to guess the content on the page.

In addition, opt for a user-friendly URL structure for your entire website. Something that both search engines and the user can remember and relate to, but without any compromise to your business goals.

For example, a permalink structure like 'yourdomain/this-is-a-test-post' is preferred by many websites, but if you run a news website you may want to follow a date-based structure like 'yourdomain/2019/08/15/this-is-a-test-post'.
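Putting the slug advice together, a generator might look like the sketch below. The four-word cap mirrors the guideline above, and the function name is my own:

```python
import re

def slugify(title, max_words=4):
    """Build a short, lowercase, hyphen-separated slug from a title."""
    words = re.findall(r"[a-z0-9]+", title.lower())  # drops symbols/underscores
    return "-".join(words[:max_words])

print(slugify("This Is a Test Post About SEO"))  # this-is-a-test
```

In practice you would pick the `max_words` that keeps your target keyword intact.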

Optimize the meta description

– Include the target keyword in this description.
– Keep the meta description to roughly 155-160 characters; anything much longer will be truncated by Google in the SERPs (the real cut-off is based on pixel width, so treat this as a guideline).
– Same as with the page titles, keywords are not everything. Your meta description should be compelling and tell readers exactly what information will be provided on the page.
– While meta descriptions don’t have a direct impact on rankings, they will increase the click-through rate and that is a ranking factor.
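Those checks are easy to automate. A minimal sketch follows; the 160-character default is an approximation (Google actually truncates by pixel width), and the function name is my own:

```python
def check_meta_description(description, keyword, max_len=160):
    """Return a list of problems found with a meta description."""
    issues = []
    if len(description) > max_len:
        issues.append(f"too long ({len(description)} chars)")
    if keyword.lower() not in description.lower():
        issues.append("target keyword missing")
    return issues

desc = ("Learn the difference between Dofollow and Nofollow backlinks "
        "and how each one affects your SEO.")
print(check_meta_description(desc, "nofollow backlinks"))  # []
```

An empty list means the description passes both checks; anything else names what to fix.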

Optimize the images

Optimizing the images on your most important promoted pages will pay off in spades. At a minimum, you should include the ALT attribute.

Here's how optimized images can help you:

– they influence the ranking of the promoted page;
– they appear in image search results;
– they attract more traffic to the site.

Compress images to improve the loading time of the website and supply them with alt text. Search engines use alt text to identify the content of the web page, so it's a great way to make the website more accessible and improve its ranking.
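A quick way to audit a page for images missing alt text, again with the standard-library parser; the HTML fed in here is a made-up example:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects src values of <img> tags with no non-empty alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not (attrs.get("alt") or "").strip():  # absent or empty alt
            self.missing_alt.append(attrs.get("src"))

audit = AltAudit()
audit.feed('<img src="dog.jpg" alt="A brown dog"><img src="banner.png">')
print(audit.missing_alt)  # ['banner.png']
```

Every src it reports is an image that gives search engines nothing to read.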


Create Trust & Engagement Through UI, UX, and Branding

– Improve your website performance. Performance metrics such as page load speed are part of UX. Research and improve these to boost conversions and keep your visitors satisfied.

– Responsive Web Design. Back in 2015 Google started penalizing mobile-unfriendly websites. The number of mobile visitors is growing each year and today RWD is a must.

– Build trust through UI, visuals, navigation, branding — all these pieces define if your website looks trustworthy.
– Include social media sharing buttons. Help your visitors save and share your content across the Web.

Include your focus keyword

Remember, on-page optimization is not about gaming the system. It’s about sending the right signals, both to the user and to the search engine.

Essentially, on-page SEO is all about optimizing your content to answer a particular user query.

For that, use terms (technically called keywords) that relate to the user query.

Use a combination of exact match keywords and related keywords, but don’t overdo that.

Ideally, an exact-match keyword density of 1.5 to 2%, sprinkled with a few LSI keywords, is enough to send the right signals to the search engines.
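To see where a draft sits against that range, density is simple to compute. This sketch counts exact-match occurrences against total words; treat it as one reasonable definition rather than a standard formula:

```python
import re

def keyword_density(text, keyword):
    """Exact-match occurrences of `keyword` as a share of total words."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    # Slide a window the length of the keyphrase over the word list.
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return hits / len(words)

sample = "seo tips " * 50  # 100 words, 50 of them "seo"
print(f"{keyword_density(sample, 'seo'):.1%}")  # 50.0%
```

Anything far above the 1.5-2% band is a sign to swap in synonyms and related terms.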

Optimize the page content

Now that you have optimized the meta data supporting your page or blog post, it’s time to move on to optimizing the actual content on it.

Here are the steps you need to follow to do this:

– Try to include the keyword in the h1 heading, but do not force it. Again, it is far better to publish natural (rather than keyword-stuffed) content.
– Make sure your page or blog post has an h1, but remember that there should be only one h1, and it should be above the fold. Typically, your h1 will be the actual title of the blog post or page.
– As with meta tag optimization, focus on creating an attractive, compelling h1 rather than something that feels built exclusively for Google's crawlers.
– You can use CoSchedule's Headline Analyzer to analyze your headline.

Optimize the content in the body of the page.
– Try to include your target keyword in the first 100 words of the page or blog post.
– In general, avoid including the exact target keyword more than 3-4 times/page.
– Add other keywords from the same keyword bucket in the body of your content. This will help Google contextualize your page or blog article, so that it shows it to users searching for the information you provide.
– Try to add synonyms to your target keyword as well. This is an excellent move not only because it will help Google contextualize your content, but also because it will help you avoid using the exact target keyword too many times.
– Include LSI (Latent Semantic Indexing) keywords too. These keywords are semantically related to your target keyword. To find more of them, enter your target keyword into an LSI keyword tool and pick the most relevant suggestions to include in the body of your page content.

Read more: How to optimize image for SEO




How to optimize image for SEO

Here you’ll find out how to optimize the images for the web so that you make your website fast and SEO friendly.

Choose the Right File Format

Before you start modifying your images, make sure you’ve chosen the best file type. There are several types of files you can use:

– PNG – produces higher-quality images, but also larger file sizes. It is a lossless image format.
– JPEG – uses lossy and lossless optimization. You can adjust the quality level for a good balance of quality and file size.
– GIF – only uses 256 colors and lossless compression. It's the best choice for animated images.

There are several others, such as JPEG XR and WebP, but they're not universally supported by all browsers. Ideally, you should use JPEG for images with lots of color and PNG for simple images.

Use descriptive filenames

Before we talk about naming your files, let's talk a little SEO and planning. You ARE doing your keyword research BEFORE your post – correct? If not, now is the time to start.

Here's a post I wrote about How To Plan Blog Posts that Google Loves that I thought would help you out. You need to make sure you know which keyphrase you are trying to rank for, as well as similar (but different) phrases, so that you can optimize your photos/images around them.

Are you uploading photos named DSC0001.jpg or maybe wiaw-5.jpg? If so, you are losing a fantastic opportunity to optimize for SEO. You want to give your photos and images descriptive file names. This will help search engines readily understand what your images (and ultimately your blog post) are about.

Let’s say you are writing a blog post about a peanut butter banana smoothie recipe. You could name your images:

The idea is to name your files using your keyphrase and variations of it for the different images.
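For the smoothie example, that naming pattern can be sketched as follows; the generated file names are hypothetical illustrations:

```python
def image_filenames(keyphrase, count=3, ext="jpg"):
    """Derive numbered, descriptive image file names from a keyphrase."""
    base = "-".join(keyphrase.lower().split())
    return [f"{base}-{i}.{ext}" for i in range(1, count + 1)]

print(image_filenames("peanut butter banana smoothie"))
```

Swapping the numeric suffix for a phrase variation ("...-recipe.jpg", "...-ingredients.jpg") gives each image its own descriptive name.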

Since chances are you have been blogging for quite a while and haven’t done this for all your photos, I don’t want you to stress over this. I would prefer you learn what to do and start doing this for all your posts from here forward. Then as you update and optimize old posts, you can take care of the photos then.

Make Sure You Name Your Images Appropriately

Yes, even the file names matter.

If you name your file in a descriptive way, you'll help Google identify the object in the image more easily.

But anything is better than “untitled-1.jpg”.

Let’s say that you have an image of a dog.

Then name it “dog.jpg”, or “my-new-dog.jpg”.

It’s a good idea to use keywords in your file names.

But remember that a file name should be short, so don’t go overboard with the file name.

It should make sense.


Use Images of the Right Size

While the images are a must on your website and in your blog posts, they are also the main reason behind slow loading speeds.

It's for this reason that it is important to size your images (width and height) to fit your needs.

A good practice is not to make them a lot bigger than you need them to be.

You may be asking, but doesn’t the browser fit the image to the required size?

The answer is yes it does.

But the problem is that the browser still has to load the full-sized image, even if it shows it only in a width of 500 px.

On my blog, every image (except for the featured one) is 800 px wide, never more.

This can be different for you, it's your call, but remember that the bigger the image (pixel-wise), the bigger the file size.

And with that in mind, the browser needs more time to load it.

You can resize the images in Photoshop or any of these free photo editors.

If that is not enough or you just want to resize them in bulk, here is a great tool to do so.

Just remember that you need to change the width.

The height will change automatically.
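The arithmetic behind that is just the aspect ratio. For example, scaling a 1600x1200 photo down to an 800 px display width:

```python
def scaled_height(orig_w, orig_h, new_w):
    """Height that keeps the aspect ratio at the new width."""
    return round(orig_h * new_w / orig_w)

print(scaled_height(1600, 1200, 800))  # 600
```

Resizing tools do this for you, which is why you only need to set the width.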

Use Images As Citations

If you don’t know what I am talking about, don’t worry.

Citations are mentions of your business that can help you rank in local SEO.

And a good thing is that you can embed this data into images and then use them as citations on platforms where you can publish them.

Use your NAP (name, address, phone number) consistently to improve your local SEO.

The tools above can be easily used for this purpose too.

Just don’t upload them to your website.

Create Descriptive Image Captions

Image captions are one of the image SEO best practices.

What’s more important is that they are visible to your visitor.

They can give a more detailed context of an image and provide a better user experience.

Many experts even say that including captions can decrease bounce rate.

The thing is, we don't always read the full article, but we are drawn to captions to understand it better.

So it’s a good idea to have descriptive captions included to better illustrate what the image is about.

They also give extra insights to search engines to better understand the image.

Reduce the File Size

Now that you have reduced the image dimensions (which already trims the file size somewhat), it's time to reduce the file size itself.

To do that you can again use Photoshop or Gimp and combine file size reduction with image resizing.

If you don’t have any of the tools or you are not comfortable using them, there are other tools to use.

One thing to note when reducing the file size is that you are also reducing the quality of the image.

But don’t be afraid to do it.

The results will most likely still be a great looking image with a big size reduction.

As you can see, our new SEO Image Optimizer is going to become an indispensable tool when optimizing your product pages, blog posts, and other web pages of your eCommerce site. So, grab this add-on and make the most of it if you:

– Don’t want most of your potential customers to leave your store because of low page loading speed

– Wish to drive more traffic to your website

– Have a strong desire to protect your product images from being stolen by your competitors

Read more: Onpage optimization techniques you should focus on




The basics of the Google Panda algorithm

Initially released on February 3, 2011, the main purpose of the Google Panda algorithm is simple: Panda works to reward high-quality, relevant websites and demote low-quality websites in Google's organic search results.

With more than 28 updates since its launch, Google Panda has addressed and targeted a number of issues in Google search results including:

  • Thin content – These are weak pages with no content or with very little relevant content served to the user. Google recognises that there is very limited significant content or resources that can be beneficial to the user. A perfect example would be a healthcare website that describes very serious health conditions in one sentence. As we can understand, a user looking for information about a health condition would naturally expect to find an in-depth web page – not a page with virtually no text content.
  • Low-quality content – Websites that are populated with content, however, lack in-depth information and offer little or no value to readers.
  • Duplicate content – Plagiarism or duplication of content is a serious offence that Google does not take lightly. Whether on-site or off-site, duplication of content will get you into big trouble with Google Panda. For example, having the same content duplicated on multiple pages of your website might place you under Panda's radar and negatively affect your search ranking.
  • Content farming – Having a big number of low-quality pages that provide low or very little value to readers. In most cases, content farming involves hiring a huge number of low-quality writers to create short content that covers a wide range of search engine queries with the sole aim of ranking for keywords.
  • Lack of authority – Google is very serious about the trustworthiness of the information provided to readers. With that in mind, providing misleading information might place you on the wrong end of Panda penalties. An easy way to start building authority is to add an author box under all your writing; this adds legitimacy to your content, as it's backed by a real, verifiable person within the industry.
  • Excessive ad-to-content ratio – Websites that carry more paid adverts than meaningful content. Populating your website with too many ads ultimately results in poor user experience, and sites that do not strike a balance or provide meaningful information may be devalued by Panda.
  • Low-quality user-generated content (UGC) – This may include guest blog posts that are full of grammatical errors, cannot be trusted and are not authoritative. Many forms of poor UGC are created for spammy SEO purposes and are highly susceptible to having their URLs devalued.
  • Misleading and deceiving content – If your website pledges to deliver content matching a given search query but then fails to deliver on the promise, this is highly deceptive and may result in high user bounce rates.
  • A website that’s blocked by users – A website that visitors are blocking either through a Google Chrome browser extension or directly in the search engine is a clear indication that Panda might penalise it.
  • Low-quality and broken affiliate links – Having numerous paid affiliate links that have poor and low-quality content might bring you lots of problems. Again, having affiliate links that don’t take visitors to the promised website or location might bring you even further problems.

Other issues may include non-optimised pages, content with lots of grammatical errors, sites without a specific topical focus and keyword stuffing.
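As a crude first-pass audit for the thin-content issue above, a word-count floor can flag candidate pages. The 300-word threshold here is my own assumption for illustration, not a figure published by Google:

```python
import re

def looks_thin(body_text, min_words=300):
    """True if the page body falls below a word-count floor."""
    return len(re.findall(r"\w+", body_text)) < min_words

# A one-sentence health page, like the example above, gets flagged.
print(looks_thin("Migraine is a type of headache."))  # True
```

Flagged pages still need a human read; a short page can be fine if it fully answers the query.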

Since it may take some time to see whether this change has affected your site either positively or negatively, it’s best not to overreact to any perceived shifts in traffic. Instead, we recommend keeping an eye on your site’s traffic over the next few months, and if you see any large fluctuations in traffic from organic searches, you should take steps to address any issues you might have. This will ensure that you see positive changes in the next Panda update.

Rather than chasing algorithm changes and trying to anticipate how to tailor your site to each new update, we recommend focusing on following SEO best practices like establishing your site’s authority and creating quality content that provides value for your customers. If you’re not sure whether your site meets these standards for quality, take a look at Google’s Quality Rating Guidelines, which can help you make sure you are communicating the purpose of your site and providing the best possible experience for people who use it.

In addition, it’s important to address any technical issues like crawl errors, broken links, or poor site performance that may be adversely affecting your search rankings. Google looks at over 200 ranking signals to determine how a site ranks in its search results, and while content quality is important, it is also essential to make sure your site runs properly, is easy to navigate, and is viewable on mobile devices.

Panda 1: Launched on 23rd Feb 2011

The Google Panda (or "Farmer") update focuses on the quality of the content delivered on a website's pages. It helps people find high-quality sites in Google's search results. This change tackled the difficult task of algorithmically assessing website quality, and it hit low-quality sites so hard that they had to improve their weakest pages to restore their whole site's ranking. It's better to focus on developing good-quality content, meaning fresh, informative and interesting content, than to optimize for any particular Google algorithm.

Panda 2.0: Launched on 11th Apr 2011

After the success of the first Panda update, many trusted publishers and high-quality pages started getting more traffic. Google then rolled out the Panda 2.0 update globally to all English-language Google users. Following this rollout, Google started collecting data about sites that users had blocked, taking it as an indication that those sites might offer a poor user experience.

Panda 2.1: Launched on 9th May 2011

A month later, Google launched another minor update to enhance the Panda 2.0 algorithm. Search results were not affected much by this update.

Panda 2.2: Launched on 21st June 2011

This update targeted web pages that copied content from an original source, i.e. sites that scraped content from elsewhere. With this rollout, Google tried to suppress low-quality copied sites and push original content higher up the search engine results pages.

Panda 2.3: Launched on 23rd July 2011

This was a small algorithmic change to the previous Panda update that expanded the Panda filter and improved its ability to surface higher-quality content for a better user experience.

Panda 2.4: Launched on 12th Aug 2011

After successfully implementing the Panda update globally for English, Google rolled out Panda 2.4, extending the same features to all languages except Chinese, Japanese and Korean. This change impacted roughly 6-9% of queries in most languages.

Panda 2.5: Launched on 28th Sept 2011

A few months later, Google made further changes to the Panda algorithm. This update broadened its analysis for identifying high-quality content.

Panda 3, Flux: Launched on 5th Oct 2011

This was a small change to the Panda update that broadened the analysis of search results, rewarding valuable content, consolidating duplication, and improving usability and engagement quality.

Panda 3.1: Launched on 18th Nov 2011

After the Flux release, Google rolled out several further Panda updates: 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7 and 3.8. These were all minor data-refresh updates. Search results were not affected much, but became more accurate and sensitive.

Panda 3.9: Launched on 24th July 2012

Google rolled out another update to the Panda algorithm. It did not impact search results much, although some sites noticed the effect 5 to 6 days after the update. It was a small change designed to remove low-quality sites from the search results. Two more minor refresh updates followed, affecting almost 1% of results.

Panda #20: Launched on 27th Sept 2012

This was one of the major Panda updates: Google changed both the Panda data and the algorithm itself, and it overlapped with the EMD (Exact Match Domains) update. After this change, 2.4% of search queries were affected, and four more updates followed to further improve the results. From this point, Google stopped prioritizing sites that ranked well merely by using misleading keywords in their domain names, and instead favoured domains whose content depth matched the keyword used in the URL.

Panda #25: Launched on 14th Mar 2013

Google again rolled out an important update: Panda 25 began dealing with spammers and people who abuse the process. In this update, Google changed how data is indexed.

Panda Dance: launched on 11th June 2013

This was much like the old Google Dance, in which Google ran an update for approximately 10 days and then checked the effect, watching the rankings "dance". Google named the new behaviour the Panda Dance, because keyword rankings across sites would jump up and down with each monthly refresh.

Panda Recovery: launched on 18th July 2013

Google rolled out a new Panda update whose goal was to soften some previous penalties on low-quality sites. Some targeting methods were changed to refine the search results.

Panda 4.0: launched on 19th May 2014

This was one of the most important Panda updates: approximately 7.5% of English-language queries were affected. Its main goals, penalizing poor scraper sites and boosting sites with great-quality content, were achieved.

Panda 4.1: Launched on 23 Sept 2014

Google rolled out another Panda update, which affected 3-5% of search results depending on location, and refined the Panda algorithm in the process. It elevated high-quality small and medium-sized sites, encouraging everyone to keep their sites fresh and informative. One final update, Panda 4.2 (launched on 17th July 2015), impacted about 2-3% of English search queries.

Read more: Tips to reveal Google Panda Penalties




How to tell if your site has been hit by Google Penguin penalties

Regardless of what triggered the penalty, and whether it's a personal site or a client's, you need to be able to diagnose the cause and fix it. In most cases, you can get almost all of your search traffic back in the short term.

If you’re ready to get rid of any penalties by Google Penguin Algorithm holding back your organic search traffic or you’d just like to prepare for future problems, let’s get started.

How can I discover if I have been penalized by the Google Penguin algorithm?

If Google determines that your site has spammy backlinks, one of these actions may be applied to it. Many SEOs simply call this a penalty, because it lowers your Google rankings and leads to traffic loss.

For example, in 2013 Google penalized Rap Genius (now Genius) for participating in a link scheme. Those tracking the site's rankings found that a dozen monitored keywords had dropped 4 to 6 pages on Google.


This was a widely publicized penalty that drew a great deal of attention to both Google Penguin and Rap Genius.

Since then, much has changed in how Google penalizes a website under the Penguin algorithm. For example, Google used to penalize an entire website.

As of 2016, Google announced that the penalties would be more precise. Gary Illyes of Google said that Google Penguin is much “nicer” because:

  • It now updates in real time.
  • It devalues spam rather than demoting pages based on it.
  • It is more granular and no longer penalizes entire sites as often.

If the Google Penguin algorithm penalizes your website, your rankings will drop and, as a result, your traffic will fall as well.

Links will continue to play a role in SEO despite the changes brought about by Google's Penguin 4.0 update, so websites should continue to include them in SEO ranking efforts. But as with any search ranking effort, link building cannot be the only strategy: numerous links without any accompanying online buzz are likely to raise red flags with Google's crawlers.

If you have made efforts to build online buzz about your website and have earned links organically through different strategies, you’re likely to get rewarded by the search engine with a higher rank.

Links remain a strong part of SEO, but link building cannot be pursued in isolation from other strategies. Under the new update, devalued websites can actually rebuild their rankings by simply building high-quality links.

Websites affected by the Penguin update can make changes to recover quickly:

  • Avoid over-optimising anchor texts and websites in general, so they look more natural to Google bots.
  • Remove low-quality links to your website so Penguin updates don't penalise you.
  • Add high-quality links from reputed source sites.
  • Create a mobile-friendly interface without popups to avoid penalties from Google.
  • Build organic links through different online marketing strategies.
  • Build high-value content to ensure better inbound links to your website.

The Penguin 4.0 update is designed to eliminate poor link building efforts by focusing on organic strategies. Websites will want to make these changes quickly in order to stay relevant and high up in search rankings.
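The "remove low-quality links" step above is typically handled with Google's disavow file, a plain-text list of URLs and `domain:` entries uploaded through Search Console. Here is a minimal sketch of generating one; the domains, URLs, and filename are hypothetical examples:

```python
# Sketch: build a Google disavow file from lists of spammy backlink
# sources. All domains and URLs here are hypothetical placeholders.
bad_domains = ["link-farm.example", "spun-articles.example"]
bad_urls = [
    "http://paid-links.example/footer",
    "http://spammy-directory.example/page1",
]

lines = ["# Disavow file for example.com"]             # '#' lines are comments
lines += [f"domain:{d}" for d in sorted(bad_domains)]  # disavow whole domains
lines += sorted(bad_urls)                              # disavow single URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting file is uploaded via Search Console's disavow links tool; Google then treats the listed links roughly like nofollow links when assessing the site.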

Read more How Does Google Penguin Algorithm Work



  • -

How Does Google Penguin Algorithm Work

Google Penguin targets sites that have created spam backlinks to manipulate Google's results and gain better rankings. Google's algorithm weighs each website against a large number of ranking factors. Some factors, such as site speed and use of HTTPS, carry more weight; others carry less. Here are some examples to give you an idea of the types of backlinks Google Penguin targets:

Types of Backlinks:

  • Links from unknown or low-quality websites
  • Links that share the same or very similar anchor text
  • Links that were obviously built with a bot or tool
  • Links that were paid for or otherwise incentivized
  • Links from irrelevant or suspicious countries
  • Links built in large quantities in a short time
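Several of these patterns, such as many links sharing identical anchor text, are easy to spot programmatically in a backlink export. A toy heuristic follows; the data structure and the idea of a single concentration score are assumptions for illustration, not Google's actual method:

```python
from collections import Counter

def anchor_concentration(backlinks):
    """Share of backlinks that use the single most common anchor text.
    A very high share of identical, keyword-rich anchors is one pattern
    a Penguin-style filter is widely believed to look for."""
    anchors = [link["anchor"].lower().strip() for link in backlinks]
    if not anchors:
        return 0.0
    top_count = Counter(anchors).most_common(1)[0][1]
    return top_count / len(anchors)

links = [
    {"url": "http://a.example", "anchor": "cheap shoes"},
    {"url": "http://b.example", "anchor": "cheap shoes"},
    {"url": "http://c.example", "anchor": "cheap shoes"},
    {"url": "http://d.example", "anchor": "Example Brand"},
]
print(anchor_concentration(links))  # 0.75 (suspiciously concentrated)
```

A natural link profile tends to have branded and generic anchors dominating, so a score near 1.0 on a commercial keyword would be a warning sign worth auditing.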

In a live video, Google said its algorithm “tags” your links, which is a useful way to think about backlinks. The examples given included:

  • Links that have been disavowed.
  • Links affected by Penguin.

The easiest way to understand how Google Penguin works is to read the “Link schemes” section of the Google Webmaster Guidelines. It explains in detail the types of link schemes that can negatively affect your site:

  • Buying or selling links that pass PageRank
  • Excessive link exchanges
  • Large-scale article marketing or guest-posting campaigns
  • Advertisements containing followed links
  • Forum or blog comments with optimized links
  • Widely distributed links in page footers or templates (site-wide links)

You don't have to be a mathematician to understand what “negative influence” means: if you have enough spammy links pointing to your website, you're likely to lose rank in Google.

Some of these backlinks or websites are not easily recognizable to the human eye. In recent years spammers have become very good at making black hat links look white. This has been a big challenge for Google, as many SEOs write quality content and publish it on low-quality expired domains and private blog networks.

It is a permanent cat-and-mouse game: Google launches a new update; SEOs respond with a new tactic; Google responds with an update targeting that tactic.

It’s part of the core Google algorithm now

Until this release, Penguin was its own entity, separate from the core algorithm.

With Penguin 4.0, Google says that “Penguin is now part of our core algorithm,” which it notes consists of more than 200 other unique signals that can affect rankings.

Google Penguin is About Spam Link Building

Google Penguin is designed to punish sites that falsely manipulate search results with black hat link building techniques.

Google calls this webspam, and it strongly condemns the false manipulation of search results.

Google has said it wants webmasters to focus on quality content rather than SEO tricks, and certainly not on black hat SEO.

It’s real-time

As Gary Illyes of Google’s Search Ranking Team explained:

Historically, the list of sites affected by Penguin was periodically refreshed at the same time.

Once a webmaster considerably improved their site and its presence on the internet, many of Google’s algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed.

With this change, Penguin’s data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page.

It’s granular

According to Illyes: “Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.”

What does this mean in practical terms? That isn’t so clear.
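One way to picture the difference is a toy scoring model. This is purely illustrative: the link weights and the 0.5 demotion factor are invented, not anything Google has published.

```python
# (weight, is_spam) pairs for links pointing at one page
links = [(3.0, False), (2.0, True), (1.0, False)]

def pre_4_0_score(page_links, site_flagged):
    # Old behavior: if the site tripped the Penguin filter,
    # the whole page's link value was demoted.
    total = sum(weight for weight, _ in page_links)
    return total * 0.5 if site_flagged else total

def penguin_4_0_score(page_links):
    # New behavior: spammy links contribute nothing;
    # clean links keep their full value.
    return sum(weight for weight, is_spam in page_links if not is_spam)

print(pre_4_0_score(links, site_flagged=True))  # 3.0
print(penguin_4_0_score(links))                 # 4.0
```

Under the old model the clean links were dragged down with the spam; under the new model the spam is simply ignored, which is why "devalue" rather than "penalize" is the operative word.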

The effects probably won’t be seen immediately

It’s not known whether the new Penguin code has been rolled out to all of Google’s data centers.

But even if it has, it could take time before the effects are seen given that there are almost certainly many URLs that will need to be recrawled.

Read more What is the target of Google Penguin Algorithm



  • -

What is the target of Google Penguin Algorithm

In September 2016, Google’s latest algorithm update once again indiscriminately affected online brands’ web traffic. Without a well-planned SEO strategy, brands have had a hell of a time ranking in search engines and effectively promoting themselves online. Let’s explore why.

Unlike its 3.0 predecessor, this update is in reality the next generation of the Penguin algorithm. For starters, the 4.0 release saw Penguin become part of Google’s core algorithm. As a result of this, Google said that it would no longer be announcing any further updates to Penguin.

The second piece of news was equally momentous, if not more so. With the rollout of Penguin 4.0, Google announced that henceforth Penguin would be live, reevaluating sites in real time as they are re-indexed. This is huge, because it means that if your site is impacted by Penguin, you no longer need to wait months or even years for the next Penguin update in order to bounce back.

What’s more, Google later revealed that the new algorithm does not issue a penalty per se, but rather devalues the spammy links. The devaluation of links is more the lack of a positive than the attribution of a negative. Think of it like this: links are good, and they increase your ranking when done properly. Should they be “spammy links,” however, Google will simply ignore them, and your page will not get the added value of those links. This is in contradistinction to the modus operandi of previous versions of Penguin, which actually demoted the ranking of a page that contained spammy links.

Google’s Penguin hunts for spammy backlinks

Gaining and maintaining organic backlinks is one of SEO’s primary jobs. So what does that mean?

Organic backlinks are unpurchased, unsolicited web links pointing to a website from other sites, social media apps and blogs. Organic or natural links are one of the top three ranking factors that Google Search takes into consideration. Unnatural or spammy backlinks are web links purchased in bulk to artificially boost traffic to a specific website. Before 2012, the more unnatural backlinks your SEO specialist could buy, the more authority and traffic your website would have. SEO guys and gals who purchase backlinks today are considered Black Hat SEO specialists, also known as spammers.

The initial Penguin algorithm update, released in April 2012, made purchasing backlinks a violation of Google’s Webmaster Guidelines that, for the first time ever, resulted in a ranking penalty. If your brand isn’t monstrous but your site has thousands of backlinks, you should check with your webmaster; chances are many of those backlinks are spammy AF. Pre-2012, it was not uncommon for a local mom-and-pop shop to have tens of thousands of backlinks from all over the world. That’s a red flag: the Penguin update will hit you with a penalty, and there goes your site’s traffic.

The evolution of Google Penguin Algorithm

With the release of Penguin 4.0, the algorithm has in a sense completed the evolutionary cycle. It has certainly come a long way from its original construct, skimming for link spam on the homepage. In fact, even the sort of tense relationship between the algorithm and the SEO community has in many ways been healed as Penguin completed its evolution.

Penguin Evolution

No longer are legitimate sites that have been hit with a Penguin penalty left waiting (in the case of the previous update, for years) to recover. As a result, you can make the case that the most interesting and dynamic aspect of Penguin’s progression has been not technological but sociological: in its most modern form, the algorithm balances technological needs with the consensus of the community.

Read more How Does Google Penguin Algorithm Work



  • -

Something important about Pirate Algorithm Update

Google first released the Pirate algorithm in August 2012 and rolled out a second major update in October 2014. The algorithm focuses on tackling the huge problem of online copyright infringement. Websites that have been reported for copyright violations and have received large numbers of web page removal notices are penalised, making it less of a challenge for Google users to find quality, legitimate information.

This algorithm is built around data received from website users and owners who have reported sites for using infringing content. However, it is important to note that while some web pages may be demoted because of complaint signals, Google cannot remove the content itself unless a valid copyright removal notice is received from the owner of the copyrighted material.

Google recently made a change to their algorithm designed to demote major sites hosting pirated content, but the algorithm has a secondary side effect – bringing more visibility to smaller torrent sites that had been previously buried in the search results due to the more popular sites ranking so well.

The new algo takes the number of legitimate DMCAs filed against a site into account when ranking sites, with more DMCAs resulting in a lower ranking.  However, smaller sites hosting pirated content that weren’t at the top of the search results were not impacted as much as other more popular sites, as many companies focus only on the top ranking pirate sites when filing DMCAs.  This change could mean that companies will have to not only focus on the top ranking sites, but also the ones multiple pages deep in the search results to prevent them from showing up at the top later.

Google will notify a website owner of a content infringement via a DMCA take-down notice, the notice is sent to the website owners Webmaster Tools account. A DMCA will include details of infringing url(s) or in some cases a whole website.

Google provides a full transparency list of websites that have been reported for content infringements, which can be viewed here: Transparency Report

All websites that are reported to Google will be documented, allowing said data to be analyzed and used to configure the Pirate algorithm. A website with a high number of removal reports will usually be demoted on all searches across the Google search engine.

Google has kept making changes to its search engine algorithms to demote the worst pirate sites. One of the most notable changes is an improved effort to make such sites less visible in search results, meaning they will not appear on the initial search pages. Google has run a down-ranking system since 2012, but copyright industry groups such as the RIAA and MPAA have reported that it lacks effectiveness.

Just last week, Google announced an improved version that aims to address this criticism. With the updated version of the ‘How Google Fights Piracy’ report, originally introduced in 2013 in response to claims by film and music copyright holders, Google gives an overview of all its anti-piracy efforts and highlights the responsibility of copyright holders for making content legally available. The 26-page report outlines the following anti-piracy principles of Google:

  • Defense against abuse
  • Generation of better and more legitimate alternatives to keep piracy at bay
  • Provision of transparency
  • Funds monitoring, as Google believes that the most effective way to fight against online pirates is to reduce their money supply while prohibiting rogue sites from its ad and payment services
  • Guarantee of effectiveness, efficiency, and scalability

Because this filter is updated regularly, formerly affected sites can escape it if they have corrected their mistakes or made improvements. At the same time, the filter can catch new sites that previously escaped, as well as release ‘falsely caught’ sites.

The update, like others such as Penguin, reprocesses all sites to catch any that appear to be in violation. Once caught, a site is stuck with a downgrade until it receives fewer or no complaints and can get back into the race. However, since its introduction the filter had never been rerun, which means real pirate sites, along with new violators that emerged during those two years, may have escaped punishment. This is perhaps what finally prompted Google to update its Pirate filter after two years!

Read more Who is affected by Google Pirate Algorithm



  • -

Who is affected by Google Pirate Algorithm

When Google’s Pirate Update was released back in 2012 to penalize pirate sites, everybody stopped. No one was really sure what to do. How severe was the punishment for those running pirate sites? Were there really any lessons to be learned by other SEOs and marketers?

Every time Google rolls out a brand new algorithm update, quite a few people in the SEO world seem to go crazy.  It reminds me of being a student in high school all over again.  Every time there was a big new project for a class, or a surprise test was announced, everybody lost their cool.  But in reality, we know that Google is doing all this for the best, just like those teachers in high school.  All they want is for us to do things better: create better content, use fewer keywords, build higher quality links, and so forth.

The Sites At Risk

According to Google’s transparency report, most of the sites at the top of the copyright notice pile are file sharing sites. The list includes cyberlockers offering files for download, search sites widely used to find infringing material, BitTorrent sites, and community sites for swapping files. There are very few blogs, legitimate forums or other non-piracy-oriented sites on the list.

While this means that legitimate sites that don’t specialize in pirated content aren’t likely to get hit, it also means spam blogs, plagiarist sites, and nefarious content farms are not on the list either. However, they are typically addressed and filtered out by other methods.

In short, the sites most at risk are the ones that are in the crosshairs of the major copyright holders as they are the ones sending off the most DMCA notices and racking up the most “points” against the domains they’re dealing with.

Still, this isn’t necessarily a guarantee that more notices equal a greater penalty. Google also tracks how many of a site’s total URLs have been reported, and all of the sites at the top of the list have had less than 5% of their URLs involved, most less than 1%.

Depending on how Google approaches this penalty, it may be possible for a site with fewer URLs involved but a higher percentage to receive a stiffer penalty.
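The ratio described above is simple arithmetic: reported URLs divided by total URLs. A small sketch with made-up numbers shows how two sites with the same absolute notice count can have very different shares:

```python
def reported_share(reported_urls, total_urls):
    """Fraction of a site's URLs named in DMCA removal notices."""
    return reported_urls / total_urls

# Hypothetical numbers: same notice count, very different shares.
big_site = reported_share(40_000, 2_000_000)  # large site: 2% reported
small_site = reported_share(40_000, 200_000)  # small site: 20% reported
print(f"{big_site:.0%} vs {small_site:.0%}")  # prints "2% vs 20%"
```

If Google weights the percentage rather than the raw count, the smaller site in this example would be the one at greater risk despite receiving the same number of notices.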

Overall, it’s safe to say that Google has removed at least 300 million illegal download URLs over the past several years. While that seems like an astronomical number, it has yet to satisfy either the MPAA or the RIAA.

However, reports documenting the Pirate Update rolled out during the week of October 20 indicate that harsher penalties are hitting torrent websites.

A week has gone by since Google rolled out the latest Pirate Update, and some major torrent players have reportedly walked the plank. The Pirate Bay, one of the main torrent websites for illegal media downloads, lost roughly 48% of its Google visibility, according to an early analysis by Searchmetrics. However, TorrentFreak reports that this major drop doesn’t faze The Pirate Bay, claiming the site does not receive much traffic from Google. The Pirate Bay relies largely on direct traffic from people looking for torrents; for the most part, its target audience already knows it exists.

Other hard-hit torrent websites appear in Searchmetrics’ analysis as well. A total of 30 websites are included in the list, with keywords ranging from picture downloads to watching movies for free. As of now, these thirty websites have essentially fallen out of Google’s rankings. This all sounds good, and if these reports are accurate, Google is on the right track.

Read more Something important about Pirate Algorithm Update



  • -

Google hummingbird algorithm and what to know about it all

Category : SEO News

Hummingbird is the name Google has given its new search algorithm (the thing that determines which results to display when a user types in a search), and it means a lot for your website. Google says the name Hummingbird was chosen because the birds are “fast and precise.” So let’s have a look at the hummingbird itself to get a better understanding of Hummingbird.

Google has always had a simple message for website owners: concentrate on users. That advice applies to Hummingbird more than any other algorithm or update that has been released in the last 15 years.

Google does not want you to over-optimize your pages and websites purely for search results IF your website or page does not answer the searcher’s query. Remember that all Google cares about is the searcher. It has no loyalty, contract or responsibility to websites or website owners. This is good news if you’re a true beginner building a new website.

To stay high in the search results you need to focus on the user too. This all comes down to content. Hummingbird loves in-depth, good quality and original content because that is what users want. So the best way for you to deal with Hummingbird is to create good content.

Hummingbirds are one of the smallest species of birds in the world. But they’re able to flap their wings incredibly fast (this is where the hum comes from) which allows them to reach flight speeds of 34 miles per hour. And they are the only birds that can fly backwards. It is these impressive abilities that led to the Aztecs revering these birds to the extent that they wore them as a talisman. To the Aztecs, hummingbirds represented vigor, energy, skill in battle, and sexual potency.

Google’s Hummingbird has nothing to do with sexual potency (as far as we know, although there are probably Google fetishists out there somewhere). But the other characteristics of the hummingbird are important to discuss in order to get a better understanding of this new algorithm.

Hummingbird is a more vigorous attempt by the search kings at Google to be more human. Up until now, when a user typed in a search query, Google would pick out what it considered to be keywords. It used those words to hunt for websites that might be what the user was looking for, and displayed those websites in its results.

But the search engine didn’t really understand what the user was asking. With Hummingbird Google can now better understand the context of a search query. The search engine is working harder and in a more agile and energetic way to deliver results that actually answer the user’s query, rather than looking at keywords in isolation and making an educated guess.

This means that Hummingbird is more than a search algorithm update – it’s a vigorous and energetic attempt to up the skill level, react to a user’s search, and deliver better results. Knowing the details of Google Hummingbird will help you keep up with the Joneses of SEO.

Read more Something important about Google algorithm update

