
Google’s Take on Pruning of Content on Websites


Ask anyone in the search industry how to improve the quality of a website, and chances are that few will mention content pruning. Even those who do are often unsure which pages to prune and which to leave alone.

In a recent Google Webmaster Hangout, John Mueller cleared the air on which types of pages to keep and which to prune.

Do Low-Traffic Pages Affect the Website?

There is a preconceived notion that pages bringing in little traffic should be removed to maintain the quality of a website. The assumption goes: low-performing pages attract only a small amount of traffic, and hence they should be noindexed or removed.

A news publisher put this question to John Mueller in the recent hangout. The question was asked in the context of news websites, but Mueller's answer applies to other kinds of sites as well.

Question by the Publisher:

“We’re publishing news and articles. For example, we have 100 new articles every day and ten of them give us 95% of the organic search traffic. Another 90 go nowhere.

We’re afraid that Google can decide our website is interesting only for 10%.

There’s an idea to hide some boring local news under no index tag to make the overall quality of all publishing content better. What do you think?”

Google’s Way of Analysing The Website Quality

Mueller explained how Google’s algorithm looks at the webpages and the website as a whole to have an understanding of the quality of the site.

Mueller answered:

“In general, we do look at the content on a per page basis. And we also try to understand the site on an overall basis, to understand how well is this site working, is this something that users appreciate if everything is essentially working the way that it should be working.

So it’s not completely out of the question to think about all of your content and think about what you really want to have indexed.”

He stated that the traffic a webpage receives is not, by itself, the right metric for judging the page as low quality.

Mueller added:

“But especially with a news website, it seems pretty normal that you’d have a lot of articles that are interesting for a short period of time, that are perhaps more of a snapshot from a day-to-day basis for a local area.

And it’s kind of normal that they don’t become big, popular stories on your website.

So from that point of view, I wouldn’t necessarily call those articles low quality articles, for example.”

He further told the publisher that just because a news article is not popular does not mean it qualifies as a low-quality page. He then laid out what to look at when determining the quality of a page, pointing to issues such as content that is hard to comprehend, broken English, and poor structure, all of which leave a bad experience for the user. He also offered a solution for websites that contain a mix of good and poor quality content.

Mueller:

“On the other hand, if you’re publishing articles from … hundreds of different authors, and they’re from varying quality and some of them are really bad, they’re kind of hard to read, they’re structured in a bad way, their English is broken.

And some of them are really high quality pieces of art, almost that you’re providing. Then creating that kind of a mix on a website makes it really hard for Google and for users to understand that actually you do have a lot of gems on your website…

So that’s the situation where I would go in and say, we need to provide some kind of quality filtering, or some kind of quality bar ahead of time, so that users and Google can recognize, this is really what I want to be known for.

And these are all things, maybe user-submitted content, that is something we’re publishing because we’re working with these people, but it’s not what we want to be known for. Then that’s the situation where you might say, maybe I’ll put no index on these, or maybe I’ll initially put no index on these until I see that actually they’re doing really well.

So for that, I would see it making sense that you provide some kind of quality filtering.

But if it’s a news website, where… by definition, you have a variety of different articles, they’re all well-written, they’re reasonable, just the topics aren’t that interesting for the long run, that’s kind of normal.

That’s not something where I’d say you need to block that from being indexed. Because it’s not low quality content. It’s just less popular content.”

John Mueller made a crucial point about diagnosing content quality: evaluating the content itself is the best way to tell whether a page receives little traffic because its topic is simply less popular or because the content is poorly structured.

Just because a page receives less traffic does not mean its quality is low; such content on its own has no negative effect on site quality. Low traffic can, however, flag a potential problem on the page worth investigating.

You can determine page quality and take the recommended action:

  1. If the information on the page is relevant but outdated (Update the information).
  2. If the information on the page looks incomplete (Remove it).
  3. If the topic on the page is unpopular and receives less traffic (Keep it).
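The three checks above can be expressed as a simple decision helper. This is an illustrative sketch only: the page flags (`outdated`, `incomplete`, `low_traffic`) and the `recommend_action` function are assumptions made for the example, not anything published by Google.

```python
def recommend_action(page):
    """Map the three quality checks to a recommended action for one page.

    Order matters: an outdated or incomplete page needs attention even if
    its topic is unpopular; low traffic alone is never a reason to prune.
    """
    if page.get("outdated"):
        return "update"  # relevant but stale: refresh the information
    if page.get("incomplete"):
        return "remove"  # thin or incomplete content: prune or noindex it
    return "keep"        # well-written but unpopular: leave it indexed


# Hypothetical pages for illustration
pages = [
    {"url": "/seo-guide-2018", "outdated": True},
    {"url": "/draft-stub", "incomplete": True},
    {"url": "/local-news-item", "low_traffic": True},
]

for p in pages:
    print(p["url"], "->", recommend_action(p))
```

The key design point, matching Mueller's answer, is that `low_traffic` never triggers removal on its own; only content problems do.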
