Google offers advice to website owners who run different websites for different locations
In a recent English Google SEO office-hours session on YouTube, a user asked about duplicate content issues. He explained that his pages are not being indexed because the same content is duplicated across different versions of a website for different locations.
Here’s the question:
“We have a client that runs a franchise business in France, so they have multiple websites, one for each location where they run the business. The thing is that every website has the exact same content except for the location address and phone number. The first two websites are performing very well in terms of positioning for some related search keywords, but every new location that has been added, based on the first two websites, is not indexed at all. And despite asking Search Console to index them, they are not indexed. The reason given is duplicate content. So what is your recommendation to solve this problem?”
Google’s John Mueller recommended two approaches to resolve this issue:
On the same website, create different landing pages for each location
The first approach is to keep a single website and create a specific landing page for each location on that one domain. John Mueller says one advantage of this approach is that you only have to optimize one website for SEO instead of several. It also helps the pages rank better and makes the domain stronger.
“Pick one domain and focus on that and have individual landing pages for the locations within that one domain. That means that the content is just indexed once and the addresses are more like attributes of that main content. That would probably make it also such that the content is much more visible in search, because instead of having to promote all of these individual websites, you promote one strong website. And that makes it a little bit easier to rank that content in search,” said John Mueller.
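The structure Mueller describes can be sketched as a single domain where shared content is written once and each location contributes only its own attributes (address, phone) to its landing page. The sketch below is purely illustrative; the domain, URL paths, and location data are invented, not taken from the session.

```python
# Hypothetical sketch of "one domain, one landing page per location".
# BASE_DOMAIN, the /locations/ path, and all location data are invented.

BASE_DOMAIN = "https://example-franchise.fr"

# The service description is written once and shared by every location page.
SHARED_CONTENT = "Description of the franchise's services, written once."

LOCATIONS = [
    {"slug": "paris",    "address": "1 Rue Exemple, Paris",    "phone": "+33 1 00 00 00 01"},
    {"slug": "lyon",     "address": "2 Rue Exemple, Lyon",     "phone": "+33 4 00 00 00 02"},
    {"slug": "bordeaux", "address": "3 Rue Exemple, Bordeaux", "phone": "+33 5 00 00 00 03"},
]

def landing_page(location: dict) -> dict:
    """Build one landing page: shared content plus location attributes."""
    return {
        "url": f"{BASE_DOMAIN}/locations/{location['slug']}/",
        "body": SHARED_CONTENT,
        "address": location["address"],
        "phone": location["phone"],
    }

pages = [landing_page(loc) for loc in LOCATIONS]
for page in pages:
    print(page["url"])
```

The point of the structure is that the shared content lives at one domain and is indexed once, while the address and phone number behave as per-page attributes rather than as grounds for separate websites.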
Content should not be exactly replicated
The other approach John Mueller suggested is to make the content unique across the different location websites. He added that the content does not have to be 100 percent unique, but it does need distinguishing elements that signal to the crawler that the pages are not duplicates.
“If you know that this is going to be a very small number of different locations, then maybe there is something that you can do to make the content a little bit more unique across these locations.
I mean, they don’t have to be very unique in that they’re completely different websites kind of thing. But it should be clear that, when our systems process the text on a page, it’s something separate,” said John Mueller.
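To get a rough feel for how much two location pages' texts overlap, one crude heuristic is word-set (Jaccard) similarity. This is emphatically not Google's duplicate-detection algorithm, just a quick sanity check: two pages that differ only in an address and phone number will score very high.

```python
# Rough near-duplicate heuristic: Jaccard similarity of word sets.
# This is NOT how Google detects duplicates; it only measures how
# much raw word overlap two texts have (1.0 = identical word sets).

def jaccard(text_a: str, text_b: str) -> float:
    words_a = set(text_a.lower().split())
    words_b = set(text_b.lower().split())
    union = words_a | words_b
    if not union:
        return 0.0
    return len(words_a & words_b) / len(union)

# Invented sample texts differing only in location details.
page_paris = "We repair bikes fast. Visit us at 1 Rue Exemple Paris"
page_lyon = "We repair bikes fast. Visit us at 2 Rue Exemple Lyon"
print(jaccard(page_paris, page_lyon))  # high overlap, close to 1.0
```

A score near 1.0 suggests the pages look like near-duplicates to any text-based system, which is exactly the situation Mueller is advising against.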
You can watch the entire discussion on this topic here: