Google webmaster tools is reporting 2 instances of duplicate content. The details are:
Online Store Search - QE - Page 32
/index.php?app=ecom&ns=prodsearchp&ref=&count=18&offset=558
/index.php?app=ecom&ns=prodsearchp&ref=Speed+control&count=18&offset=558
Online Store Search - QE
/ItemAdvancedSearch
/index.php?app=ecom&ns=prodsearchp&ref=&prodsort=DEFAULT
All pages carry <meta name="robots" content="noodp, noydir, index, follow" />.
Is there a reason why we would want Google to index search result pages like these?
We have set URL Parameters that should stop Google from indexing them, but that system seems to be a bit hit-and-miss. Would it be better to mark such pages as "nofollow", and if so, how can we do that?
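For what it's worth, "nofollow" tells crawlers not to follow any links on the page, which is usually the opposite of what you want for search-result pages. The conventional directive for "don't index this page, but do follow its links" is noindex, follow. A sketch of what that tag might look like on the search-result templates only (keeping the existing noodp/noydir tokens; this is an illustration, not platform-specific advice):

```html
<!-- Hypothetical robots tag for search-result pages only:
     keeps the page out of the index while still letting the
     crawler follow product links it finds there. -->
<meta name="robots" content="noodp, noydir, noindex, follow" />
```

Note that a robots.txt Disallow is not a substitute here: it blocks crawling rather than indexing, so the meta tag is the more precise tool for duplicate-content warnings.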
Last edited by sdn (11-06-2017 04:34:59)
Really good SEO controls for pagination in search results are something I'd like to tackle at some point in K9. The issue you're pointing out isn't surprising, but it isn't really detrimental either. We'd rather let Google crawl the search results, if the bot wants to, and follow the links it finds there. A few duplicate-content warnings are outweighed by the potential exposure of everything else. That's the reasoning, anyway. Like I said, it's on the radar to handle programmatically in a future update.
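In the meantime, one common hand-rolled pattern for paginated result sets (not a built-in feature, just a general sketch) is to declare the pagination sequence in the page head so crawlers can consolidate the series. Using the page-32 URL from the original post, with count=18 the adjacent offsets would be 540 and 576 (offsets here are illustrative):

```html
<!-- Hypothetical <head> additions for one page of a result set,
     pointing at the previous and next pages in the sequence. -->
<link rel="prev" href="/index.php?app=ecom&ns=prodsearchp&ref=Speed+control&count=18&offset=540" />
<link rel="next" href="/index.php?app=ecom&ns=prodsearchp&ref=Speed+control&count=18&offset=576" />
```

A rel="canonical" link pointing at one preferred URL could similarly tell Google which of the two duplicate addresses (empty ref= versus ref=Speed+control) to treat as authoritative.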