One of our clients is a well-known jobs portal in the DACH region; they have around 100k jobs, with around 200k pages indexed in Google.
We plan to release a new job search feature that will result in millions of SERP pages being created.
These pages are created using:
a) Filtering logic in the sidebar, i.e. you can filter jobs by location (Bern) + industry (pharmaceutical) + career level (director), resulting in a SERP page for, say, the query “bern pharmaceutical director jobs”
b) A free-text search input, where we store the most-used terms in a sitemap and place links to them in a “similar searches” box.
We’ve created rules to try to follow/index only valuable pages, such as:
i) Place nofollow on links to SERP pages that have fewer than X results, and add noindex,follow in the head of those SERP pages.
ii) To limit the number of pages served to search engines, we will nofollow links if a user selects more than one facet in a taxonomy, i.e. the user selects two locations or two industries.
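For clarity, a minimal sketch of what rule (i) looks like in markup (the URL and anchor text are illustrative, not our real paths):

```html
<!-- Link to a thin SERP page (fewer than X results): nofollow the link -->
<a href="/jobs/bern/pharmaceutical/director" rel="nofollow">Director jobs in Bern, pharmaceutical</a>

<!-- In the <head> of that thin SERP page itself:
     keep it out of the index, but let the bot follow its outgoing links -->
<meta name="robots" content="noindex,follow">
```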
We are hoping that these measures will:
A) Help us avoid any thin-content penalty, and also reduce the bad user experience when a user comes to our site from a search engine and lands on a result page with no value.
B) Stop the crawler from indexing low-value pages while still indexing our priority pages (we use sitemaps with priority numbers, but want to make sure there is another on-page element that signals this). The “low value” pages will also sit deeper in our website, and as we use a well-designed breadcrumb, links to them should be minimal.
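For reference, the sitemap priority numbers mentioned in (B) look like this — URLs are illustrative placeholders, and priority is only a hint to crawlers, not a directive:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- High-priority top-level category page -->
  <url>
    <loc>https://example.com/jobs/bern</loc>
    <priority>0.9</priority>
  </url>
  <!-- Lower-priority combined-facet SERP page -->
  <url>
    <loc>https://example.com/jobs/bern/pharmaceutical/director</loc>
    <priority>0.3</priority>
  </url>
</urlset>
```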
Now, coming to my questions:
Question 01: Should we nofollow internal links to SERP pages with little value, and noindex those pages?
Should I nofollow the links that lead to low-quality pages? If I do, will this stop the search bot from wasting its time on those pages? (I’m worried that the bot would go down a bad path and fail to index our other important content.) And is there a risk that your search algorithm won’t like this, since it could look like we are trying to hide something?
An alternative solution could be to use rel=canonical to point the thin page to a related, more valuable content page.
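In markup, that alternative would be a single element in the head of the thin SERP page (URL illustrative):

```html
<!-- In the <head> of the thin SERP page,
     pointing at the broader, valuable page that should rank instead -->
<link rel="canonical" href="https://example.com/jobs/bern/pharmaceutical">
```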
Question 02: Should a website worry about how the search bot navigates through the site, or only worry about the user journey?
Should we actually care about this, or trust that your search bot is intelligent enough not to get stuck in said funnel, and that it will follow our sitemap priorities?
Question 03: As we will be releasing considerably more pages over the coming months, how do you recommend the release take place?
i.e. releasing it slowly on an exponential basis? releasing it all at once? releasing only the top levels first?