Google’s John Mueller recently explained that it doesn’t make sense to submit a URL for indexing when that URL redirects to another one.
This topic was covered in the latest #AskGoogleWebmasters video, in which Mueller answers a question about client-side JavaScript redirects.
Here is the question that was submitted:
“Can Google evergreen Chromium detect client-side JavaScript redirects? I’m not able to submit GSC indexing request to pages that have client-side JS redirect to a subscription page.”
In response, Mueller first went over what it means to have an evergreen Googlebot. It’s a fairly recent change that you can learn more about here.
Mueller addressed the redirect question, saying that Googlebot follows client-side redirects the same way it follows server-side redirects.
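For illustration, here is roughly what a client-side redirect looks like. This is a minimal sketch with a placeholder destination URL, not code from the video:

```html
<script>
  // Client-side redirect: once the page's JavaScript runs, the browser
  // (or the rendering step of evergreen Googlebot) navigates to the new URL.
  // "https://example.com/subscribe/" is a placeholder destination.
  window.location.replace("https://example.com/subscribe/");
</script>
```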
Despite Googlebot being able to follow client-side JS redirects, it still doesn’t make sense to submit a redirecting URL for indexing.
That’s true whether the redirect is client-side or server-side, implemented in JS or HTML.
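A server-side redirect, by contrast, is returned in the HTTP response before any HTML or JavaScript is rendered. A minimal sketch using Node.js with Express, chosen purely for illustration and not mentioned in the video:

```javascript
const express = require("express");
const app = express();

// Server-side redirect: respond with a 301 status and a Location header,
// so crawlers and browsers never render the old page at all.
// "/old-page" and the destination URL are placeholders.
app.get("/old-page", (req, res) => {
  res.redirect(301, "https://example.com/subscribe/");
});

app.listen(3000);
```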
A redirect sends a signal to Googlebot that the site owner would prefer to have a different URL indexed.
With that in mind, it makes more sense to submit the URL that should be indexed instead.
Another option is to make sure Google is able to discover the preferred URL. If it’s linked to within the website, Googlebot will discover it during its normal crawling process.
Mueller also added that using a sitemap file can help Google discover URLs faster.
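For reference, a sitemap is a plain XML file listing the URLs you want Google to find. A minimal example following the standard sitemap protocol, with a placeholder URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List the preferred (destination) URLs, not the ones that redirect. -->
  <url>
    <loc>https://example.com/subscribe/</loc>
  </url>
</urlset>
```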