Jun 24, 2019 4:36 AM
We just released our new website on HubSpot, but unfortunately our pages can't be indexed by Google. When we try to view them as Googlebot in Search Console, we get a crawl anomaly (https://cl.ly/c8b189ec7594); it seems to be a 403.
I found this article about the issue, which says: "Googlebot: HubSpot does not allow the crawling of HubSpot pages from the Googlebot originating from non-Google IP addresses. If you attempt to crawl your HubSpot site as Googlebot, you will likely see a 403 error."
But I don't see any solution to this problem. What can I do to make sure my website can be indexed by Google? I can't even submit a sitemap in Search Console, as Google can't "fetch" it.
Thanks a lot,
Jun 24, 2019 5:08 AM
Jun 24, 2019 5:11 AM
Yes, sure, it's https://www.comet.co/. The homepage is not on HubSpot, but all the other pages are (https://www.comet.co/pourquoi-comet, https://www.comet.co/fonctionnalites, https://www.comet.co/offre-entreprises, https://www.comet.co/offre-startups, https://www.comet.co/tarifs ...).
I already checked the robots.txt on HubSpot, and I searched the source code for a noindex, but maybe I missed something (I hope so 🙂 )
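For anyone double-checking the same thing, here's a minimal Python sketch that scans a page's HTML for a robots noindex meta tag (the `has_noindex` helper is just for illustration, using only the standard library):

```python
from html.parser import HTMLParser


class NoindexFinder(HTMLParser):
    """Flags any <meta name="robots"> tag whose content includes 'noindex'."""

    def __init__(self):
        super().__init__()
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.found = True


def has_noindex(html):
    """Return True if the HTML contains a robots noindex meta tag."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.found
```

Feeding it the saved page source of each URL is a quick way to rule out a stray noindex tag.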
Thanks a lot,
Jun 24, 2019 6:09 AM
Hope that helps.
Jun 24, 2019 6:24 AM
Thank you very much for your fast answer Matthiew!
Googlebot is unable to read them (403 error when I run some crawler tests), so we can't request indexing, so the pages can't be indexed... The same goes for the HubSpot sitemap, which can't be submitted in Search Console (https://cl.ly/5c45ac335a6a).
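For reference, the 403 can be reproduced from a script by requesting a page with Googlebot's User-Agent string. This is a rough sketch using only the Python standard library (the `fetch_as_googlebot` helper is hypothetical; the UA string is the one Google publishes for Googlebot):

```python
import urllib.error
import urllib.request

# User-Agent string that Googlebot sends with its requests.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")


def fetch_as_googlebot(url):
    """Fetch a URL while presenting Googlebot's User-Agent, and
    return the HTTP status code (e.g. 200, or 403 if blocked)."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # A 403 (or any error status) lands here rather than raising further.
        return err.code
```

Note that, per the article quoted above, HubSpot also checks the originating IP, so a 403 from this kind of test is expected even when real Googlebot crawls fine; it only confirms the UA-based block, not a genuine indexing problem.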
Jun 24, 2019 6:35 AM
OK, good to know.
In that case, I can't see any other issue on your side that could be causing this. I'd recommend submitting a ticket to the HubSpot support team so they can check what is happening.