Google Search Console: Sitemap Could Not Be Read
SOLVED | Dec 10, 2020 2:03 PM
Hello,
I am trying to submit my HubSpot blog and landing page sitemap https://info.magaya.com/sitemap.xml to Google Search Console, and I'm getting the following error:
- Couldn't Fetch
- General HTTP Error
Our main site is magaya.com (WordPress), and the blog and landing pages are on a subdomain hosted on HubSpot.
We have seen a slight decline in search volume since moving the blog to HubSpot, so I'm trying to get to the bottom of things.
Please advise.
Thanks!
Accepted Solutions
Dec 11, 2020 11:25 AM
Hmmm - very odd. Nice job on the 301s 🙂
I've looked at that sitemap from all angles and it's 200 OK all the way, so I'm leaning towards something glitchy happening.
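If you want to sanity-check it yourself, here's a minimal sketch (assuming Python 3 with the requests library installed) that fetches the sitemap with both a browser and a Googlebot user agent, since servers sometimes answer bots differently:

```python
# Minimal sketch: confirm the sitemap returns 200 OK and an XML content type.
# Assumes Python 3 with the `requests` library (pip install requests).
import requests

SITEMAP_URL = "https://info.magaya.com/sitemap.xml"

USER_AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(SITEMAP_URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name}: {resp.status_code} {resp.headers.get('Content-Type')}")
```

A healthy response is a 200 with an XML content type for both; anything else points at the server rather than a Google glitch.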
Before we get to fixing it for Google: if you haven't already, set up the main domain and the info subdomain in Bing Webmaster Tools (https://www.bing.com/toolbox/webmaster). Submit the sitemaps there and see if Bing also has problems fetching the info. subdomain one.
You still need to fix the problem for Google, but at least if Bing can fetch it, you know you did a good job and you're not going mad.
To begin the fix for Google - this is what I'd do.
1. Signpost the new sitemap on the main domain.
Add this line:
Sitemap: https://info.magaya.com/sitemap.xml
to the robots.txt file of your main domain: https://www.magaya.com/robots.txt
It's OK to reference two sitemaps in one robots.txt, and this signpost means that if the Google glitch fixes itself while you're not looking, Google will just crawl the new sitemap in the usual way without you having to resubmit manually.
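For illustration, the main domain's robots.txt would then reference both sitemaps, something like this (the User-agent block and the WordPress sitemap path are placeholders; keep whatever is already in your file):

```
User-agent: *
Disallow:

# Existing WordPress sitemap (placeholder path; keep your real one)
Sitemap: https://www.magaya.com/sitemap.xml
# New HubSpot subdomain sitemap
Sitemap: https://info.magaya.com/sitemap.xml
```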
2. Remove the odd formatting from the subdomain robots.txt + signpost again
Save a copy of https://info.magaya.com/robots.txt somewhere safe, then click to edit it. Follow these instructions if you've not done that in HubSpot before:
https://knowledge.hubspot.com/cos-general/customize-your-robots-txt-file
Delete this line (it's actually two Disallow directives crammed together, separated by a tab):
Disallow: /sample-* Disallow: /blog/sample-*
That stray tab might be making Googlebot do strange things.
Add this line:
Sitemap: https://info.magaya.com/sitemap.xml
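After those edits, the subdomain robots.txt would end up looking something like this (a sketch only; the User-agent block is a placeholder for whatever HubSpot already has in your file):

```
User-agent: *
Disallow:

Sitemap: https://info.magaya.com/sitemap.xml
```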
Check GSC in a couple of days to make sure it's not suddenly crawling loads of sample blog / sample landing / sample thank-you pages. Or better, delete any sample blogs etc., then you don't need those two Disallow lines anyway.
Give it a couple of days and resubmit that sitemap if Google hasn't already crawled it.
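If you'd rather resubmit programmatically than click through GSC, the Search Console API has a sitemaps.submit method. A minimal sketch, assuming google-api-python-client plus google-auth are installed and that service-account.json (a placeholder filename) is a service-account key with access to the property:

```python
# Minimal sketch: resubmit a sitemap via the Google Search Console API.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder key file
)

service = build("searchconsole", "v1", credentials=creds)

# siteUrl must match exactly how the property is registered in GSC.
service.sitemaps().submit(
    siteUrl="https://info.magaya.com/",
    feedpath="https://info.magaya.com/sitemap.xml",
).execute()
print("Sitemap resubmitted.")
```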
If this doesn't work, hit me up here: jon@noisylittlemonkey.com and I'll get someone more geeky to look at it. IT SHOULD BE EASY! 🙂