There has been a lot of chatter about using reporting services like SEMrush and Moz, which will give you site audits. Usually, these reports return information such as errors and issues. One of the major errors I've seen reported on a lot of HubSpot sites is duplication errors for things such as:
The first bullet, as mentioned, can be fixed using the URL mapping tool and HubL; the last two, however, we have zero control over. I know we can use JS to change the titles and metas for the listing pages, and Google should be OK with picking up those differences on the page, but doing this is tedious. A much better way would be to give us a "Page Title Builder" in the SEO & Crawlers tab. This would let us fix page titles and have them render server-side rather than relying on JS and hoping the changes get picked up during the next Google crawl.
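For anyone stuck with the JS route in the meantime, here's a rough sketch of what I mean. The "/page/N" URL pattern and the selectors are assumptions about a typical blog listing setup, not anything HubSpot-specific, so adjust them to match your own URLs:

```javascript
// Client-side workaround: append the page number to the <title> and meta
// description on paginated blog listing pages so audit tools stop flagging
// them as duplicates. The "/page/N" pattern is an assumption about the
// listing URLs, not a documented HubSpot behavior.
document.addEventListener('DOMContentLoaded', function () {
  var match = window.location.pathname.match(/\/page\/(\d+)\/?$/);
  if (!match) {
    return; // not a paginated listing page, leave the tags alone
  }

  var pageNum = match[1];

  // De-duplicate the title tag
  document.title = document.title + ' - Page ' + pageNum;

  // De-duplicate the meta description, if the template outputs one
  var meta = document.querySelector('meta[name="description"]');
  if (meta) {
    meta.setAttribute('content', meta.getAttribute('content') + ' (Page ' + pageNum + ')');
  }
});
```

Of course, this only helps if Google re-renders the page with JS, which is exactly why a server-side option would be better.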
I have created a mockup below (based on the Yoast SEO WordPress plugin) to better explain a possible solution:
Also, I think it would be awesome to have description fields added to blog topics so we can create custom meta descriptions based on those topics, rather than having to rely on the default blog meta description or one that Google pulls automatically (which is usually the post excerpt of the most recent post on that page).
This issue remains one that we plan to fix, but since we don't currently have a clear timeline for the development, I'm going to mark it as 'being reviewed'. I will post another update when we have more to share.
- Snaedis
Hey everyone,
I've been chatting with AJ and a few other folks about this, and wanted to update this idea.
The quick answer is that we know that the inability to edit listing pages is a pain point and we're looking to solve for it long term. We plan on making any page editable in the CMS, but getting there is unfortunately non-trivial.
From my research, I've only seen this picked up by third-party tools like Moz and Screaming Frog. @ravenousblue, I'll shoot you an email to see what you're seeing in Google Search Console. My impression has been that this is annoying (clients see errors in other tools) but not actually impacting SEO strongly (Google understands these are listing pages from the rel canonicals).
I believe SEMrush now also correctly ignores listing pages in its duplicate content checks.
Thanks for your patience as we fix this the right way!