Nov 18, 2021 8:05 AM - edited Nov 18, 2021 9:58 AM
Hi HS community 😉
Today I want to ask about your experience with crawl budget optimization on the HubSpot CMS. When I check the crawl report for our pages, I see that 52% of the crawls are JavaScript requests.
That's not good, because on a healthy website the majority of crawls should go to HTML files. To give you some examples, these are the paths to files that consume a lot of the crawl budget:
/_hcms/forms//embed/v3/form/
/hs/cta/ctas/v2/public/cs/loader-v2.js
What's your experience with optimizing this? Blocking these directories through robots.txt doesn't seem like the right solution.
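Just to illustrate what I mean, blocking those directory prefixes would look roughly like the sketch below (the prefixes are taken from the paths above); I'm including it only for context, not as a recommendation:

```text
# Illustration only – not a recommended fix
User-agent: *
Disallow: /_hcms/forms/
Disallow: /hs/cta/
```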
Nov 22, 2021 3:32 PM
Have you minified and compressed? You could also combine CSS and JS into a single file, which would result in fewer calls.
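To make the "combine into a single file" idea concrete, here is a minimal sketch of a local build step, assuming you bundle outside of HubSpot's own tooling. The file names are hypothetical, and a real setup would usually also run a minifier (e.g. terser) before compressing:

```python
# Minimal sketch: concatenate several JS files into one bundle and
# pre-compress it with gzip. File names are hypothetical; a real build
# would typically also minify before compressing.
import gzip
from pathlib import Path

SOURCES = ["menu.js", "forms.js", "tracking.js"]  # hypothetical source files
BUNDLE = Path("bundle.js")

# One combined file means the page makes a single JS request instead of three.
# The ";" separator guards against sources that omit a trailing semicolon.
combined = "\n;\n".join(Path(name).read_text(encoding="utf-8") for name in SOURCES)
BUNDLE.write_text(combined, encoding="utf-8")

# Write a pre-compressed copy so the server/CDN can respond with
# Content-Encoding: gzip without compressing on every request.
with gzip.open("bundle.js.gz", "wt", encoding="utf-8") as gz:
    gz.write(combined)

print(f"wrote {BUNDLE} and bundle.js.gz")
```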
Nov 19, 2021 8:57 AM
Hi @BCzerniakowski,
That's a great question! I wanted to tag a couple of subject matter experts to get their opinion on this topic:
Hi @JonPayne, @webdew, @Kevin-C, @Josh, any insights you could share with us? Thank you!
Cheers
Mia, Community Team