You may have heard the phrase “crawl budget” floating around in discussions of SEO, specifically how Google crawls your site and makes decisions about how to rank your pages.
We know that Google uses “bots” to scan your site and learn more about your content, but many marketing teams leave it at that. Why not dig deeper? Google’s system decides how often to scan your site with its bot via a “crawl budget.” The better your site is designed and the more you freshen up your content, the more often Google schedules your site to be crawled (numbers that can shift over time).
Generally you want your site to be crawled more often and kept on Google’s radar as much as possible, so let’s take a look at the best ways to help increase the “crawl budget” associated with your site.
This is a big one – Google only has so much patience, and if your site is slow or has pages that have difficulty loading, it will quickly lower its crawl rate. That’s good news for your visitors, because it keeps the site from getting clogged up by bots, but bad news for your ranking. Fix the problem by improving your site performance.
You should know the drill here: Take a look at your page loading times and coding, and find ways to speed things up by removing unnecessary content and updating the formats you are using. If you aren’t sure where to begin, we have a few ideas about what you can focus on.
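If you want a quick, rough read on how your pages respond before reaching for a full auditing tool, a few lines of Python can time a request. This is a minimal sketch using only the standard library; the two-second threshold is our own assumption, not a number Google publishes, so tune it for your site:

```python
import time
import urllib.request

def measure_response_time(url, timeout=10):
    """Fetch a URL and return (seconds elapsed, bytes received)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)

# Pages that take longer than a couple of seconds to respond are the
# ones most likely to eat into your crawl budget.
SLOW_THRESHOLD = 2.0  # seconds; an assumed cut-off, tune for your site

def is_slow(url):
    """Flag a page whose full fetch takes longer than the threshold."""
    elapsed, _ = measure_response_time(url)
    return elapsed > SLOW_THRESHOLD
```

Run it over a list of your most important URLs and start your speed work on whatever gets flagged.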
Bots also don’t like to encounter broken links or a ton of URLs that each have to be scanned one at a time. Instead, clean up your site so that it uses fewer individual URLs and smarter navigation. Concentrate your content in layered landing pages with more information per page, and cut down on multiple different URLs for every product – if possible.
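As a starting point for that cleanup, you can pull every link out of a page and check each one for HTTP errors. Here’s a minimal Python sketch using only the standard library; it only looks at anchor tags and doesn’t handle relative URLs or redirects, so treat it as a first pass rather than a full audit:

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href found in anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return the hrefs of all <a> tags in an HTML string, in order."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def link_is_broken(url):
    """Return True if the URL errors out (e.g. a 404) when fetched."""
    try:
        with urllib.request.urlopen(url, timeout=10):
            return False
    except (urllib.error.URLError, ValueError):
        return True
```

Feed `extract_links` the HTML of each page, then run `link_is_broken` over the absolute URLs it finds.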
Bots will also start giving your site less attention if they find a lot of poor content. Specifically, avoid content that copies text from other areas of your site, or is very similar to content you already have. Thin, low-quality content will also give the bots pause. It seems like this piece of advice gets repeated at least once a week, but it holds true: Focus on strong, valuable and well-written content that doesn’t try to cheat the system.
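One way to catch accidental near-duplicates before the bots do is a rough text-similarity pass. The sketch below uses Python’s standard difflib; the 0.8 “near duplicate” cut-off is an arbitrary assumption you’d tune for your own content:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0.0-1.0 similarity ratio between two blocks of text."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

# An assumed cut-off for "suspiciously similar"; adjust to taste.
NEAR_DUPLICATE = 0.8

def find_near_duplicates(pages):
    """pages: dict of URL -> page text. Returns pairs that look duplicated."""
    urls = list(pages)
    pairs = []
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            if similarity(pages[a], pages[b]) >= NEAR_DUPLICATE:
                pairs.append((a, b))
    return pairs
```

This compares every page against every other, so it’s fine for a small site but worth replacing with a smarter approach (hashing, shingling) if you have thousands of pages.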
While Google’s bot system has improved, it still has trouble dealing with some types of content, especially Flash, Silverlight, and the like. Heavy animation is bad news, because bots may have trouble crawling it, or it will at least take a long time, and that threatens your crawl budget. So focus on simple web design, cut down on unnecessary images, and generally aim for a very clean interface.
The Search Console is a handy tool to check in and see if you have any reported crawl errors from the latest bot investigations. In a perfect world you’d notice broken links and other problems before a bot finds them, but this stuff happens. Check in periodically to see if the bots have picked up on anything that you missed.
If the bots from various search engines notice that you are producing a lot of new content for them to check, they will increase your crawl budget. This also has direct ties to better SEO, because Google likes to see an active site that is engaging frequently with visitors. So keep up on that blog and don’t be afraid to update your product information when necessary.
There are a lot of strategies floating around about how to fool the Google bot or hide certain information from it. Remember, Google is constantly improving its bot capabilities, and not many of these tricks work anymore. Your time is far better spent on one of the other steps above.