I’ve consulted with hundreds of businesses, and I’ve answered thousands of SEO questions.
When I sit down with a new client, I always ask about their goals.
Most of the time, I hear the same ones:
“I want to optimize my content for keywords.”
“I want to rank higher on Google.”
“I want to drive more traffic to my site.”
But there’s a much more fundamental need that every SEO has to address.
Think about it. What is SEO?
It’s an obvious question, but bear with me.
SEO is optimizing your site for search engines, right? That means it’s important to consider how search engines interact with your site.
This is a building block of SEO, but it doesn’t get as much air time as keywords or rankings even though it should.
There are many factors that affect how search engines treat your content.
Crawl budget is one of those factors.
If you understand crawl budget, you can use some techniques to make sure your site gets crawled effectively.
That means you’ll see better SERP rankings and better SEO overall.
Don’t worry: there’s nothing too technical here. It’s easy for anyone to pick up, and it can give you a serious SEO advantage.
What is crawl budget?
In the past few years, the term “crawl budget” made its rounds in SEO circles. Lots of people speculated about what exactly it meant, but there was never an official definition.
Until January 2017, that is.
That’s when Google’s Gary Illyes published a blog post that spelled out exactly what crawl budget is.
According to Illyes, what people call “crawl budget” actually consists of two parts in Google’s eyes.
Those two parts are crawl rate limit and crawl demand.
Here’s the idea behind the crawl rate limit: Google doesn’t want to overload your website by crawling it too heavily.
Crawl rate limit stops Googlebot from making too many requests and slowing down your site.
If your site responds to Googlebot quickly, the limit goes up and Googlebot will crawl more. If your site slows down or starts returning server errors, the limit goes down and Googlebot crawls less.
Got it? Awesome.
Now let’s look at crawl demand, which is how much Google wants to crawl your URLs.
This has huge implications for SEO. Google is saying that popular content and fresh content have a higher crawl demand (and we’ll talk about this later on).
Google sums everything up to give a definition for crawl budget:
“Taking crawl rate and crawl demand together we define crawl budget as the number of URLs Googlebot can and wants to crawl.”
Pretty simple, right?
Ideally, you want all of your pages to get crawled, and you want Googlebot to want to crawl your site.
So now it’s time to address the question that’s on your mind: How do you leverage crawl budget to get better SEO?
It all comes down to giving Google exactly what it wants.
Google has one mission: to provide the best search experience for users.
If you can help them with that mission, you’ll be rewarded.
Let’s look at how exactly you can do that.
Improving site speed
Site speed is crucial for good SEO, and that’s true for crawl budget too.
It makes sense that if your site is faster and performs better overall, it’ll be able to handle more requests from Googlebot and human users at the same time.
Google notes this at the bottom of its blog post on crawl budget: making a site faster improves the user experience while also increasing the crawl rate.
So if you speed up your site, your site’s crawl budget should increase.
But before you do anything about it, you should test your site’s speed.
I use the Pingdom speed test.
You can test your speed from four locations. I recommend using all of them to get an idea of how users from around the world are experiencing your site.
This may take anywhere from several seconds to a couple of minutes depending on the size of your site.
Once it’s finished, you’ll see a helpful summary of the results.
This will give you a good idea of how your site stacks up.
In particular, your load time and number of requests are important.
Your load time should be as short as possible, ideally in the range of 500 milliseconds to 2 seconds.
Any longer than that and you run the risk of Google paying less attention to your site.
Google’s John Mueller has said that if a page on your site takes over 2 seconds to load, Google may crawl your site more slowly.
2 seconds is pretty fast, right? For Google, it’s not fast enough.
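If you want a quick spot check between full Pingdom runs, a few lines of Python can time your server’s responses. Here’s a minimal sketch using the requests library (the URLs are placeholders; swap in your own pages):

```python
# A minimal response-time spot check using the third-party "requests"
# library (pip install requests). Note: response.elapsed measures time
# to the server's response, not the full page load a tool like Pingdom
# reports, so treat it as a rough lower bound.
import requests

URLS = [
    "https://example.com/",       # placeholders -- use your own pages
    "https://example.com/blog/",
]

for url in URLS:
    response = requests.get(url, timeout=10)
    seconds = response.elapsed.total_seconds()
    verdict = "OK" if seconds <= 2.0 else "SLOW: may reduce crawl rate"
    print(f"{url}: {seconds:.2f}s ({verdict})")
```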
Your site should also have as few requests as possible.
The fewer requests each page makes, the faster Googlebot can fetch your pages, and the more of your site it can crawl with the same resources.
So how few is good enough?
Web performance expert Steve Souders tweeted that the average number of requests is 99.
If you’re under that mark, great. And if you’re in the 0-50 range, that’s even better.
You can see by now how closely related site speed is to crawl budget. But not every site is lightning fast.
If your site is slower than a snail, don’t worry. There are some easy steps you can take to speed it up.
Here’s what I recommend.
1. Declutter your design
This might be a tough one.
It’s 2017, and everybody’s trying to get the trendiest designs possible.
While lots of designs certainly look nice, they don’t always perform well for users.
Take BuzzFeed, for instance.
Definitely trendy.
But is it functional? Not so much.
The numbers speak for themselves:
It’s just under that 2-second danger zone, and it has a whopping 222 requests.
Ouch.
I’m not saying you can’t have good design and high performance.
But I am saying that not every nice design is worth it.
If you find your site is slower than you’d like, try simplifying. You can still have a great design, and in fact, you can often keep your current design. You just have to make a few adjustments.
Start by enabling compression. This will reduce the bandwidth your pages take up.
Gzip is one of the best options for this.
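Enabling gzip is usually a small change in your server configuration (both Apache and nginx support it). If you’re curious how much bandwidth it can save, here’s a rough sketch that compresses a page the way a server would (the URL is a placeholder):

```python
# Rough illustration of gzip savings on a single page, using only the
# Python standard library. The URL is a placeholder. Compression itself
# should be enabled in your server config (e.g., gzip in nginx or
# mod_deflate in Apache), not in application code like this.
import gzip
import urllib.request

url = "https://example.com/"
html = urllib.request.urlopen(url).read()
compressed = gzip.compress(html)

savings = 100 * (1 - len(compressed) / len(html))
print(f"Uncompressed: {len(html):,} bytes")
print(f"Gzipped:      {len(compressed):,} bytes")
print(f"Savings:      {savings:.0f}%")
```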
Next, try reducing your image sizes or compressing them.
Huge images can be an absolute killer of speed. They can easily cause your site to take way longer than 2 seconds to load.
I recommend resizing your images first, although running them through a compressor is the more convenient option.
Compressor.io will do this for you in a flash.
And if you have any unnecessary images, I suggest removing them entirely.
When you add new images, try to limit the file size if you can.
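If you have a whole folder of images to slim down, you can script the process too. Here’s a rough sketch using the Pillow library (the folder name, width limit, and quality setting are just starting points to adjust for your own site):

```python
# Batch-shrink JPEGs with the Pillow library (pip install Pillow).
# The folder name, max width, and quality value are assumptions --
# adjust them for your own site.
from pathlib import Path
from PIL import Image

MAX_WIDTH = 1200  # resize anything wider than this
QUALITY = 80      # JPEG quality; lower means smaller files

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        new_height = round(img.height * MAX_WIDTH / img.width)
        img = img.resize((MAX_WIDTH, new_height))
    img.save(path, optimize=True, quality=QUALITY)
    print(f"Optimized {path} ({img.width}x{img.height})")
```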
Finally, make some adjustments to your code if you’re able to.
Excess code can weigh down a website like an elephant. It not only causes a poor user experience but also forces your site to take longer to respond.
If you’re a programmer or know a developer, it’s worth your time to improve your code.
You want to do two things: 1) Make JavaScript asynchronous and 2) combine as many CSS files as possible.
That may sound foreign to you, but these steps just streamline how your code loads.
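To make that concrete: making JavaScript asynchronous usually just means adding the async attribute to your script tags so they don’t block rendering, and combining CSS files can be as simple as concatenating them. Here’s a minimal sketch of the CSS side (the file names are placeholders):

```python
# Concatenate stylesheets into a single bundle so browsers make one
# request instead of several. File names are placeholders. Keep the
# order you'd normally load them in, since later CSS rules override
# earlier ones.
from pathlib import Path

css_files = ["reset.css", "layout.css", "theme.css"]  # your stylesheets
bundle = "\n".join(Path(name).read_text() for name in css_files)
Path("bundle.css").write_text(bundle)
print(f"Combined {len(css_files)} files into bundle.css")
```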
So remember, when it comes to design, you want a harmonious balance of fashion and function. Don’t let appearance make you forget about performance.
All of this will help you increase your crawl budget and allow Googlebot to send more requests. This will make sure your site is crawled more often.
2. Watch out for crawl errors
Here’s another tip from Google’s blog post: a significant number of server errors or connection timeouts will slow crawling down.
Crawl errors happen when Googlebot is crawling your site and runs into an issue.
Obviously, you want 0 crawl errors, but how do you monitor them?
You might not know that Google will actually inform you of any errors it encounters while crawling your site.
It’s a feature of one of my favorite tools, Google Search Console.
To get to the Crawl Errors section, open Search Console and select the property you want to analyze.
In the sidebar on the left side, click on Crawl.
Then click on “Crawl Errors.”
You’ll see the Crawl Errors dashboard.
There’s a graph that will show you a visual overview of how many errors you have.
At the bottom, there’s a list that specifies the URLs that are returning crawl errors (if you have any).
There are two common types of errors: site errors and URL errors.
For our purposes, we want to look out for server errors, since that’s what Google warns against in their blog post.
If you have server errors, they’ll show up in this report.
Google has more detailed information on server errors in its Search Console help documentation.
Usually, these errors mean that your site is taking too long to respond. Your server might be overloaded or even temporarily down.
To help diagnose these errors, Google recommends using the Fetch as Google tool found in Search Console.
To get there, go to the sidebar in Search Console and click on Crawl, then on Fetch as Google.
Try fetching your homepage by leaving the box blank and clicking “Fetch.” (You should also try “Fetch and Render” separately to cover all the bases.)
If Fetch as Google returns your homepage without any errors, then you can probably assume Google can access your site okay. That’s a good sign, and the error may go away.
If errors persist or you can’t fetch your site, look into fixing any server errors. You might need to speak to an expert about this, but it’s well worth it since server errors can be deadly.
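If you’d rather catch server errors before Google does, you can spot-check your most important URLs with a short script. Here’s a minimal sketch using Python’s requests library (the URLs are placeholders; in practice you might pull them from your sitemap):

```python
# Spot-check key URLs for server errors (5xx), which Google says can
# slow crawling down. URLs are placeholders. Requires the "requests"
# library (pip install requests).
import requests

URLS = [
    "https://example.com/",
    "https://example.com/about/",
]

for url in URLS:
    try:
        status = requests.get(url, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if status >= 500:
        print(f"{url}: SERVER ERROR {status}")
    elif status == 404:
        print(f"{url}: not found (404)")
    else:
        print(f"{url}: {status}")
```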
3. Clean up your low-value URLs
In that blog post, Google specifically calls out “low-value URLs.” These “can negatively affect a site’s crawling and indexing.”
It makes sense. Google wants high-quality content: the best of the best. Low-value URLs act as filler and by definition aren’t high quality at all.
These low-quality pages can include things like soft error pages, on-site duplicate content, hacked pages, infinite spaces, and low-quality or spam content.
There are a few simple rules you can follow to avoid low-value URLs like these.
First, make sure all of your pages are working correctly.
You want as few 404 pages (especially soft 404s) as possible, and you should only use redirects when you absolutely need to.
Second, avoid duplicate content.
You probably already know that Google will slap you on the wrist for duplicate content, but it can also negatively affect your crawl budget.
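If you want a quick way to catch exact duplicates, you can hash each page’s HTML and group URLs that match. This is a crude first pass rather than a complete audit, since many duplicates are near-duplicates (the URLs below are placeholders):

```python
# Crude duplicate-content check: hash each page's HTML and group URLs
# with identical hashes. Near-duplicates won't match, so treat this as
# a first pass only. URLs are placeholders; requires "requests".
import hashlib
from collections import defaultdict

import requests

URLS = [
    "https://example.com/page-a/",
    "https://example.com/page-b/",
]

groups = defaultdict(list)
for url in URLS:
    body = requests.get(url, timeout=10).text
    digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
    groups[digest].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Identical content:", ", ".join(urls))
```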
Third, avoid infinite spaces.
These happen when links go on and on. Google gives the example of a calendar with a “Next Month” link. Googlebot could theoretically follow those links forever.
As Google puts it, “crawling those URLs may use unnecessary bandwidth and could result in Googlebot failing to completely index the real content on your site.”
See how dangerous this can be? It’s something you want to avoid at all costs.
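If you have server logs or a crawl export handy, one way to hunt for infinite spaces is to collapse the numbers in your URLs into a single pattern and count how many URLs share it. Here’s a rough sketch (the sample URLs and threshold are just for illustration):

```python
# Hunt for potential infinite spaces by collapsing digits in URL paths
# into "N" and counting URLs per pattern. A calendar, for example,
# shows up as /calendar/N/N/ repeated over and over. The sample URLs
# and the threshold are only for illustration -- feed in your own
# crawl or log data and raise the threshold accordingly.
import re
from collections import Counter

crawled_urls = [
    "/calendar/2017/01/", "/calendar/2017/02/", "/calendar/2017/03/",
    "/blog/crawl-budget/",
]

patterns = Counter(re.sub(r"\d+", "N", url) for url in crawled_urls)
for pattern, count in patterns.most_common():
    if count >= 3:  # raise this threshold for real log data
        print(f"Possible infinite space: {pattern} ({count} URLs)")
```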
In short, make sure every URL on your site is contributing in some way. Don’t resort to black hat SEO or other shady tactics.
Increase your popularity
Every site wants to be popular. It’s part of all SEO strategies.
As it turns out, popularity is important to Google too.
Check out this little snippet from Google: URLs that are more popular on the internet tend to be crawled more often to keep them fresher in Google’s index.
So even though it’s not that obvious at first, a good content marketing strategy can increase your crawl budget.
You want people to be sharing your content like crazy. The more links to your site, the more Google will crawl you.
A good backlink strategy can help you get more popular. And of course, if you can create a sharing frenzy on social media, you’ll be golden.
Google will recognize your site as an authority, and you’ll get a higher crawl budget.
But in order to get popular, you first need to…
Keep everything fresh
Another way to increase your crawl budget is to give Google what it wants: fresh content.
This goes hand in hand with popularity. The more fresh, comprehensive content you put out, the more people will share it, and the bigger your crawl budget will get.
There are a few different ways of refreshing your site.
The first way is to create new content. I know that’s obvious, but I have to mention it because it’s the most effective way of keeping your site fresh.
But this freshness can actually decrease over time. In other words, Google gives content an initial boost when it’s first published, but that boost fades as the content ages.
That’s why it’s important to keep adding fresh content and make it a regular part of your content strategy.
The second way is to modify existing content. If Google notices a change in your content, it will likely recrawl your site.
If it finds improvements in your content, Googlebot will give you a higher crawl budget.
The third way is to add new pages. Google operates on the logic that popular sites will grow, and a good sign of growth is the addition of entirely new pages.
But you shouldn’t add new pages just to add new pages. Google still looks for quality first and foremost.
The fourth way is to get links from fresh sites. You probably know that backlinks pass on link juice to your site.
If you get links from sites that Google considers fresh, then some of that freshness passes on to your site.
This should be a part of your larger link building strategy. Target other sites that are constantly updated, and that will help you work toward a bigger crawl budget.
Conclusion
Crawl budget isn’t talked about a lot, but it’s at the core of SEO.
It makes you think about what your site is contributing to the Internet.
It’s also a brutally honest metric. You can’t take shortcuts to increase your crawl budget.
You have to create awesome content and let everyone know about it.
After all, Google has its users in mind. What do Google users want? The best and most relevant results.
By extension, Google wants to give bigger crawl budgets to sites that deliver those results.
It wants users to find the best content, so if it sees sites doing things the right way, it will reward those sites.
On the other hand, if your site isn’t contributing much, then you’re going to have a hard time getting a bigger crawl budget.
At the end of the day, crawl budget is all about quality. And that’s what you should be focusing on.
So by all means, use these techniques to up your crawl budget. But never forget to bring something to the table.
How are you going to increase your crawl budget? What tips do you have to get a site crawled more?