Crawl budget is too often disregarded by SEO “experts”. Maybe that’s because the most common articles you’ll read online come from people who work on sites with a few thousand pages, where the ROI isn’t as obvious as it is for larger sites tackling the same issue. Regardless of site size though, crawl budget is as important as, if not more important than, any other technical SEO task.
You don’t need to be an expert to reach a conclusion about which pages are useful for your website and which are not. There is little to no difference between what an engine would consider useful and what you or your users would.
Unfortunately, most people will tell you that the more pages you have indexed, the more chances you have of ranking for something, which in turn will increase your traffic and consequently your leads and all that mumbo jumbo. This couldn’t be further from the truth.
In today’s world, the more pages you have, the lower your chances of achieving higher rankings. The complexity you add for your users and for engines effectively “penalizes” your chances of getting where you wanted to go. It doesn’t matter whether the pages you add are easily crawlable, or whether you have thousands of links pointing to them from everywhere. What matters is that when you have so many pages that provide no value to users (other than targeting different keywords), Google eventually decreases your domain’s “authority”.
By reducing indexable pages you help engines identify, crawl and index pages that have greater value for you and your users.
A few examples that you might want to check on your site are as follows.
If you make use of tag pages, make sure you are making the best of them. Do you really need to index hundreds of thousands of them? Or will just a handful do?
When I perform technical audits, I always keep track of the tag pages that perform better organically and also generate revenue. For the rest of the tags, which usually comprise 99% of the total, I try to figure out whether I can reduce them further by de-duplicating or combining them.
Don’t be afraid to test adding a noindex tag. If you have hundreds of tags, pick 10–20 and add noindex to them. See for yourself whether that has a negative impact, and if not, move up to 100. Test, evaluate, and test again. This is what SEO is, after all.
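The sampling step above can be sketched in a few lines. This is a minimal illustration, assuming you can export your tag slugs from your CMS; the tag names and sample size here are placeholders, not a prescription.

```python
import random

# Hypothetical list of tag slugs exported from your CMS (placeholder names).
all_tags = [f"tag-{i}" for i in range(300)]

def pick_noindex_sample(tags, size=15, seed=42):
    """Pick a reproducible random sample of tags for a noindex test.

    A fixed seed keeps the sample stable between runs, so you can
    compare before/after performance for the same batch of tags.
    """
    rng = random.Random(seed)
    return rng.sample(tags, min(size, len(tags)))

test_batch = pick_noindex_sample(all_tags)
```

You would then add a noindex to just the pages in `test_batch`, wait a few weeks, and compare their (and the site’s) organic performance before expanding the experiment.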
Search pages are a big subject. You will hear people say to noindex them entirely, based on what Google used to say 10 years ago. You will also hear people fanatically tell you to let Google index everything. The truth lies somewhere in the middle.
Similarly to tag pages, search pages can be useful, but only if you are certain that the search pages you index are valuable to the user. Again, test and evaluate. Which pages make sense to be indexed? Which perform best with engines and with your users?
I usually try to calculate internal usage; correlate terms so I can surface similar search terms and automatically keep them indexed; track conversion rate per search; and calculate internal search trends so that search pages trending in a certain period get indexed automatically, which accounts for seasonality.
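Two of those signals, conversion rate per search term and a simple trend check, can be sketched as follows. This is an illustrative example under assumed data: the field names, thresholds, and the tiny `search_stats` dictionary are invented for the sketch, not a real analytics schema.

```python
from statistics import mean

# Hypothetical internal-search log aggregated per term: weekly search
# counts plus conversions. All field names here are assumptions.
search_stats = {
    "red sneakers": {"weekly_searches": [40, 42, 45, 90], "conversions": 12, "total": 217},
    "blue socks":   {"weekly_searches": [5, 4, 6, 5],     "conversions": 0,  "total": 20},
}

def should_index(stats, min_cr=0.02, trend_factor=1.5):
    """Keep a search page indexed if it converts, or if it is trending upward."""
    cr = stats["conversions"] / stats["total"] if stats["total"] else 0.0
    weeks = stats["weekly_searches"]
    # Compare the latest week against the average of the preceding weeks.
    baseline = mean(weeks[:-1]) if len(weeks) > 1 else 0.0
    trending = baseline > 0 and weeks[-1] >= trend_factor * baseline
    return cr >= min_cr or trending

to_index = [term for term, stats in search_stats.items() if should_index(stats)]
# to_index → ["red sneakers"]
```

In this toy data, “red sneakers” both converts and is trending, so it stays indexed; “blue socks” does neither, so it becomes a noindex candidate. In practice the thresholds would come from your own analytics baselines.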
A couple of quick examples:
If you are selling products and you have a price filter that creates crawlable pages such as “example.com/item?price=10min&20max”, then you need to make sure you noindex those pages.
The same goes for any kind of filter that creates an infinite URL space.
Another case I see very often is calendars that make every date a crawlable link.
Google explained the problem of infinite spaces more than a decade ago, and nothing has changed since.
There is an almost infinite number of cases and examples we could go through. Obviously, the more pages your website has, the more of these cases you will find. Once you have laid some foundation, you won’t have a problem keeping things tidy.
If you believe that you are in need of some support with your crawl budget optimization though please feel free to reach out for a free SEO audit.