The Importance of Recrawling Your Website
It’s recommended that you complete an SEO audit a few times per year (even though most companies manage only about one), but no matter how many audits you do, it’s important to keep recrawling your website. Crawling your website is a quick and easy way to spot flaws in your SEO setup, and flaws can appear at any moment throughout the year. To make sure you’re catching these flaws between the big, organized audits (you know, the kind with a spreadsheet), you have to recrawl regularly.
Unfortunately, many companies either skip this continuous recrawling or, worse, don’t know how to crawl their website for errors at all. The sooner you learn this little trick, the better, so let’s get right to it.
Top Reasons You Need to Recrawl Your Website
There are usually several ways to solve SEO and technical issues on a website, which can make things messy. A developer might see a clear path to solving one issue, but the fix could cause SEO problems they never anticipated. A few of these potential problems include:
- Hiding links instead of removing them.
For a development team, it is often easier to hide links than to go through and remove them all. If your development team doesn’t know much about SEO, this might not seem like a problem. Of course, we SEOs know that this is considered cloaking, a black hat tactic Google will definitely penalize.
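A recrawl can surface this automatically. Below is a minimal sketch, using only Python’s standard library, of a parser that flags links nested inside elements hidden with inline styles. It is an illustration only: a real check would also need to handle CSS classes, external stylesheets, and unclosed/void tags.

```python
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Flags <a> tags nested inside elements hidden via inline styles,
    a common way links get 'hidden' instead of removed.
    Simplified sketch: ignores CSS classes and void elements."""

    def __init__(self):
        super().__init__()
        self.stack = []          # one bool per open tag: does it hide content?
        self.hidden_links = []   # hrefs found inside hidden elements

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        style = a.get("style", "").replace(" ", "").lower()
        hidden = "display:none" in style or "visibility:hidden" in style
        if tag == "a" and (hidden or any(self.stack)):
            self.hidden_links.append(a.get("href"))
        self.stack.append(hidden)

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

finder = HiddenLinkFinder()
finder.feed('<div style="display:none"><a href="/old-page">old</a></div>'
            '<a href="/visible">ok</a>')
print(finder.hidden_links)  # only the link inside the hidden div
```

Running this over every page in a recrawl would give you a list of hidden links to hand back to the development team.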
- Two robots tags causing a noindex.
This one makes complete sense once you see it, but few people think about it during an initial audit because the fix can make everything look fine. You might find pages marked noindex that really should be indexed by Google, so you go through and remove the original meta robots tag that said “noindex.” Seems simple enough, right?
When you do your recrawl, however, you’ll notice if this particular change didn’t work. This is where you can dig deeper to see whether some pages carry two meta robots tags: one saying index and one saying noindex. Unfortunately, search engines always follow the robots tag with the most restrictive content values.
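The “most restrictive wins” rule above is easy to check for in a recrawl. Here is a hedged sketch in standard-library Python: it collects every meta robots tag on a page and resolves them the way search engines do, so a leftover noindex still shows up even when an index tag is also present.

```python
from html.parser import HTMLParser

class RobotsMetaCollector(HTMLParser):
    """Collects the content value of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def effective_indexing(directives):
    """Search engines honor the most restrictive directive:
    a single 'noindex' wins over any number of 'index' tags."""
    tokens = {t.strip() for d in directives for t in d.split(",")}
    return "noindex" if "noindex" in tokens else "index"

c = RobotsMetaCollector()
c.feed('<meta name="robots" content="index, follow">'
       '<meta name="robots" content="noindex">')
print(c.directives)                    # both tags were found
print(effective_indexing(c.directives))  # the noindex still wins
```

Any page where `directives` contains more than one entry is worth a manual look, even if the effective result happens to be what you wanted.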
- Rel Canonical and 404 page confusion.
This is where a lot of the confusion happens. You complete a site audit and then tell your developers to use the canonical URL tag to let the search engines know that all link juice should be passed to a particular version of your URL. It’s one line of code, your developers make the change, and it seems like you’re done.
If you didn’t do a recrawl, you wouldn’t be able to see whether the tag was implemented incorrectly. It might look fine in your code, but actually visiting the canonical URL could reveal a problem: it sends users (and search engines) to a 404 error page. This is usually caused by a case mismatch (lowercase vs. mixed case in the URL), which is probably the most common problem you’ll find.
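The case-mismatch check described above can be automated against the list of live URLs your recrawl produces. A minimal sketch (the function name and the idea of a precomputed set of confirmed-200 URLs are my own assumptions, not a specific tool’s API):

```python
def check_canonical(canonical_url, live_urls):
    """Compare a page's canonical URL against the set of URLs a recrawl
    confirmed return 200. Flags canonicals that differ from a live URL
    only by letter case, which often 404 on case-sensitive servers."""
    if canonical_url in live_urls:
        return "ok"
    case_matches = [u for u in live_urls if u.lower() == canonical_url.lower()]
    if case_matches:
        return (f"case mismatch: canonical is {canonical_url!r} "
                f"but the live page is {case_matches[0]!r}")
    return "canonical target not found in crawl, likely a 404"

live = {"https://example.com/products/widget"}
print(check_canonical("https://example.com/Products/Widget", live))
```

Run this for every canonical tag your crawler extracts and you get a short list of broken canonicals to hand to the developers.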
It’s clear that recrawling your site after you have made changes makes sense, but also remember that recrawling every few months is a good way to make sure you’re not missing anything. Your development and/or design team (or anyone in your company, really) could have made minor changes over the last few months, and a recrawl is the only way to make sure your site is still in the shape you left it after your last SEO audit.
How to Get Started Recrawling Your Website (Yes, It’s Up to You!)
Whenever we think “recrawl,” it sounds like something Google has to do for you, but recrawling is completely in your control. Most companies use a tool to complete a recrawl and spot any SEO issues that might have crept in, including the ones discussed above.
A few of the most popular tools include:
- Raven’s Site Auditor Tool. You can crawl up to 1,000 pages per website and up to 10,000 total pages per day. You should get results within 24 hours depending on the size of your recrawl, and the best part is that you can schedule a recrawl weekly or monthly so you never have to remember.
- Screaming Frog. This tool has a lot of power and can handle large sites. You can export your data to slice and dice later, and the tool is free for up to 500 URLs. Visit the site to learn about the other features it offers aside from recrawling.
- Request a Recrawl from Google. Although not necessarily a tool, you can also request a recrawl from Google, and it’s fairly simple: You visit the Webmaster Tools homepage, click “Fetch as Google,” type in the page you want to check and click Fetch.
In the end, the tools show you the problems with your website, but it’s up to you from there. Stay on the lookout, and if you see something that doesn’t look quite right (even if you’re not sure why), get your development team involved.
Do you have any personal experiences recrawling your website and discovering SEO issues? Do you have a great tool you use to recrawl your site? Let us know in the comments below.