In this article, you will learn about the most common issues that can occur when running an Audit on your page. We try to prepare our crawler for all circumstances, but there are some cases when Surfer won’t be able to crawl the audited URL. Below you’ll find the most common issues that cause failed crawls.

If an Audit fails, we always recommend checking your performance in Lighthouse: if your page takes too long to load for our crawler, it might also be too slow for Googlebot.

  1. JavaScript redirects

  2. Broken scripts in the code

  3. Crawler timeout

  4. Elements blocked by Surfer’s ad blocker

  5. Redirect triggered by a pop-up

1. JavaScript redirects

Some URLs have scripts in their code that force a redirect when users or bots visit them. For regular visitors this doesn’t cause any issues, but since our bot’s goal is to crawl your code, not just view the page, the unexpected redirect confuses our crawler and it returns a failed crawl.
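To illustrate, here is a minimal, hypothetical sketch of the kind of client-side redirect that can trip up a crawler; the URL and trigger are made up for the example.

```ts
// Hypothetical example: a script that redirects visitors as soon as the page loads.
// A human barely notices the jump, but a crawler that requested the original URL
// never gets to read the content it came for.
window.addEventListener("load", () => {
  window.location.href = "https://example.com/landing-page"; // placeholder URL
});
```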

2. Broken scripts in the code

This case is similar: if there are broken scripts in your code, our crawler might not be able to read it properly. When you look at your page you might not notice anything wrong, but our bot may still fail to analyze the code. It’s also important for SEO to make sure your code isn’t bloated and doesn’t contain any broken scripts.
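One quick way to spot broken scripts yourself is the browser’s DevTools console. As a rough sketch (not an official Surfer tool), you can also log uncaught script errors like this:

```ts
// A rough sketch: log any uncaught error coming from scripts on the page.
// Paste this into the browser console (or near the top of the page) and reload;
// broken scripts will show up here even if the page looks fine visually.
window.addEventListener("error", (event: ErrorEvent) => {
  console.warn(`Script error in ${event.filename}:${event.lineno}: ${event.message}`);
});
```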

3. Crawler timeout

Our crawler can only spend a limited amount of time reading through your code. After all, it also needs time to crawl your competitors and complete other Audits :) If your page is very long, takes a long time to load, or uses infinite scrolling, our crawler might run out of time.

As mentioned above, we recommend checking your performance in Lighthouse, since a page that loads too slowly for our crawler might also be too slow for Googlebot. Unfortunately, for now we don’t have an option to audit pages with infinite scrolling, but you can always use our Domain Planner or Content Editor as a workaround.
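If you want a quick first-pass check before running a full Lighthouse report, a rough sketch like the one below (the URL is a placeholder) measures how long a page takes to respond. It only covers the initial response, not rendering, so it is not a substitute for Lighthouse.

```ts
// A rough, hypothetical check: time how long the audited URL takes to respond.
// It only measures the initial response, not full rendering, so treat it as a
// quick sanity check rather than a replacement for a Lighthouse report.
async function timeResponse(url: string): Promise<void> {
  const start = Date.now();
  const response = await fetch(url);
  console.log(`${url} responded with status ${response.status} in ${Date.now() - start} ms`);
}

timeResponse("https://example.com/audited-page").catch(console.error); // placeholder URL
```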

4. Elements blocked by Surfer’s ad blocker

Our crawler uses an ad blocker when crawling pages. Some elements on your page, for example pop-ups, can trigger it, and as a result your Audit might fail.
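To make this concrete, here is a hypothetical pop-up element; the class names are invented for the example, but generic names like these are the kind of selectors many ad-block filter lists target, which is why an otherwise harmless pop-up can get caught.

```ts
// Hypothetical pop-up markup generated by a script. Generic class names such as
// "popup-overlay" or "ad-banner" match the broad rules many ad-block filter lists
// use, so an element like this can get hidden or blocked during the crawl even
// though it looks perfectly fine to a human visitor.
const overlay = document.createElement("div");
overlay.className = "popup-overlay ad-banner";
overlay.textContent = "Subscribe to our newsletter!";
document.body.appendChild(overlay);
```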

5. Redirect triggered by a pop-up

Let’s say your page displays a welcome pop-up for visitors, and that pop-up causes a redirect. Since our crawler presents itself as a regular visitor, the redirect is triggered for it as well. As a result, our crawler gets confused and returns a failed Audit. As a first step, we’d recommend opening your page and checking what happens when you visit it as a regular viewer.
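As a hypothetical sketch of what this can look like in code, a “first visit” check like the one below sends new visitors to a welcome page; the cookie name and URL are made up. Because our crawler arrives without any cookies, it is treated as a first-time visitor and gets redirected too.

```ts
// Hypothetical example: first-time visitors (no cookie yet) are redirected to a
// welcome page. The crawler arrives without any cookies, so it is treated as a
// new visitor every time and never reaches the page it was asked to audit.
if (!document.cookie.includes("visited=1")) {
  document.cookie = "visited=1; path=/; max-age=31536000"; // remember the visit for a year
  window.location.href = "https://example.com/welcome"; // placeholder URL
}
```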

The tips above can help you troubleshoot failed crawls, but if the problem persists, it’s best to contact your web developer. They’ll know what should be done to improve the performance of your page.
