Crawling / Query Creation Issues

Is your Audit query failing or taking too long to load? Let us guide you through troubleshooting!


If your query is stuck on loading, or you are seeing an "Oops! We couldn’t get data for this query. Try again in a few minutes." message, keep reading for potential causes and solutions.

Occasional stall

We always recommend simply rerunning your query with the same parameters first, to rule out the possibility that the issue was just an occasional stall.

In general, if your query has been loading for over 10 minutes, it most likely won't load at all and can be considered stalled. In that case, reach out to our Support team for a credit reimbursement.

If your query finishes with an error instead, your credit is reimbursed automatically, and the steps below should help you resolve the underlying issue.

Page performance

If an Audit query fails with an error, we always recommend checking your page's performance in Lighthouse.

If your page takes too long to load for our crawler, it is likely too slow for Googlebot as well, and Lighthouse will give you a detailed report on that.

If your Lighthouse report returns an error, takes a long time to generate, or shows a low performance score, that is likely what's causing the issue. In this case, we recommend reaching out to your web developer.
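
If you prefer a scriptable check to the Lighthouse UI, here is a minimal sketch using Google's public PageSpeed Insights API, which runs Lighthouse server-side. The page URL is a placeholder; replace it with the URL from your Audit.

```typescript
// Query Google's PageSpeed Insights API (it runs Lighthouse under the hood)
// and print the performance score. Requires Node 18+ for global fetch.
const pageUrl = "https://example.com/"; // placeholder: use your audited URL

const endpoint =
  "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=" +
  encodeURIComponent(pageUrl);

const response = await fetch(endpoint);
const data = await response.json();

// Lighthouse scores are reported as 0-1; multiply by 100 for the usual scale.
const score = data.lighthouseResult.categories.performance.score * 100;
console.log(`Performance score for ${pageUrl}: ${score}`);
```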

JavaScript redirects

Some URLs include scripts that force a redirect when users or bots visit them.

For regular users, this usually causes no issues. However, our bot's goal is to crawl your code, not just to view the page, so a broken redirect script confuses the crawler and results in a failed crawl.
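
For illustration, a client-side redirect often boils down to a snippet like the one below (the target URL is hypothetical). If such a script is broken, or fires only under certain conditions, the crawler can end up somewhere unexpected:

```typescript
// A typical client-side redirect: the page loads, the script runs,
// and the visitor (or bot) is immediately sent to another URL.
// The target below is a hypothetical example.
window.location.replace("https://example.com/welcome");
```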

Broken scripts in the code

This case is similar: if your code contains broken scripts, our crawler might not be able to read it properly. The page may look fine when you view it, but our bot may still be unable to analyze the code.

That's why it's important for SEO to ensure that your code isn’t bloated and doesn’t contain any broken scripts.

If your page contains improper code, whether within regular HTML tags, <script> tags, or even the Structured Data in the <head>, your webpage might not be crawlable as a result.
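
As a quick sanity check, you could pull your page's Structured Data and confirm it parses. Below is a minimal sketch; the URL is a placeholder and the regex-based extraction is deliberately simple, so treat it as a starting point rather than a full validator.

```typescript
// Fetch a page and check whether each JSON-LD <script> block parses.
// A block that throws here is exactly the kind of broken script a
// crawler can stumble over. Requires Node 18+ for global fetch.
const pageUrl = "https://example.com/"; // placeholder: use your page

const html = await (await fetch(pageUrl)).text();
const blocks =
  html.match(/<script type="application\/ld\+json">[\s\S]*?<\/script>/g) ?? [];

for (const [index, block] of blocks.entries()) {
  const json = block.replace(/<\/?script[^>]*>/g, "");
  try {
    JSON.parse(json);
    console.log(`JSON-LD block ${index + 1}: OK`);
  } catch (error) {
    console.error(`JSON-LD block ${index + 1}: broken`, error);
  }
}
```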

Crawler timeout

Our crawler can only spend a limited amount of time reading through your code. After all, it also needs time to crawl your competitors and complete other Audits.

If your page is long and heavy, takes a long time to load, or uses infinite scrolling, our crawler might run out of time.

If an Audit fails, we always recommend checking your performance in Lighthouse: if your page takes too long to load for our crawler, it might also be too slow for Googlebot. Unfortunately, for now, we don't have the option to audit pages with infinite scrolling, but you can always use our Domain Planner or Content Editor as a workaround.
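
If you want a rough, scriptable timing check before reaching for Lighthouse, a sketch like the one below can tell you whether your page's HTML even arrives within a given budget. The 10-second budget and the URL are illustrative assumptions, not Surfer's actual crawler limits.

```typescript
// Rough check: does the page's HTML respond within a time budget?
// The 10 s budget is an illustrative assumption, not Surfer's real limit.
const pageUrl = "https://example.com/"; // placeholder: use your page
const budgetMs = 10_000;

const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), budgetMs);

try {
  const started = Date.now();
  await fetch(pageUrl, { signal: controller.signal });
  console.log(`HTML responded in ${Date.now() - started} ms`);
} catch {
  console.error(`No response within ${budgetMs} ms; the page may be too slow to crawl`);
} finally {
  clearTimeout(timer);
}
```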

Elements blocked by Surfer's ad block

Our crawler uses an ad blocker when crawling pages. Some elements on your page, such as pop-ups, can trigger it, and as a result, your Audit might fail.

Pop-up leading to a redirect

Let’s say your page displays a welcome pop-up for visitors, and that pop-up then triggers a redirect. Since our crawler presents itself as a regular visitor, the redirect fires for it as well; the crawler gets confused and returns a failed Audit. As a first step, we’d recommend opening your page and checking what happens when you visit it as a regular viewer.
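
If you'd like to check this programmatically rather than by eye, a headless-browser sketch like the one below (using the Puppeteer library; the URL is a placeholder) can reveal whether the final URL differs from the one you requested, which is how a client-side redirect shows up:

```typescript
import puppeteer from "puppeteer";

// Load the page the way a real visitor would, then compare the final URL
// with the one we asked for; a mismatch means some redirect fired.
const requestedUrl = "https://example.com/"; // placeholder: use your page

const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.goto(requestedUrl, { waitUntil: "networkidle2" });

if (page.url() !== requestedUrl) {
  console.log(`Redirected: ${requestedUrl} -> ${page.url()}`);
} else {
  console.log("No redirect detected.");
}

await browser.close();
```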

The tips above can help you troubleshoot failed crawls, but it would be best to contact your web developer. They’ll know best what should be done to improve the performance of your page.

Do you still need help? Don't worry! You can contact us at [email protected] or via live chat by clicking the icon in the bottom-right corner.
