If your Audit is stuck on loading or you are seeing an "Oops! We couldn’t get data for this query. Try again in a few minutes." prompt, keep reading for potential causes and solutions.
If your pages are geo-restricted, our crawlers won't be able to analyze your page content properly because they're blocked on a per-region basis. You can verify this by taking a screenshot in SERP Analyzer to check how our crawlers see your page and take appropriate measures from there. You can also ask our Support team for the name of a bot you can whitelist to address this issue. However, note that you can't whitelist us by IP address.
We always recommend simply rerunning your query with the same parameters to rule out an occasional stall. In general, if your query takes over 10 minutes to load, it will most likely not load at all and can be considered stalled. In this case, you will need to contact our Support team for assistance with a usage limit reimbursement. If your new query fails with an error, your limit is reimbursed automatically, and you may need to follow the next steps in this article to resolve it.
If an Audit query fails and throws an error, we always recommend checking your page's performance in Lighthouse. If your page takes too long to load for our crawler, it might also be too slow for Googlebot, and Lighthouse will provide you with a detailed report on that. If your Lighthouse report returns an error, takes a long time to load, or shows low page performance, that is likely what is causing the issue. In this case, we recommend reaching out to your web developer.
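If you prefer a programmatic check, a minimal sketch of running Lighthouse's performance category from Node.js might look like the following. This is not part of Surfer; it assumes Node 18+ with the lighthouse and chrome-launcher npm packages installed, and the URL is a placeholder.

```typescript
// Minimal sketch: run a Lighthouse performance audit from Node.js.
// Assumes the "lighthouse" and "chrome-launcher" npm packages are installed.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const url = 'https://example.com/your-audited-page'; // hypothetical URL

// Launch a headless Chrome instance for Lighthouse to drive.
const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

// Run only the performance category to keep the check quick.
const result = await lighthouse(url, { port: chrome.port, onlyCategories: ['performance'] });

// The score is reported on a 0–1 scale.
console.log('Performance score:', result?.lhr.categories.performance.score);

await chrome.kill();
```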
Chances are you'll also encounter this error message if you use branded keywords in your queries. This is because the SERPs could be dominated or saturated by that brand's domain. Surfer needs three to five unique domains within the top ten results to properly assess guidelines, so you may want to reconsider your target keyword.
Some URLs contain scripts that force redirects when users or bots visit them. For regular users, this doesn’t cause issues, but since our bot’s goal is to crawl your code, not just to view the page, broken scripts confuse our crawler and it returns a failed crawl.
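For illustration, here is a hypothetical example of the kind of script-driven redirect that a human visitor barely notices, but that can send a crawler somewhere unexpected (or nowhere at all if the script is broken):

```typescript
// Hypothetical in-page redirect script (illustrative only, not from any real site).
// A visitor barely notices it, but a crawler parsing the code can end up
// following the redirect instead of reading the page.
const isMobile = /Mobi/i.test(navigator.userAgent);

if (isMobile) {
  // Forces every mobile visitor, bots included, away from the audited URL.
  window.location.replace('https://m.example.com/landing'); // hypothetical target
}
```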
This case is similar: if you have broken scripts in your code, our crawler might not be able to read it properly. When you look at your page, you might not notice anything wrong, but our bot might not be able to analyze the code. That's why it's important for SEO to ensure that your code isn’t bloated and doesn’t contain any broken scripts. If your page contains improper code, whether within regular HTML tags, <script> tags, or even Structured Data in the <head>, your webpage might not be crawlable as a result.
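If you want a quick, rough way to confirm that the JSON-LD structured data on a page at least parses, a small sketch like the one below can help. It is not how our crawler works; it assumes Node 18+ (built-in fetch), and the URL is a placeholder.

```typescript
// Rough check: fetch a page and try to parse every JSON-LD block in its HTML.
const url = 'https://example.com/your-audited-page'; // hypothetical URL

const html = await (await fetch(url)).text();

// Grab each <script type="application/ld+json"> block with a simple regex.
const blocks = html.match(/<script[^>]*application\/ld\+json[^>]*>([\s\S]*?)<\/script>/gi) ?? [];

blocks.forEach((block, i) => {
  // Strip the surrounding <script> tags, then attempt to parse the JSON.
  const json = block.replace(/^<script[^>]*>/i, '').replace(/<\/script>$/i, '');
  try {
    JSON.parse(json);
    console.log(`JSON-LD block ${i + 1}: parses correctly`);
  } catch (err) {
    console.error(`JSON-LD block ${i + 1}: broken -`, (err as Error).message);
  }
});
```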
Our crawler can only spend a limited amount of time reading through your code; after all, it also needs time to crawl your competitors and complete other Audits. If your page is long and heavy, takes a long time to load, or has infinite scrolling set up, our crawler might run out of time. If an Audit fails, we always recommend checking your performance in Lighthouse: if your page takes too long to load for our crawler, it might also be too slow for Googlebot. Unfortunately, for now, we don’t have the option to audit pages with infinite scrolling, but you can always use our Domain Planner or Content Editor as a workaround.
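Before running a full Lighthouse report, a rough first signal is how long the raw HTML alone takes to download and how heavy it is. The sketch below is only illustrative (it does not reflect the crawler's actual limits); it assumes Node 18+ and a placeholder URL.

```typescript
// Rough timing check: how long the raw HTML takes to download and how large it is.
const url = 'https://example.com/your-audited-page'; // hypothetical URL

const start = Date.now();
const response = await fetch(url);
const body = await response.text();
const elapsedMs = Date.now() - start;

console.log(`Status ${response.status}, ~${Math.round(body.length / 1024)} KB of HTML in ${elapsedMs} ms`);
```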
Our crawler uses an ad blocker when crawling pages, and some elements on your page, such as pop-ups, can trigger it; as a result, your Audit might fail.
Let’s say your page displays a welcome pop-up for visitors, and it then causes a redirect on your page. Since our crawler presents itself as a regular visitor, a redirect will be triggered when it visits your page as well. As a result, our crawler gets confused and returns a failed Audit. As a first step, we’d recommend opening your page and checking what happens when you visit it as a regular viewer.
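If you'd rather check programmatically, a small sketch like the one below can reveal whether the URL answers with an HTTP redirect before any content is served. It assumes Node 18+ (built-in fetch) and a placeholder URL; script- or pop-up-driven redirects will still need a check in the browser.

```typescript
// Quick check: does the URL answer with an HTTP redirect before serving content?
const url = 'https://example.com/your-audited-page'; // hypothetical URL

// "manual" stops fetch from following redirects, so we can inspect the response.
const response = await fetch(url, { redirect: 'manual' });

if (response.status >= 300 && response.status < 400) {
  console.log(`HTTP redirect to: ${response.headers.get('location')}`);
} else {
  console.log(
    `No HTTP redirect (status ${response.status}). ` +
      'If the page still jumps elsewhere in a browser, a script or pop-up is likely triggering the redirect.'
  );
}
```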