Here are a couple of workflows that can benefit from Surfer API integration via your custom app or third-party solutions like Zapier or Bubble.io:
Creating multiple queries and sharing access links
Via API, you can automate the creation of queries for Content Editor, Audit, and SERP Analyzer. Then, based on our responses, you can create web-app access links.
For example, you might have a Google Sheet with a list of keywords you want to rank for, and you want to create new articles using Content Editor.
In such an instance, you could:
Format your sheet so that each row contains ranking keywords and the location of your target audience.
For each row, extract keywords and location data and send them in using POST request to /api/v1/content_editors endpoint.
Tip: Remember that keywords should always be an array, even if you aim to optimize for only one keyword.
In response, you will get:
{
"state": "scheduled",
"permalink_hash": "kKi7n3pkRk7Gw5cxKDiBAbCAybnDTt2z",
"id": 5632898
}
You can now create query access links by:
joining the "https://app.surferseo.com/drafts/" + "5632898" strings - this results in a private link that the account owner or organization members can access
joining the "https://app.surferseo.com/drafts/s/" + "kKi7n3pkRk7Gw5cxKDiBAbCAybnDTt2z" strings - this results in a public share link that anyone can access.
Share them with your team and external writers, or paste them in the original sheet you used for keyword input - whatever your preferred workflow requires.
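The steps above can be sketched in Python using only the standard library. Note that the API-KEY header name and the exact payload shape are assumptions for illustration; check your API credentials and documentation before using this:

```python
import json
import urllib.request

API_KEY = "your-api-key"  # placeholder; the API-KEY header name is an assumption
BASE = "https://app.surferseo.com/api/v1"

def create_content_editor(keywords, location):
    """POST a new Content Editor query. keywords must be a list, even for one keyword."""
    body = json.dumps({"keywords": keywords, "location": location}).encode()
    req = urllib.request.Request(
        f"{BASE}/content_editors",
        data=body,
        headers={"Content-Type": "application/json", "API-KEY": API_KEY},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # e.g. {"state": "scheduled", "permalink_hash": "...", "id": 5632898}
        return json.load(resp)

def build_links(response):
    """Turn a creation response into a private link and a public share link."""
    private = f"https://app.surferseo.com/drafts/{response['id']}"
    public = f"https://app.surferseo.com/drafts/s/{response['permalink_hash']}"
    return private, public
```

You would call create_content_editor once per sheet row, then write the links from build_links back into the sheet.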
Publishing written content to your CMS
Via API, you can get a clean HTML version of your Content Editor draft. You can then send it to Google Docs or your CMS via custom integration.
To get Editor query content, all you have to do is:
Locate the "id" of the Content Editor query you want to download content from.
Tip: If you didn't store the ID of the query anywhere, just find it via the web app and copy the number at the end of the URL: https://app.surferseo.com/drafts/5632898
Send a GET request to the /api/v1/content_editors/:id/content endpoint, specifying the "id" you found, here /api/v1/content_editors/5632898/content
Store the response.
Depending on your target text editor or CMS, you might need to parse the HTML we serve. For example, WordPress and Shopify accept strings with HTML tags as "content", so no parsing or adjustments are necessary.
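A minimal sketch of this workflow, assuming the same API-KEY header as elsewhere (the header name is an assumption; check your API docs):

```python
import urllib.request

API_KEY = "your-api-key"  # placeholder; the API-KEY header name is an assumption
BASE = "https://app.surferseo.com/api/v1"

def editor_id_from_url(url):
    """Recover a query id from a web-app draft link, e.g. .../drafts/5632898."""
    return int(url.rstrip("/").rsplit("/", 1)[-1])

def fetch_draft_html(editor_id):
    """GET the clean HTML of a Content Editor draft."""
    req = urllib.request.Request(
        f"{BASE}/content_editors/{editor_id}/content",
        headers={"API-KEY": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        # HTML string, usable directly as "content" in WordPress or Shopify
        return resp.read().decode()
```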
Getting Content Score for a specific URL
Using Audit endpoints, you can get Content Score for the URL you pick and up to 5 competitors we select automatically.
To do this:
Create a new query via POST request to /api/v1/audits endpoint.
Store "id" from our response:
{
"state": "scheduled",
"id": 767429
}
Set a timeout for a couple of minutes - with a fast-loading audited page and regular Surfer traffic, a query should take 5-10 minutes.
Send a GET request using /api/v1/audits/:id endpoint, here /api/v1/audits/767429
If "state" is still "scheduled," rerun the GET request in a couple of minutes.
In response, you should ultimately get:
{
"state": "completed",
"id": 482473,
"competitors_pages": [
{
"url": "https://www.competitor1.com/",
"content_score": 64
},
{
"url": "https://www.competitor2.com/",
"content_score": 49
},
{
"url": "https://www.competitor3.com/",
"content_score": 55
},
{
"url": "https://www.competitor4.com/",
"content_score": 78
},
{
"url": "https://www.competitor5.com/",
"content_score": 48
}
],
"audited_page": {
"url": "https://www.auditedURL.com/",
"content_score": 81
}
}
The audited URL's Content Score is available under audited_page.content_score.
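The polling loop and score extraction described above can be sketched as follows. The API-KEY header name is an assumption; the response fields match the examples shown in this article:

```python
import json
import time
import urllib.request

API_KEY = "your-api-key"  # placeholder; the API-KEY header name is an assumption
BASE = "https://app.surferseo.com/api/v1"

def wait_for_audit(audit_id, poll_seconds=120, max_attempts=10):
    """Poll /audits/:id until the query completes (typically 5-10 minutes)."""
    for _ in range(max_attempts):
        req = urllib.request.Request(f"{BASE}/audits/{audit_id}",
                                     headers={"API-KEY": API_KEY})
        with urllib.request.urlopen(req) as resp:
            audit = json.load(resp)
        if audit["state"] == "completed":
            return audit
        time.sleep(poll_seconds)  # still "scheduled": wait and retry
    raise TimeoutError(f"audit {audit_id} did not complete in time")

def content_scores(audit):
    """Map every URL in a completed audit (competitors and audited page) to its Content Score."""
    scores = {page["url"]: page["content_score"] for page in audit["competitors_pages"]}
    scores[audit["audited_page"]["url"]] = audit["audited_page"]["content_score"]
    return scores
```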
Gathering data about SERP competitors for your SEO analysis
If you want to get statistical data about competitors from Google or export a list of prominent terms we discovered, you can take advantage of these endpoints:
/api/v1/exports/csv/serp_analyzer/:id/search_results
/api/v1/exports/csv/serp_analyzer/:id/prominent_terms
To do that, you can:
Create SERP Analyzer queries for keywords you want to analyze using a POST request to /api/v1/serp_analyzer or /api/v1/serp_analyzer/batches endpoints.
Store the response "id" value of each query you're interested in.
{
"state": "scheduled",
"id": 2800997
}
Set a timeout for a couple of minutes - with regular Surfer traffic, a query should take 5-10 minutes.
If you ran more than 10 queries at once or in short succession, they might complete more slowly due to load balancing on our end.
You can now run GET requests:
- to /api/v1/exports/csv/serp_analyzer/2800997/search_results endpoint, to get a list of SERP competitors and numerical data regarding their ranked URLs, including Content Score
- to /api/v1/exports/csv/serp_analyzer/2800997/prominent_terms endpoint, to get a list of prominent terms we discovered within SERP results.
Please note that these results are "raw data" and are not equal to what you'd see as Guidelines in Content Editor or Audit, which get additional processing.
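Downloading and parsing either CSV export can be sketched as below. The API-KEY header name is an assumption, and the CSV column names will depend on the actual export; check the real output before building on it:

```python
import csv
import io
import urllib.request

API_KEY = "your-api-key"  # placeholder; the API-KEY header name is an assumption
BASE = "https://app.surferseo.com/api/v1"

def download_serp_export(query_id, export):
    """Fetch a CSV export; export is "search_results" or "prominent_terms"."""
    req = urllib.request.Request(
        f"{BASE}/exports/csv/serp_analyzer/{query_id}/{export}",
        headers={"API-KEY": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_csv(resp.read().decode())

def parse_csv(text):
    """Turn raw CSV text into a list of rows (first row is the header)."""
    return list(csv.reader(io.StringIO(text)))
```

From here you can load the rows into a spreadsheet or a dataframe for your own statistical analysis.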