Here are a couple of workflows that can benefit from Surfer API integration via your custom app or third-party solutions like Zapier or Bubble.io:
Creating multiple queries and sharing access links
Via the API, you can automate creating queries for Content Editor, Audit, and SERP Analyzer.
Then, based on our responses, you can create web-app access links.
For example you might have a Google Sheet with a list of keywords you want to rank for and you want to create new articles using Content Editor.
In that case, you could:
Format your sheet so that each row contains the target keywords and the location of your target audience.
For each row, extract the keywords and location data and send them in a POST request to the /api/v1/content_editors endpoint.
Tip: Remember that keywords should always be an array, even if you aim to optimize for only one keyword.
In response, you will get:
{
"state": "scheduled",
"permalink_hash": "kKi7n3pkRk7Gw5cxKDiBAbCAybnDTt2z",
"id": 5632898
}
You can now create query access links by:
- joining "https://app.surferseo.com/drafts/" + "5632898" - this results in a private link that the account owner or an organization member can access
- joining "https://app.surferseo.com/drafts/s/" + "kKi7n3pkRk7Gw5cxKDiBAbCAybnDTt2z" - this results in a public share link anyone can access.
Share them with your team or external writers, or paste them into the original sheet you used for keyword input - whatever your preferred workflow requires.
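The steps above can be sketched in Python using only the standard library. The API-KEY header name and the exact payload field names are assumptions here - adapt them to your Surfer API credentials and reference:

```python
import json
import urllib.request

API_BASE = "https://app.surferseo.com/api/v1"


def create_content_editor(keywords, location, api_key):
    """POST a new Content Editor query. Note that `keywords` must be a
    list, even when optimizing for a single keyword. The auth header
    name below is an assumption - check your Surfer API credentials."""
    payload = json.dumps({"keywords": keywords, "location": location}).encode()
    req = urllib.request.Request(
        f"{API_BASE}/content_editors",
        data=payload,
        headers={"Content-Type": "application/json", "API-KEY": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def draft_links(response):
    """Build the private and public access links from the creation response."""
    private = f"https://app.surferseo.com/drafts/{response['id']}"
    public = f"https://app.surferseo.com/drafts/s/{response['permalink_hash']}"
    return private, public
```

For each sheet row you would call `create_content_editor`, then write the links returned by `draft_links` back into the sheet.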
Publishing written content to your CMS
Via the API, you can get a clean HTML version of your Content Editor draft. You can then send it on to Google Docs or your CMS via a custom integration.
To get the content of an Editor query, all you have to do is:
Locate the "id" of the Content Editor query you want to download content from.
Tip: If you didn't store the ID for the query anywhere, just find the query in the web app and copy the last part of the URL, e.g. https://app.surferseo.com/drafts/5632898
Send a GET request to /api/v1/content_editors/:id/content, specifying the "id" you found - here, /api/v1/content_editors/5632898/content
Store the response.
Depending on your target text editor/CMS you might need to parse the HTML we serve.
For example, both:
- WordPress https://developer.wordpress.org/rest-api/reference/posts/#create-a-post
- Shopify https://shopify.dev/docs/api/admin-rest/2023-04/resources/article#post-blogs-blog-id-articles
accept a string with HTML tags as "content", so no parsing or adjustments are necessary.
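A minimal sketch of this flow, assuming the same hypothetical API-KEY auth header as before; the WordPress payload shape follows the REST reference linked above:

```python
import urllib.request


def fetch_draft_html(editor_id, api_key):
    """GET the clean HTML of a Content Editor draft."""
    req = urllib.request.Request(
        f"https://app.surferseo.com/api/v1/content_editors/{editor_id}/content",
        headers={"API-KEY": api_key},  # auth header name is an assumption
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()


def wordpress_post_payload(title, html):
    """WordPress's POST /wp/v2/posts accepts raw HTML as "content",
    so the draft can be passed through unchanged."""
    return {"title": title, "content": html, "status": "draft"}
```

The payload from `wordpress_post_payload` would then be POSTed to your WordPress site's /wp-json/wp/v2/posts endpoint with your CMS credentials.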
Getting Content Score for a specific URL
Using the Audit endpoints, you can get the Content Score for a URL you pick and for up to 5 competitors we select automatically.
To do this:
Create a new query via POST request to /api/v1/audits endpoint.
Store "id" from our response:
{
"state": "scheduled",
"id": 767429
}
Set a timeout of a couple of minutes - with a fast-loading audited page and normal Surfer traffic, a query should take 5-10 minutes.
Send a GET request to the /api/v1/audits/:id endpoint - here, /api/v1/audits/767429
If "state" is still "scheduled" - rerun the GET request in a couple of minutes.
In response, you should ultimately get:
{
"state": "completed",
"id": 767429,
"competitors_pages": [
{
"url": "https://www.competitor1.com/",
"content_score": 64
},
{
"url": "https://www.competitor2.com/",
"content_score": 49
},
{
"url": "https://www.competitor3.com/",
"content_score": 55
},
{
"url": "https://www.competitor4.com/",
"content_score": 78
},
{
"url": "https://www.competitor5.com/",
"content_score": 48
}
],
"audited_page": {
"url": "https://www.auditedURL.com/",
"content_score": 81
}
}
The audited URL's Content Score is available under audited_page.content_score.
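The create-wait-fetch loop above can be sketched like this; the fetch function is injected as a callable, so the HTTP layer (and whatever auth your integration uses) stays separate:

```python
import time


def wait_for_audit(fetch, audit_id, interval=120, max_checks=10):
    """Poll GET /api/v1/audits/:id until "state" is "completed".
    `fetch` is any callable returning the parsed JSON response for an
    audit ID, so the HTTP and auth details can be supplied separately."""
    for _ in range(max_checks):
        result = fetch(audit_id)
        if result.get("state") == "completed":
            return result
        time.sleep(interval)
    raise TimeoutError(f"audit {audit_id} not completed after {max_checks} checks")


def content_scores(audit):
    """Return the audited page's score and a dict of competitor scores."""
    own = audit["audited_page"]["content_score"]
    competitors = {c["url"]: c["content_score"] for c in audit["competitors_pages"]}
    return own, competitors
```

The default two-minute interval matches the 5-10 minute completion time mentioned above; tune it to your own traffic patterns.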
Gathering data about SERP competitors for your own SEO analysis
If you want to get statistical data about competitors from Google, or export a list of prominent terms we discovered, you can take advantage of the /api/v1/exports/csv/serp_analyzer/:id/search_results and /api/v1/exports/csv/serp_analyzer/:id/prominent_terms endpoints.
To do that you can:
Create SERP Analyzer queries for the keywords you want to analyze using a POST request to the /api/v1/serp_analyzer or /api/v1/serp_analyzer/batches endpoint.
Store the "id" value from the response for each query you're interested in.
{
"state": "scheduled",
"id": 2800997
}
Set a timeout of a couple of minutes - with normal Surfer traffic, a query should take 5-10 minutes.
If you ran more than 10 queries at once or in short succession, they might complete more slowly due to load balancing on our end.
You can now run GET requests:
- to the /api/v1/exports/csv/serp_analyzer/2800997/search_results endpoint, to get a list of SERP competitors and numerical data about their ranked URLs, including Content Score
- to the /api/v1/exports/csv/serp_analyzer/2800997/prominent_terms endpoint, to get a list of prominent terms we discovered within the SERP results.
Please note that these results are "raw data" and do not match what you'd see as Guidelines in Content Editor or Audit, which receive additional processing.
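A short sketch of the export step; `download_csv` parses the response with Python's standard `csv` module, and the auth header name is again an assumption:

```python
import csv
import io
import urllib.request


def export_urls(query_id):
    """Build both CSV export endpoints for a SERP Analyzer query."""
    base = f"https://app.surferseo.com/api/v1/exports/csv/serp_analyzer/{query_id}"
    return f"{base}/search_results", f"{base}/prominent_terms"


def download_csv(url, api_key):
    """Fetch an export and parse it into a list of rows."""
    req = urllib.request.Request(url, headers={"API-KEY": api_key})  # header name assumed
    with urllib.request.urlopen(req) as resp:
        return list(csv.reader(io.StringIO(resp.read().decode())))
```

From there, the rows can be loaded into a spreadsheet or a dataframe for your own analysis.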