TL;DR: Google Search Console gives away the same query, page, position, and impression data that SEMrush and Ahrefs charge $129–$449/mo to repackage. The GSC API has a free 1,200-req/min quota, a 25,000-row payload limit, and four endpoints that cover roughly 90% of what a solo founder needs. This piece walks through the service-account OAuth setup, the four endpoints worth knowing, a runnable Python script pulling a query-by-page CTR matrix, and three automation patterns (cron + Slack, Google Sheets, a lightweight Django view) that turn raw API rows into a usable dashboard. If you'd rather not maintain the integration yourself, SEOJuice handles the same wiring as a managed service.
The frustrating thing about a $449/mo SEO suite is that roughly 70% of what shows up in the dashboard is data Google already gives you free through Search Console. Paid tools add three things on top: third-party rank tracking from rented proxies, a backlink graph scraped from crawled web archives, and a UI that doesn't make you write SQL. The third is the only one a solo founder actually needs, and it's the cheapest to replicate.
Here's what the API exposes for any property you've verified ownership of:

- Clicks, impressions, CTR, and average position, sliceable by query, page, date, device, and country
- Every submitted sitemap, with fetch status and URL counts
- Per-URL index coverage, canonical, and crawl state
- 16 months of history, queryable in seconds
What's missing is anything outside your own property: competitor rankings, backlink growth on rival domains, share-of-voice math. For most pre-PMF founders those metrics are vanity numbers that don't change what you ship next anyway. I wrote a longer piece on cutting an SEO stack down to two tools, and the conclusion held up: GSC plus one writing surface is enough signal to act on.
"The data in Search Console comes directly from Google. It's the most accurate source available for understanding how your site performs in Google Search."
From Google's own Search Console Help documentation.
There are two viable auth flows for the GSC API: user-flow OAuth (browser redirect, refresh tokens) and service-account OAuth (server-to-server, no browser). For a founder running a cron on a single VPS, service accounts are shorter and more durable — no refresh-token expiration, no consent screen to maintain.
Provisioning takes about 10 minutes and costs nothing:
1. Create a Google Cloud project (or reuse one) and enable the Search Console API in it.
2. Create a service account with a descriptive name like `gsc-dashboard-reader`. No project-level roles needed.
3. Generate a JSON key for the account and download it.
4. In Search Console, add the service account's email address (it ends in `@<project-id>.iam.gserviceaccount.com`) as a user on your property. Restricted permission is enough for reading.

That last step is the one founders miss. The service account is a Google identity, but unless you explicitly grant it access to your Search Console property, the API returns an authoritative-looking `403 user does not have sufficient permission for site` error. Grant per-property; if you have multiple, repeat for each.
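Before writing the dashboard itself, it's worth a thirty-second sanity check that the grant took. A minimal sketch against the `sites.list` endpoint (covered in the table below), using the same JSON key the later scripts assume:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "gsc-credentials.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

# sites.list returns every property this identity can read.
# An empty list means the Search Console grant hasn't been made yet.
for site in service.sites().list().execute().get("siteEntry", []):
    print(site["siteUrl"], site["permissionLevel"])
```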

The GSC API surface is small. Four endpoints together cover query data, sitemap state, per-URL inspection, and the list of properties you have access to. Skip the rest of the API reference until you have a concrete reason to read it.
| Endpoint | What it returns | Quota cost | Useful for |
|---|---|---|---|
| `searchanalytics.query` | Up to 25,000 rows of clicks, impressions, position, CTR by query, page, date, device, country | 1 unit / request | The core dashboard data: which queries drive traffic, which pages convert impressions to clicks |
| `sitemaps.list` | All submitted sitemaps with status, last-fetch time, URL counts | 1 unit / request | Sitemap-health alerts; flagging dropped or partially-indexed sitemaps |
| `urlInspection.index.inspect` | Per-URL: coverage state, last crawl, canonical, mobile usability, AMP and structured-data verdicts | 2,000 / day (separate quota) | Spot-checks on critical pages; automated indexing audits |
| `sites.list` | All properties the auth identity can read | 1 unit / request | Multi-property dashboards; iterating over a portfolio |
The 1,200/min quota on read endpoints is effectively unlimited for solo-founder use. The 2,000/day cap on URL Inspection is the only real ceiling, and it covers a daily audit of every page on a 1,500-URL site.
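If a multi-property loop or a busy dashboard ever does brush against the per-minute limit, a small backoff wrapper is cheap insurance. A sketch, assuming the `service` object built in the next section; `query_with_backoff` is a name invented here:

```python
import time
from googleapiclient.errors import HttpError

def query_with_backoff(service, site_url, body, max_retries=5):
    """Retry searchanalytics.query on 429/5xx with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return service.searchanalytics().query(
                siteUrl=site_url, body=body
            ).execute()
        except HttpError as e:
            if e.resp.status not in (429, 500, 503):
                raise  # auth and permission errors won't fix themselves
            time.sleep(2 ** attempt)
    raise RuntimeError("GSC API still throttling after retries")
```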
The single most useful query you can run is a query-by-page breakdown over the last 28 days. It tells you which page is ranking for which query, and what CTR looks like at the intersection. Cells with high impressions and low CTR are your near-term optimization targets.
Install the two dependencies:
```bash
pip install google-auth google-api-python-client
```
The minimum viable script. Auth, query, print. Save your downloaded JSON key as `gsc-credentials.json` in the same directory:
```python
from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE_URL = "sc-domain:example.com"  # or "https://example.com/" for URL-prefix
KEY_FILE = "gsc-credentials.json"

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

end = date.today() - timedelta(days=2)  # GSC lags ~2 days
start = end - timedelta(days=27)        # 28-day window, inclusive

# Note: searchanalytics.query has no orderBy field; rows come back
# sorted by clicks, descending.
request = {
    "startDate": start.isoformat(),
    "endDate": end.isoformat(),
    "dimensions": ["query"],
    "rowLimit": 25,
}

response = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute()

for row in response.get("rows", []):
    q = row["keys"][0]
    print(f"{row['clicks']:>5} {row['impressions']:>6} "
          f"{row['ctr']*100:>5.1f}% pos={row['position']:>5.1f} {q}")
```
That's the entire integration. Run it and you'll see the top 25 queries over the last 28 days, sorted by clicks (the API's default and only server-side sort order; any other ordering happens client-side). Two notes: `sc-domain:example.com` is for domain properties (the recommended type); use the full URL form only for URL-prefix properties. GSC data lags ~2 days, which is why the window ends at `today - 2`.
The query-by-page matrix — the version that actually drives decisions — adds a second dimension and bumps the row limit:
```python
# Same window and service as above. Rows still arrive sorted by clicks,
# so a single 5,000-row page can miss zero-click cells; see the pagination
# helper in the sharp-edges section for an exhaustive pull.
request = {
    "startDate": start.isoformat(),
    "endDate": end.isoformat(),
    "dimensions": ["query", "page"],
    "rowLimit": 5000,
}

response = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute()

opportunities = []
for row in response.get("rows", []):
    query, page = row["keys"]
    impressions = row["impressions"]
    ctr = row["ctr"]
    position = row["position"]
    # Cells with >500 impressions and CTR below 2% are CTR-leak candidates
    if impressions > 500 and ctr < 0.02 and position < 15:
        opportunities.append((impressions, query, page, position, ctr))

# Tuple sort puts the highest-impression opportunities first
opportunities.sort(reverse=True)
for imp, q, p, pos, ctr in opportunities[:20]:
    print(f"{imp:>6} imp pos={pos:>4.1f} ctr={ctr*100:>4.1f}% {q} → {p}")
```
The filter at the bottom (high impressions, low CTR, top-15 position) is the classic title-tag-rewrite candidate list. You're already ranking; the click-through is the leak. This is the same logic paid tools surface as "ranking opportunities", written in six lines.

Once you have the rows, the question is what to chart. Most paid SEO tools drown you in 40 widgets; you only need four to make weekly decisions. Build them in whatever charting library you already use: Chart.js for HTML, matplotlib for Jupyter, native chart blocks if you're piping into Google Sheets.
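As one illustration, here's a minimal matplotlib sketch of the simplest chart of the set, a daily clicks trend. It assumes the `service`, `SITE_URL`, `start`, and `end` objects from the script above, plus a `pip install matplotlib`:

```python
import matplotlib.pyplot as plt

# Pull daily totals: with the "date" dimension, each row's key is a date
request = {
    "startDate": start.isoformat(),
    "endDate": end.isoformat(),
    "dimensions": ["date"],
    "rowLimit": 1000,
}
rows = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute().get("rows", [])
rows.sort(key=lambda r: r["keys"][0])  # ensure the x-axis is chronological

plt.figure(figsize=(10, 3))
plt.plot([r["keys"][0] for r in rows], [r["clicks"] for r in rows])
plt.xticks(rotation=45, fontsize=7)
plt.title("Organic clicks, last 28 days")
plt.tight_layout()
plt.savefig("clicks_trend.png")
```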

Pulling manually is fine for a one-off. The dashboard becomes useful when it runs itself. Three automation patterns, ranked by setup effort:
Pattern 1: Cron and Slack alerts. The cheapest option. A daily cron runs the script, and if any of three conditions trigger — clicks dropped >20% week-over-week, a top-10 query fell out of the top 20, or a previously-indexed page lost indexing — post to Slack. About 80 lines of Python including the webhook. Runs in 4 seconds on a $5 VPS. Your dashboard does one thing: shout at you when something changes.
```python
import json, os, urllib.request

def post_slack(text):
    """Post a plain-text message to the incoming-webhook URL in the env."""
    payload = {"text": text}
    req = urllib.request.Request(
        os.environ["SLACK_WEBHOOK_URL"],
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10).read()

# After computing wow_change from two consecutive 7-day GSC pulls:
if wow_change < -0.20:
    post_slack(
        f":warning: GSC clicks dropped {abs(wow_change)*100:.0f}% WoW "
        f"({last_week} → {this_week}). Top falling queries: {falling[:3]}"
    )
```
Pattern 2: Google Sheets sink. Pipe the rows into a Google Sheet via the Sheets API or the simpler gspread wrapper. The sheet becomes your dashboard: pivot tables for ad-hoc slicing, native charts, shareable with a non-technical co-founder. Roughly 30 lines on top of the GSC pull. Downside: refresh latency depends on your cron cadence, and Sheets gets sluggish past about 20,000 rows.
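A sketch of the sink, assuming the `response` from the query-by-page pull above and a spreadsheet named "GSC Dashboard" (a placeholder) that has been shared with the service account's email:

```python
import gspread

# The same service-account JSON key works for Sheets. Share the target
# spreadsheet with the service account's email first, or open() will fail.
gc = gspread.service_account(filename="gsc-credentials.json")
ws = gc.open("GSC Dashboard").sheet1

ws.clear()
ws.append_row(["query", "page", "clicks", "impressions", "ctr", "position"])
ws.append_rows([
    [row["keys"][0], row["keys"][1], row["clicks"],
     row["impressions"], row["ctr"], row["position"]]
    for row in response.get("rows", [])
])
```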
Pattern 3: Lightweight Django view. One view that runs the GSC pull, caches in Redis for 6 hours, and renders the four charts inline via Chart.js. Around 200 lines of Python and HTML. Worth the effort once a co-founder wants to glance at the dashboard mid-week. The cache is critical; without it, every pageview triggers a fresh GSC call and you'll exhaust the per-minute quota on a busy launch day.
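A sketch of the view's caching skeleton. It uses Django's cache framework with `CACHES` pointed at Redis in settings; `fetch_gsc_rows()` is a hypothetical wrapper around the pull from earlier:

```python
from django.core.cache import cache
from django.shortcuts import render

def gsc_dashboard(request):
    # Serve from cache; only hit the GSC API when the 6-hour entry expires.
    rows = cache.get("gsc_rows")
    if rows is None:
        rows = fetch_gsc_rows()  # hypothetical wrapper around the GSC pull
        cache.set("gsc_rows", rows, timeout=6 * 60 * 60)
    return render(request, "dashboard.html", {"rows": rows})
```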
"Cancelled my $329/mo Ahrefs sub after realizing the only widget I checked daily was 'top organic queries.' Wrote it against the Search Console API in an afternoon."
A pattern that shows up regularly in indie-hacker threads on Hacker News and the bootstrappers' corners of IndieHackers.
The comparison most founders actually want isn't between paid tools — it's between paying for one, building your own, or using the free GSC UI and accepting its limitations. Here's how the five options stack up:
| Option | Monthly cost | Setup time | Maintenance burden | Multi-property | Historical data |
|---|---|---|---|---|---|
| Free GSC UI | $0 | 0 hrs (already there) | None | Manual switching | 16 months, but slow exports |
| DIY GSC API + cron + Sheets | $0 to $5 (VPS) | 4 to 8 hrs first time | ~30 min/quarter (auth rotations, occasional API deprecation) | Trivial loop | 16 months, queryable in seconds |
| SEOJuice | $29 to $99 | ~10 min | None (managed) | Built-in | 16 months from GSC plus own crawl history |
| SEMrush | $139 to $499 | ~30 min | None (managed) | Project limits per plan | 2+ years on paid plans |
| Ahrefs | $129 to $449 | ~30 min | None (managed) | Project limits per plan | 2+ years on paid plans |
DIY wins on cost, breaks even on data depth for everything inside your own property, and loses on competitor and backlink intel. If you're early-stage and your SEO question is "what's working on my own site?", DIY is the right answer. If you're scaling, doing competitor research, or chasing backlinks, the paid tools earn their cost. SEOJuice's tools page sits in the middle: managed GSC integration plus audit and AI visibility tooling, without the enterprise overhead of SEMrush.
Ask ChatGPT or Gemini how to interpret GSC data and you'll get plausible-sounding answers that are wrong in three specific ways. Worth calling out: these errors propagate into every "AI-generated SEO report" tool on the market.
Wrong claim #1: "Position is the average rank you held for that query." It's actually the average, across all impressions for that query, of the topmost position at which any of your URLs appeared. If two URLs ranked for the same query in the same SERP, only the higher one counts. This is why "position" moves when you publish a new article that outranks an older one: the older URL's own position doesn't change, but the query-level average does.
Wrong claim #2: "CTR is calculated per impression." CTR is clicks ÷ impressions at whatever aggregation level you queried. By date alone gives daily site-wide CTR; by query+page gives per-cell CTR. The numbers don't add up across aggregation levels because the denominator changes: a page with 100 clicks on 1,000 impressions (10% CTR) and another with 10 clicks on 10,000 impressions (0.1%) combine to 110 ÷ 11,000 ≈ 1%.
Wrong claim #3: "If a query has impressions but no clicks, the page ranks poorly." In the AI Overview era this is increasingly the signature of being cited in the Overview but not clicked through. Impressions held, clicks dropped is the canonical pattern. The page isn't ranking worse; the SERP changed shape. The AI Overview citations piece goes deeper on what to do about it.
Five sharp edges that bite first-time GSC API users. Knowing them upfront saves a week:

- The API returns at most 25,000 rows per request, paged via `startRow`. The google-api-python-client wrapper does not auto-paginate; loop until the response returns fewer than 25,000 rows.
- Domain (`sc-domain:`) and URL-prefix (`https://...`) properties report different numbers. Domain properties aggregate across all subdomains and protocols. If you have both registered, you'll see overlapping but non-identical data. Pick one as the source of truth.
- A slice of low-volume queries is withheld as `(anonymized)` for privacy and can't be recovered. Plan top-queries analysis around the named ones.
- Data lags roughly 2 days. The freshest rows are incomplete, which is why every window in this piece ends at `today - 2`.
- History stops at 16 months. If you want longer trends, snapshot the rolling window into your own storage (see the FAQ below).

None of these are showstoppers, but each cost me an hour the first time. The official query reference documents them all, just not loudly.
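The pagination edge is the one most worth a helper. A minimal sketch, reusing the `service` object and request dict from earlier:

```python
def fetch_all_rows(service, site_url, request, page_size=25000):
    """Page through searchanalytics.query until a short page comes back."""
    all_rows, start_row = [], 0
    while True:
        body = {**request, "rowLimit": page_size, "startRow": start_row}
        resp = service.searchanalytics().query(
            siteUrl=site_url, body=body
        ).execute()
        rows = resp.get("rows", [])
        all_rows.extend(rows)
        if len(rows) < page_size:
            return all_rows
        start_row += page_size
```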
How long does the GSC API keep historical data? 16 months. Past that, the data is gone; there's no archive endpoint. If you want longer-range trend data, snapshot the rolling window into your own storage. A 365-day window in Postgres is a few hundred MB per property.
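A minimal snapshot sketch, using sqlite3 to stay dependency-free (the same schema drops straight into Postgres); it assumes the `end` date and query-by-page `response` from earlier:

```python
import sqlite3

con = sqlite3.connect("gsc_history.db")
con.execute("""CREATE TABLE IF NOT EXISTS gsc_daily (
    day TEXT, query TEXT, page TEXT,
    clicks INTEGER, impressions INTEGER, ctr REAL, position REAL,
    PRIMARY KEY (day, query, page))""")
# Tag each row with the pull's end date; re-running the cron upserts.
con.executemany(
    "INSERT OR REPLACE INTO gsc_daily VALUES (?, ?, ?, ?, ?, ?, ?)",
    [(end.isoformat(), r["keys"][0], r["keys"][1], r["clicks"],
      r["impressions"], r["ctr"], r["position"])
     for r in response.get("rows", [])],
)
con.commit()
```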
Can I use the API to submit URLs for indexing? No. URL submission was removed in 2020 after spam abuse. The Indexing API still exists but only works for job postings and livestream events; using it for regular content is against Google's policy.
What's the difference between impressions and clicks? Impressions count every time your URL was shown on a SERP, whether or not the user scrolled to it. Clicks count user clicks. CTR is the ratio. The AI Overview era widens the gap because users see the Overview answer and don't click the cited sources.
Is there a free tier above the standard quota? Quota is the same for everyone: 1,200 requests/min per project on analytics endpoints, 2,000/day on URL Inspection. No paid upgrade path; quota bumps are requestable through Google Cloud Console but rarely needed for solo workloads.
Can I delegate access without sharing my Google account? Yes, that's exactly what the service account is for. Create one per integration, grant Restricted access to the property, and revoke without touching your own login.
What language bindings does the API support? Officially: Python, Node, Java, PHP, Ruby, Go, .NET. Unofficially: anything that can hit a REST endpoint with a Bearer token. JSON shapes are identical across languages.

The DIY dashboard pays for itself for as long as the data you need lives inside your own property. The moment the question becomes "why is my competitor outranking me?" or "who links to them but not to me?", you've hit the ceiling of what GSC exposes. From there the choice is paying SEMrush or Ahrefs $129–$449/mo for backlink and competitor data, or paying a smaller tool like SEOJuice for the same GSC integration plus managed crawl, audit, and AI visibility tracking, usually 15–25% of what the enterprise suites charge.
For most solo founders, the cutoff is somewhere between $500/mo MRR and $5K/mo MRR. Below the lower end, build it yourself; engineering time is cheaper than the subscription, and you'll learn the data along the way. Above the upper end, your time is too valuable to maintain auth rotations and quota retries. In between, it's a coin flip that depends on how much you enjoy writing Python at midnight.
The other path: build the cron + Slack version this weekend (six hours including OAuth setup), run it for a month, and see what you actually look at. If you only check the "top queries dropping" alert and ignore everything else, you've discovered you only needed one widget — keep building, or buy. The affordable SEO strategies piece covers the budget-tier playbook.