I Built a Google Search Console Dashboard for $0. Here's the Code.

Vadim Kravcenko
May 17, 2026 · 15 min read

TL;DR: Google Search Console gives away the same query, page, position, and impression data that SEMrush and Ahrefs charge $129–$449/mo to repackage. The GSC API has a free 1,200-req/min quota, a 25,000-row payload limit, and four endpoints that cover roughly 90% of what a solo founder needs. This piece walks through the service-account OAuth setup, the four endpoints worth knowing, a runnable Python script pulling a query-by-page CTR matrix, and three automation patterns (cron + Slack, Google Sheets, a lightweight Django view) that turn raw API rows into a usable dashboard. If you'd rather not maintain the integration yourself, SEOJuice handles the same wiring as a managed service.

What GSC Already Gives You for Free (That Paid Tools Repackage)

The frustrating thing about a $449/mo SEO suite is that roughly 70% of what shows up in the dashboard is data Google already gives you free through Search Console. Paid tools add three things on top: third-party rank tracking from rented proxies, a backlink graph scraped from crawled web archives, and a UI that doesn't make you write SQL. The third is the only one a solo founder actually needs, and it's the cheapest to replicate.

Here's what the API exposes for any property you've verified ownership of:

  • 16 months of historical performance data: every query that brought a click or impression, with position, CTR, and date.
  • Page-level breakdown for the same window: which URLs ranked for which queries, and where the click distribution landed.
  • Index coverage for any URL: last crawl date, indexing verdict, canonical Google chose, mobile usability, structured data parsing.
  • Sitemap state: which sitemaps Google fetched, when, and how many URLs were discovered vs. indexed.
  • Device, country, and search appearance segmentation on the same query/page data.

What's missing is anything outside your own property: competitor rankings, backlink growth on rival domains, share-of-voice math. For most pre-PMF founders those metrics are vanity numbers that don't change what you ship next anyway. I wrote a longer piece on cutting an SEO stack down to two tools, and the conclusion held up: GSC plus one writing surface is enough signal to act on.

"The data in Search Console comes directly from Google. It's the most accurate source available for understanding how your site performs in Google Search."

From Google's own Search Console Help documentation.

Setting Up the GSC API With a Service Account

There are two viable auth flows for the GSC API: user-flow OAuth (browser redirect, refresh tokens) and service-account OAuth (server-to-server, no browser). For a founder running a cron on a single VPS, service accounts are shorter and more durable — no refresh-token expiration, no consent screen to maintain.

Provisioning takes about 10 minutes and costs nothing:

  1. Open the Google Cloud Console and create a new project (or reuse one). The free tier is fine; this never bills.
  2. Enable the Google Search Console API in the API library.
  3. Under IAM & Admin → Service Accounts, create a new service account named gsc-dashboard-reader. No project-level roles needed.
  4. Generate a JSON key on the new account and download it. Treat it like an API key; don't commit it.
  5. In Search Console, under Settings → Users and permissions, add the service account's email (ending in @<project-id>.iam.gserviceaccount.com). Restricted permission is enough for reading.

That last step is the one founders miss. The service account is a Google identity, but unless you explicitly grant it access to your Search Console property, the API returns an authoritative-looking 403 ("User does not have sufficient permission for site") error. Grant access per property; if you manage several, repeat for each.
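If you prefer the CLI, steps 2–4 can be scripted with the gcloud SDK. A sketch, assuming you've already run gcloud init against the right project and that my-gsc-project is your (hypothetical) project ID:

```shell
# Enable the Search Console API in the current project
gcloud services enable searchconsole.googleapis.com

# Create the reader service account (no project-level roles needed)
gcloud iam service-accounts create gsc-dashboard-reader \
    --display-name="GSC dashboard reader"

# Generate and download the JSON key
gcloud iam service-accounts keys create gsc-credentials.json \
    --iam-account="gsc-dashboard-reader@my-gsc-project.iam.gserviceaccount.com"
```

Step 5 — granting the service account's email access inside Search Console — still has to be done in the Search Console UI.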

Diagram of the service-account OAuth flow: GCP project, service account, JSON key, Search Console permission grant
Service-account OAuth flow for GSC. The credentials file authenticates to Google; the per-property permission grant inside Search Console authorizes access to data.

The Four Endpoints That Cover 90% of What You Need

The GSC API surface is small. Four endpoints together cover query data, sitemap state, per-URL inspection, and the list of properties you have access to. Skip the rest of the API reference until you have a concrete reason to read it.

Endpoint | What it returns | Quota cost | Useful for
searchanalytics.query | Up to 25,000 rows of clicks, impressions, position, CTR by query, page, date, device, country | 1 unit/request | The core dashboard data: which queries drive traffic, which pages convert impressions to clicks
sitemaps.list | All submitted sitemaps with status, last-fetch time, URL counts | 1 unit/request | Sitemap-health alerts; flagging dropped or partially-indexed sitemaps
urlInspection.index.inspect | Per-URL: coverage state, last crawl, canonical, mobile usability, AMP and structured-data verdicts | 2,000/day (separate quota) | Spot-checks on critical pages; automated indexing audits
sites.list | All properties the auth identity can read | 1 unit/request | Multi-property dashboards; iterating over a portfolio

The 1,200/min quota on read endpoints is effectively unlimited for solo-founder use. The 2,000/day cap on URL Inspection is the only real ceiling, and it covers a daily audit of every page on a 1,500-URL site.
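Even well under quota, individual requests can still fail transiently (429s during bursts, the occasional 5xx). A minimal retry sketch — with_retries is a hypothetical helper, not part of the client library:

```python
import time


def with_retries(call, attempts=4, base_delay=2.0):
    """Run a zero-arg API call, retrying on any exception.

    Backs off exponentially (2s, 4s, 8s...) between attempts and
    re-raises after the final one.
    """
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))


# Usage (hypothetical, wrapping the query from the next section):
# response = with_retries(lambda: service.searchanalytics()
#     .query(siteUrl=SITE_URL, body=request).execute())
```

Four attempts with a 2-second base covers the transient-failure window without stalling a cron for more than ~15 seconds.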

A Real Python Pull: Top Queries and the Query-by-Page Matrix

The single most useful query you can run is a query-by-page breakdown over the last 28 days. It tells you which page is ranking for which query, and what CTR looks like at the intersection. Cells with high impressions and low CTR are your near-term optimization targets.

Install the two dependencies:

pip install google-auth google-api-python-client

The minimum viable script. Auth, query, print. Save your downloaded JSON key as gsc-credentials.json in the same directory:

from datetime import date, timedelta
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE_URL = "sc-domain:example.com"  # or "https://example.com/"
KEY_FILE = "gsc-credentials.json"

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

end = date.today() - timedelta(days=2)  # GSC lags ~2 days
start = end - timedelta(days=27)

request = {
    "startDate": start.isoformat(),
    "endDate": end.isoformat(),
    "dimensions": ["query"],
    "rowLimit": 25,
    "orderBy": [{"field": "clicks", "descending": True}],
}
response = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute()

for row in response.get("rows", []):
    q = row["keys"][0]
    print(f"{row['clicks']:>5}  {row['impressions']:>6}  "
          f"{row['ctr']*100:>5.1f}%  pos={row['position']:>5.1f}  {q}")

That's the entire integration. Run it and you'll see the top 25 queries over the last 28 days, sorted by clicks. Two notes: sc-domain:example.com is for domain properties (the recommended type); use the full URL form only for URL-prefix properties. GSC data lags ~2 days, which is why we end at today - 2.

The query-by-page matrix — the version that actually drives decisions — adds a second dimension and bumps the row limit:

request = {
    "startDate": start.isoformat(),
    "endDate": end.isoformat(),
    "dimensions": ["query", "page"],
    "rowLimit": 5000,
    "orderBy": [{"field": "impressions", "descending": True}],
}
response = service.searchanalytics().query(
    siteUrl=SITE_URL, body=request
).execute()

opportunities = []
for row in response.get("rows", []):
    query, page = row["keys"]
    impressions = row["impressions"]
    ctr = row["ctr"]
    position = row["position"]
    # Cells with >500 impressions and CTR below 2% are CTR-leak candidates
    if impressions > 500 and ctr < 0.02 and position < 15:
        opportunities.append((impressions, query, page, position, ctr))

opportunities.sort(reverse=True)
for imp, q, p, pos, ctr in opportunities[:20]:
    print(f"{imp:>6} imp  pos={pos:>4.1f}  ctr={ctr*100:>4.1f}%  {q}  →  {p}")

The filter at the bottom (high impressions, low CTR, top-15 position) is the classic title-tag-rewrite candidate list. You're already ranking; the click-through is the leak. This is the same logic paid tools surface as "ranking opportunities", written in six lines.

Sample output of the query-by-page matrix showing impressions, CTR, position, query and target URL columns
Query-by-page matrix output. Each row is a query × URL intersection. High-impression, low-CTR rows are the title-rewrite candidates.

What to Chart: The Four Visualizations Worth the Effort

Once you have the rows, the question is what to chart. Most paid SEO tools drown you in 40 widgets; you only need four to make weekly decisions. Build them in whatever charting library you already use: Chart.js for HTML, matplotlib for Jupyter, native chart blocks if you're piping into Google Sheets.

  1. Daily clicks plus impressions line chart, 90-day window. The single most important chart. It tells you whether traffic is growing, flat, or decaying. Divergence (impressions up, clicks flat) is the AI Overview signature.
  2. Top 20 queries bar chart, position color-coded. Sorted by clicks descending. Green for positions 1–5, yellow for 6–10, red for 11+. Reveals which queries you're winning, which are at risk, which deserve a refresh.
  3. Position decay scatter plot. X-axis: position 4 weeks ago. Y-axis: position today. Diagonal is "no change." Points above the diagonal dropped; below the diagonal improved. The red-dot cluster in the top-right is your decay watchlist; see the content refresh strategy piece for the workflow.
  4. AI Overview impact estimate. Plot impressions and clicks as two lines on the same axis, normalized so they overlap on day 1. When they diverge (impressions holding, clicks dropping), the gap is, to a first approximation, AI Overview cannibalization. Not perfectly clean, but directionally useful.
Mockup of the four core GSC charts: daily clicks line, top queries bar, position decay scatter, AI Overview impact divergence
The four charts. Most paid SEO dashboards show 40+ widgets; for a solo founder, these four cover roughly 90% of the decisions.
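Chart 4's divergence math is simple enough to compute directly. A sketch, assuming two equal-length daily series (overview_gap is a hypothetical name, and the attribution to AI Overviews is the same first-approximation caveat as above):

```python
def overview_gap(impressions, clicks):
    """Normalize both daily series to 1.0 on day 1, then return the
    per-day gap (normalized impressions minus normalized clicks).

    A persistently positive, growing gap is the impressions-holding,
    clicks-dropping signature described in chart 4.
    """
    i0, c0 = impressions[0], clicks[0]
    return [i / i0 - c / c0 for i, c in zip(impressions, clicks)]


# Example: impressions flat while clicks halve over the window
gaps = overview_gap([1000, 1000, 1000], [100, 75, 50])
# gaps -> [0.0, 0.25, 0.5]
```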

Automation Patterns: Cron, Slack, Sheets, Django

Pulling manually is fine for a one-off. The dashboard becomes useful when it runs itself. Three automation patterns, ranked by setup effort:

Pattern 1: Cron and Slack alerts. The cheapest option. A daily cron runs the script, and if any of three conditions trigger — clicks dropped >20% week-over-week, a top-10 query fell out of the top 20, or a previously-indexed page lost indexing — post to Slack. About 80 lines of Python including the webhook. Runs in 4 seconds on a $5 VPS. Your dashboard does one thing: shout at you when something changes.

import json, os, urllib.request

def post_slack(text):
    payload = {"text": text}
    req = urllib.request.Request(
        os.environ["SLACK_WEBHOOK_URL"],
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10).read()

# After computing wow_change from two consecutive 7-day GSC pulls:
if wow_change < -0.20:
    post_slack(
        f":warning: GSC clicks dropped {wow_change*100:.0f}% WoW "
        f"({last_week} → {this_week}). Top falling queries: {falling[:3]}"
    )
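The snippet assumes wow_change has already been computed. One way to derive it from 14 days of daily click totals (week_over_week is a hypothetical helper):

```python
def week_over_week(daily_clicks):
    """Given at least 14 daily click totals (oldest first), return
    (last_week, this_week, wow_change), where wow_change is the
    fractional change -- e.g. -0.25 for a 25% drop."""
    this_week = sum(daily_clicks[-7:])
    last_week = sum(daily_clicks[-14:-7])
    change = (this_week - last_week) / last_week if last_week else 0.0
    return last_week, this_week, change


# e.g. 14 days of clicks, with the second week down 20%
days = [100] * 7 + [80] * 7
last_week, this_week, wow_change = week_over_week(days)
# wow_change -> -0.2, which would trip the alert above
```

The daily totals come from a searchanalytics.query pull with dimensions=["date"] over the last 16 days (remember the 2-day lag).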

Pattern 2: Google Sheets sink. Pipe the rows into a Google Sheet via the Sheets API or the simpler gspread wrapper. The sheet becomes your dashboard: pivot tables for ad-hoc slicing, native charts, shareable with a non-technical co-founder. Roughly 30 lines on top of the GSC pull. Downside: refresh latency depends on your cron cadence, and Sheets gets sluggish past about 20,000 rows.
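The only real work in this pattern is flattening the GSC response into the 2-D array Sheets expects. A sketch; the gspread call in the comment is one plausible way to write the upload and assumes gspread 6.x:

```python
def rows_to_values(rows):
    """Flatten GSC searchanalytics rows (query + page dimensions)
    into a header row plus one list per row, ready for a Sheets
    range update."""
    values = [["query", "page", "clicks", "impressions", "ctr", "position"]]
    for r in rows:
        query, page = r["keys"]
        values.append([query, page, r["clicks"], r["impressions"],
                       round(r["ctr"], 4), round(r["position"], 1)])
    return values


# With gspread (assumed setup: key file from earlier, a sheet named
# "GSC Dashboard" shared with the service-account email):
# import gspread
# sh = gspread.service_account(filename="gsc-credentials.json")
# sh.open("GSC Dashboard").sheet1.update(
#     values=rows_to_values(response.get("rows", [])))
```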

Pattern 3: Lightweight Django view. One view that runs the GSC pull, caches in Redis for 6 hours, and renders the four charts inline via Chart.js. Around 200 lines of Python and HTML. Worth the effort once a co-founder wants to glance at the dashboard mid-week. The cache is critical; without it, every pageview triggers a fresh GSC call and you'll exhaust the per-minute quota on a busy launch day.
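The caching logic itself is framework-agnostic. A minimal in-process sketch — swap the dict for Redis in production; cached is a hypothetical helper, not a Django API:

```python
import time

_cache = {}


def cached(key, ttl_seconds, compute):
    """Return the cached value for key if fresher than ttl_seconds;
    otherwise call compute(), store the result, and return it."""
    hit = _cache.get(key)
    now = time.time()
    if hit is not None and now - hit[0] < ttl_seconds:
        return hit[1]
    value = compute()
    _cache[key] = (now, value)
    return value


# In the view: data = cached("gsc:example.com", 6 * 3600, pull_gsc_rows)
```

With a 6-hour TTL, a property generates at most four GSC calls a day no matter how often the dashboard is loaded.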

"Cancelled my $329/mo Ahrefs sub after realizing the only widget I checked daily was 'top organic queries.' Wrote it against the Search Console API in an afternoon."

A pattern that shows up regularly in indie-hacker threads on Hacker News and the bootstrappers' corners of IndieHackers.

DIY vs SEMrush vs SEOJuice vs Ahrefs vs the Free GSC UI

The comparison most founders actually want isn't between paid tools — it's between paying for one, building your own, or using the free GSC UI and accepting its limitations. Here's how the five options stack up:

Option | Monthly cost | Setup time | Maintenance burden | Multi-property | Historical data
Free GSC UI | $0 | 0 hrs (already there) | None | Manual switching | 16 months, but slow exports
DIY GSC API + cron + Sheets | $0–$5 (VPS) | 4–8 hrs first time | ~30 min/quarter (auth rotations, occasional API deprecation) | Trivial loop | 16 months, queryable in seconds
SEOJuice | $29–$99 | ~10 min | None (managed) | Built-in | 16 months from GSC plus own crawl history
SEMrush | $139–$499 | ~30 min | None (managed) | Project limits per plan | 2+ years on paid plans
Ahrefs | $129–$449 | ~30 min | None (managed) | Project limits per plan | 2+ years on paid plans

DIY wins on cost, breaks even on data depth for everything inside your own property, and loses on competitor and backlink intel. If you're early-stage and your SEO question is "what's working on my own site?", DIY is the right answer. If you're scaling, doing competitor research, or chasing backlinks, the paid tools earn their cost. SEOJuice's tools page sits in the middle: managed GSC integration plus audit and AI visibility tooling, without the enterprise overhead of SEMrush.

What AI Overviews Get Wrong About GSC Data

Ask ChatGPT or Gemini how to interpret GSC data and you'll get plausible-sounding answers that are wrong in three specific ways. Worth calling out: these errors propagate into every "AI-generated SEO report" tool on the market.

Wrong claim #1: "Position is the average rank you held for that query." It's actually the average of the topmost position any of your URLs held, taken over the impressions where the query returned your site. If two of your URLs ranked in the same SERP, only the higher one counts. This is why "position" moves when you publish a new article that outranks an older one: the older URL's own position doesn't change, but the query-level average does.

Wrong claim #2: "CTR is calculated per impression." CTR is clicks ÷ impressions at whatever aggregation level you queried at. By date alone gives daily site-wide CTR; by query+page gives per-cell CTR. The numbers don't add up across aggregation levels because the denominator changes.
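A two-cell example makes the denominator point concrete:

```python
# Two query cells on the same day: per-cell CTRs of 10% and 1%.
cells = [
    {"clicks": 10, "impressions": 100},   # CTR 0.10
    {"clicks": 10, "impressions": 1000},  # CTR 0.01
]

# Site-wide daily CTR re-aggregates the raw counts -- it is NOT the
# mean of the per-cell CTRs, because the denominator changes.
site_ctr = sum(c["clicks"] for c in cells) / sum(c["impressions"] for c in cells)
# site_ctr -> 20 / 1100, about 0.018, not (0.10 + 0.01) / 2 = 0.055
```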

Wrong claim #3: "If a query has impressions but no clicks, the page ranks poorly." In the AI Overview era this is increasingly the signature of being cited in the Overview but not clicked through. Impressions held, clicks dropped is the canonical pattern. The page isn't ranking worse; the SERP changed shape. The AI Overview citations piece goes deeper on what to do about it.

Pitfalls to Avoid When You Build This Yourself

Five sharp edges that bite first-time GSC API users. Knowing them upfront saves a week:

  • The 25,000-row limit per request is hard. Past that, paginate with startRow. The google-api-python-client wrapper does not auto-paginate; loop until the response returns fewer than 25,000 rows.
  • Service-account keys do not expire, but Google revokes them if leaked. Treat the JSON file like a database password — secrets manager in production, never in git.
  • The 2-day data lag is per-day, not per-query. Today's data returns empty. Two days ago returns near-final numbers; yesterday is partial.
  • Domain (sc-domain:) and URL-prefix (https://...) properties report different numbers. Domain properties aggregate across all subdomains and protocols. If you have both registered, you'll see overlapping but non-identical data. Pick one as the source of truth.
  • The "anonymized queries" bucket eats roughly 15–25% of your tail data. Low-volume queries get rolled into (anonymized) for privacy and can't be recovered. Plan top-queries analysis around the named ones.

None of these are showstoppers, but each cost me an hour the first time. The official query reference documents them all, just not loudly.
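The startRow loop from the first bullet, as a reusable sketch — fetch_all_rows is a hypothetical helper, and the commented hookup assumes the service object from earlier:

```python
def fetch_all_rows(run_query, base_request, page_size=25000):
    """Page through searchanalytics.query results via startRow.

    run_query is a callable taking a request body and returning the
    decoded response dict; loops until a page comes back short of
    the requested page size.
    """
    rows, start = [], 0
    while True:
        req = dict(base_request, rowLimit=page_size, startRow=start)
        page = run_query(req).get("rows", [])
        rows.extend(page)
        if len(page) < page_size:
            return rows
        start += page_size


# Hooked to the earlier service object (assumed):
# all_rows = fetch_all_rows(
#     lambda body: service.searchanalytics()
#         .query(siteUrl=SITE_URL, body=body).execute(),
#     request,
# )
```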

FAQ

How long does the GSC API keep historical data? 16 months. Past that, the data is gone; there's no archive endpoint. If you want longer-range trend data, snapshot the rolling window into your own storage. A 365-day window in Postgres is a few hundred MB per property.
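A snapshot sketch using stdlib sqlite3 standing in for Postgres — the table name, schema, and date in the usage comment are all illustrative:

```python
import sqlite3


def snapshot(conn, day, rows):
    """Upsert one day of query-level GSC rows into a local table so
    history survives past the 16-month API window."""
    conn.execute("""CREATE TABLE IF NOT EXISTS gsc_daily (
        day TEXT, query TEXT, clicks INTEGER, impressions INTEGER,
        position REAL, PRIMARY KEY (day, query))""")
    conn.executemany(
        "INSERT OR REPLACE INTO gsc_daily VALUES (?, ?, ?, ?, ?)",
        [(day, r["keys"][0], r["clicks"], r["impressions"], r["position"])
         for r in rows],
    )
    conn.commit()


# Daily cron (hypothetical date; response is the per-day pull):
# snapshot(sqlite3.connect("gsc-history.db"), "2026-05-15",
#          response.get("rows", []))
```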

Can I use the API to submit URLs for indexing? No. URL submission was removed in 2020 after spam abuse. The Indexing API still exists but only works for job postings and livestream events; using it for regular content is against Google's policy.

What's the difference between impressions and clicks? Impressions count every time your URL was shown on a SERP, whether or not the user scrolled to it. Clicks count user clicks. CTR is the ratio. The AI Overview era widens the gap because users see the Overview answer and don't click the cited sources.

Is there a free tier above the standard quota? Quota is the same for everyone: 1,200 requests/min per project on analytics endpoints, 2,000/day on URL Inspection. No paid upgrade path; quota bumps are requestable through Google Cloud Console but rarely needed for solo workloads.

Can I delegate access without sharing my Google account? Yes, that's exactly what the service account is for. Create one per integration, grant Restricted access to the property, and revoke without touching your own login.

What language bindings does the API support? Officially: Python, Node, Java, PHP, Ruby, Go, .NET. Unofficially: anything that can hit a REST endpoint with a Bearer token. JSON shapes are identical across languages.

Side-by-side comparison of free GSC UI, DIY GSC API dashboard, SEOJuice, SEMrush, and Ahrefs across cost, setup, maintenance
The five options for a founder's SEO dashboard, plotted across cost and setup time. The DIY route is free but eats engineering hours.

When DIY Stops Being Worth It

The DIY dashboard pays for itself for as long as the data you need lives inside your own property. The moment the question becomes "why is my competitor outranking me?" or "who links to them but not to me?", you've hit the ceiling of what GSC exposes. From there the choice is paying SEMrush or Ahrefs $129–$449/mo for backlink and competitor data, or paying a smaller tool like SEOJuice for the same GSC integration plus managed crawl, audit, and AI visibility tracking, usually 15–25% of what the enterprise suites charge.

For most solo founders, the cutoff is somewhere between $500/mo MRR and $5K/mo MRR. Below the lower end, build it yourself; engineering time is cheaper than the subscription, and you'll learn the data along the way. Above the upper end, your time is too valuable to maintain auth rotations and quota retries. In between, it's a coin flip that depends on how much you enjoy writing Python at midnight.

The other path: build the cron + Slack version this weekend (six hours including OAuth setup), run it for a month, and see what you actually look at. If you only check the "top queries dropping" alert and ignore everything else, you've discovered you only needed one widget — keep building, or buy. The affordable SEO strategies piece covers the budget-tier playbook.
