
ahrefs vs crawlgraph — what you actually pay for

a $99 lifetime license vs $129/mo. broken down by feature, by api call, by what the data really shows.

pete the seo wizard
April 22, 2026 · 10 min read · 2,400 words

ahrefs charges $129/mo. semrush charges $140/mo. crawlgraph charges $99, once. you can guess where this is going — but the interesting part is what you actually get for the difference.

this post breaks down four years of monthly invoices, then runs the same questions against an open dataset to see what you'd lose. spoiler: less than you'd think, and the math gets uncomfortable around month 14.

the bill, by month and year

here's the simple version. you can verify these numbers on each vendor's pricing page — they don't hide them.

| | ahrefs | semrush | crawlgraph |
|---|---|---|---|
| starting price | $129/mo | $140/mo | $99 once |
| backlinks indexed | 35.5T | 43T | 4.4B (open data) |
| unique domains | ~250M | ~810M | 120M |
| crawl frequency | 15 min | 30 min | quarterly composite |
| unlimited domain checks | ✗ no | ✗ no | ✓ yes |
| rows per export query | 25k | 10k | 100k max |
| rank tracker included | + addon | + addon | ✗ no |
| site audit | ✓ yes | ✓ yes | ✗ no |
| true cost over 5 years | $7,740 | $8,400 | $99 |

the row that does the work is the last one. $7,740 over five years for ahrefs at the lite plan; $8,400 for semrush; $99 for crawlgraph. that's a 78× cost delta, and the question is whether ahrefs gives you 78× more value.
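the arithmetic behind that last row, if you want to sanity-check it or swap in your own plan prices:

```python
MONTHS = 5 * 12  # five-year horizon, matching the table

ahrefs_total = 129 * MONTHS   # $129/mo lite plan
semrush_total = 140 * MONTHS  # $140/mo
crawlgraph_total = 99         # one-time license

print(ahrefs_total, semrush_total, crawlgraph_total)  # 7740 8400 99
print(round(ahrefs_total / crawlgraph_total))         # 78
```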

[fig. 1 — top inbound domains for stripe.com, ranked by cg-authority (0–100): github.io 92, css-tricks.com 88, lobste.rs 86, algolia.com 84, web.dev 80, smashing.com 76, a16z.com 72. the same hit list ahrefs prints on their “referring domains” tab; pulled from common crawl, free.]

ahrefs and semrush ship more product than we do. they bundle a rank tracker, a site auditor, and a content explorer. crawlgraph reports backlinks. if you need everything in one tab, this isn't the tool. if you need the backlink layer, keep reading.

feature-by-feature

where the enterprise tools earn their keep is in the data freshness, the polish, and the breadth of indexes outside backlinks. where they don't is in the part that actually drives ranking decisions: which domains link to whom, at what scale, and how many distinct hosts within each linking domain.

the open common crawl publishes a 4.4B-edge hyperlink graph monthly. that's the same input ahrefs's core index draws from, plus their proprietary recrawler. for 90% of the questions you'll ask in a typical audit, the open graph is sufficient.


where the data comes from

this is where it gets fun. ahrefs maintains its own crawler — call it ahrefsbot — which visits roughly 8 billion pages a day. semrush has a similar fleet. their indices are bigger because they recrawl more aggressively.

crawlgraph runs on the public common crawl release, indexed as a quarterly composite (the live footer reads CC-MAIN-2026-jan-feb-mar). the open graph covers 120M unique domains; ahrefs reports ~250M. the difference is mostly long-tail spam domains and parked pages that you'd filter out anyway.

“the index doesn't have to be complete. it has to cover the domains that matter to your decision.” — pete, in a slack we had at 2am

one api call that replaces the dashboard

ahrefs renders “referring domains” as a chart you stare at. lifetime customers on crawlgraph hit a single rest endpoint and get the same rows back as json — pipe to csv, paste into a sheet, throw it at a notebook. one call returns up to 100,000 rows of backlinks per query.

```bash
curl -X POST https://crawlgraph.com/api/v1/backlinks \
  -H "Authorization: Bearer cg_live_…" \
  -H "Content-Type: application/json" \
  -d '{"domain": "yourdomain.com", "limit": 1000}'
```

the response is a flat list of linking domains plus a release id you can audit:

```json
{
  "domain": "yourdomain.com",
  "release_id": "CC-MAIN-2026-jan-feb-mar",
  "total_linking_domains": 4821,
  "returned": 1000,
  "results": [
    { "linking_domain": "blog.foo.com", "num_hosts": 12, "tld": "com" },
    { "linking_domain": "news.bar.org", "num_hosts": 4,  "tld": "org" }
  ]
}
```
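the “pipe to csv” step is a few lines of stdlib python. a sketch that flattens a parsed response like the one above — the `to_csv` helper is ours, not part of the api:

```python
import csv
import io

def to_csv(data):
    """Flatten a backlinks response dict into a CSV string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["linking_domain", "num_hosts", "tld"])
    writer.writeheader()
    writer.writerows(data["results"])
    return buf.getvalue()

# sample rows taken from the response shown above
sample = {
    "results": [
        {"linking_domain": "blog.foo.com", "num_hosts": 12, "tld": "com"},
        {"linking_domain": "news.bar.org", "num_hosts": 4, "tld": "org"},
    ]
}
print(to_csv(sample))
```

from there it's a straight paste into a sheet, or `pandas.read_csv` if you're already in a notebook.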

same call from python, if you'd rather pipe the rows into a notebook than through jq:

```python
import requests

# pull the first 1,000 referring domains for a single site
r = requests.post(
    "https://crawlgraph.com/api/v1/backlinks",
    headers={"Authorization": "Bearer cg_live_…"},
    json={"domain": "yourdomain.com", "limit": 1000},
)
r.raise_for_status()  # surface auth/quota errors instead of failing on parse

data = r.json()
print(data["total_linking_domains"], "referring domains")
for row in data["results"][:10]:
    print(row["linking_domain"], row["num_hosts"])
```

full reference at /docs/api. the lifetime tier includes 1,000 backlinks calls per month and 50 gap-analysis calls per month. auth is a bearer token starting with cg_live_ that you mint from the account page.
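for domains with more than 100k referring domains you'd need to page through multiple calls. a sketch of that loop — note the `offset` parameter is our assumption, so verify the actual name against /docs/api, and remember each page spends one of your 1,000 monthly calls:

```python
def fetch_all(fetch, domain, page_size=100_000):
    """Collect every backlink row by paging a fetch(domain, limit, offset) callable.

    `offset` is an assumed parameter name -- check /docs/api before relying on it.
    Taking `fetch` as an argument keeps this testable without hitting the network.
    """
    rows, offset = [], 0
    while True:
        batch = fetch(domain, limit=page_size, offset=offset)
        rows.extend(batch["results"])
        if len(batch["results"]) < page_size:  # short page means we've drained the index
            return rows
        offset += page_size
```

in production `fetch` would wrap the `requests.post` call shown earlier.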

why an api and not a console

we don't expose a sql console to customers. the underlying parquet would be painful to share access to safely, and a hosted query engine is a different product than a backlink tool. the api keeps the surface small: ask for backlinks, ask for a gap analysis, get rows back.
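the hosted gap-analysis call aside, the core computation is just a set difference you can also run client-side from two backlinks responses — domains that link to a competitor but not to you. a minimal sketch (this only sees the rows each response returned, which is why the dedicated endpoint exists for big domains):

```python
def link_gap(yours, theirs):
    """Referring domains the competitor has and you don't, from two backlinks responses."""
    mine = {row["linking_domain"] for row in yours["results"]}
    return sorted(
        row["linking_domain"]
        for row in theirs["results"]
        if row["linking_domain"] not in mine
    )

# toy responses for illustration
you = {"results": [{"linking_domain": "blog.foo.com"}]}
rival = {"results": [{"linking_domain": "blog.foo.com"},
                     {"linking_domain": "news.bar.org"}]}
print(link_gap(you, rival))  # ['news.bar.org']
```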


where ahrefs still wins

three places. they matter, and we won't pretend otherwise.

  • freshness. ahrefs recrawls every 15 minutes. common crawl publishes monthly and crawlgraph re-indexes on a quarterly composite. if you're tracking a fast-moving link-building campaign in real time, the lag matters.
  • rank tracking. we don't track serp positions. ahrefs and semrush both do, with regional segmentation. if that's central to your workflow, you'll need a separate tool.
  • site audit. the on-page audit (broken links, missing meta, duplicate content, etc.) lives in ahrefs's product, not in the link graph. crawlgraph doesn't replace it.

the verdict

if you want one dashboard that does everything, ahrefs and semrush are good products and they earn their fees from agencies who bill that cost back to clients.

if you want the backlink layer specifically — the rows, the exports, the unlimited domain checks via the web ui, the api — and you're willing to live with a quarterly-composite refresh cadence, $99 once is a defensible choice for the next decade of audits.

that's the pitch. the math is in the table. the api call is above. you can verify all of it before you spend a cent — the free tier returns the top 5 backlinks per domain with no signup, which is enough to sanity-check the index.
