Why the analogy matters and where it breaks.
SEO teams have a 15-year muscle memory around backlinks. Every team we audit still tracks referring domains, link velocity, and anchor text diversity. That muscle memory partly transfers, which is good. It also misleads, which is worse.
Both citations and backlinks are third-party signals that a source is worth attention. Both reward quality. Both reward freshness. Both punish link farms and AI-generated sludge. That is where the analogy holds.
The analogy breaks at scale. A backlink is a durable artifact on a third-party page, pointing to you, accumulated over time, measured in aggregate. A citation is a single instance of an LLM deciding, at the moment a user asks a question, that a source is credible enough to name. One is a stock. The other is a flow. The tactics you use to grow a stock (evergreen link-building, PR placements, broken-link outreach) only partially transfer to growing a flow.
Teams that treat citations as a backlink variant end up confused when their top-ranking pages do not get cited, or when pages they never optimized for Google start showing up in Claude's answers. The underlying ranking logic is different. Let us get specific about where.
Five structural differences between links and citations.
Difference 1: citations are query-conditioned, links are not.
A backlink to your page helps every search that might surface that page. A citation exists only in the context of a specific query. The same source can be cited for one question and ignored for the next, even if the topic is adjacent. This means citation share is measured per-query-cluster, not in aggregate.
Difference 2: citations are generated, not recorded.
Backlinks exist in a crawlable graph. Citations are produced at inference time by a model choosing which of its retrieved sources to attribute. You cannot scrape a definitive citation graph the way you can scrape a link graph. What you can do is run a representative query panel repeatedly and measure which sources get cited, how often, for which questions. That is citation monitoring, and it is more like polling than like SEO auditing.
Difference 3: anchor text is replaced by claim-context.
In SEO, the anchor text on an inbound link conveys topical relevance. In GEO, the equivalent is the claim-context around the citation: the sentence the LLM wrote that your source was used to support. That context tells you what the model thinks you are authoritative about. It is a direct signal, sharper than anchor text ever was.
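Claim-context can be pulled mechanically from a recorded answer: find the sentences that carry the citation marker. A minimal sketch, assuming bracketed numeric markers like `[2]`; each engine formats citations differently, so a real parser is engine-specific, and the answer text here is illustrative:

```python
import re

def claim_context(answer: str, marker: str) -> list[str]:
    """Return the sentences in `answer` containing the citation marker,
    e.g. '[2]'. Crude sentence split on . ! ? boundaries; good enough
    for monitoring, not production NLP."""
    sentences = re.split(r"(?<=[.!?])\s+", answer)
    return [s for s in sentences if marker in s]

answer = ("Average SaaS landing-page conversion is about 3.2% [1]. "
          "Top-decile pages convert above 11% [2].")
print(claim_context(answer, "[2]"))
# → ['Top-decile pages convert above 11% [2].']
```

Cluster these extracted sentences over time and you get a map of what each model believes you are a source for.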
Difference 4: authority compounds differently.
Backlink authority compounds through PageRank-style transitive signals. One very authoritative site linking to you passes a lot. Citation authority compounds through consistency: the more often your source is cited with similar framing across topically related queries, the more the model learns to associate you with that topical neighborhood. It is less about single prestigious mentions and more about consistent presence in a cluster.
Difference 5: the attack surface is narrower.
None of the tactics that dragged backlink manipulation into a war of attrition between SEOs and Google translate: not PBNs (private blog networks), not link exchanges, not sponsored-post link drops. An LLM is not crawling a link graph and counting edges. It is retrieving content, reading it, and deciding. Quality of source text matters in a way link quantity cannot substitute for. This is good news for operators who do honest work and bad news for the link-farm industry.
What gets cited: the three content shapes that win.
Across the 400-query citation study we ran, three patterns dominate the sources that got cited consistently. If you want citations, write content that falls into these shapes. If you are writing content outside these shapes and wondering why it does not get picked up, this is usually the reason.
Shape 1: primary-data pages.
Pages that present original data, original numbers, original research, original benchmarks. Not aggregations of other people's data. Models cite sources that appear to be the original, not the fifth rewrite. If you want citations on "what is the average conversion rate for SaaS landing pages," do not write a roundup article; run a study, publish the methodology, publish the data, and write the result.
Rate of citation in our dataset: primary-data pages get cited roughly 8x more often per topical impression than equivalently-ranked aggregation pages.
Shape 2: claim-evidence-source structure.
Pages whose structure makes it easy for a model to extract a claim, see the evidence that supports it, and identify the source. Well-constructed technical documentation, well-sourced essays, peer-reviewed content. If a paragraph makes a claim and the claim is immediately backed by a number or a quote with a named source, the claim is citable. If a paragraph makes a claim and moves on without support, it may be true but it is not citable.
Shape 3: canonical reference pages.
Pages that are widely understood as the authoritative reference on a topic. OWASP for web security. MDN for browser APIs. NIST for security frameworks. Not because they are prestigious in an abstract sense, but because enough adjacent content cites them as the authoritative source that the model has internalized the association. You can build a canonical reference in a niche by doing the work. The pattern: pick a narrow topic, write the most comprehensive reference on it, keep it updated, watch it get cited over 18 to 36 months.
What backlink tactics still work, and which die.
Let us be concrete about which SEO playbook moves transfer and which do not.
Still works: original research and data publishing.
Benchmarks, surveys, case studies, industry reports. These have always earned backlinks, and they earn citations too. The combined return is higher than either channel alone, because a report that earns backlinks tends to also earn citations, and vice versa.
Still works: authoritative long-form writing.
Deep essays, definitive guides, expert explainers. These rank well organically and they also get cited in LLM answers. The work is the same; the reward is now two channels instead of one.
Partially works: digital PR and thought-leadership placements.
A placed op-ed on a high-authority publication still earns a backlink. Whether it earns citations depends on whether the publication is in the model's high-trust reference set. Most industry trade press is. Most branded content platforms are not. Evaluate before you spend.
Dies: link exchange and PBNs.
These were already dying under Google's newer algorithms. LLMs kill them the rest of the way. Models are not counting edges.
Dies: paid guest-post networks.
The "pay a service $200 and they will place your post on 50 no-name blogs" tactic. Does not move the citation needle. Does not even reliably move backlink rankings anymore.
Dies: AI-generated doorway content at scale.
Sites that published a thousand thin pages to capture long-tail backlinks are not the sources that LLMs cite. The model preferences we see in our citation data show clear selection against low-information-density content, regardless of whether it technically ranks on page 1.
How we measure citation share for clients.
Every client on an SEO or GEO retainer has a citation panel: 80 to 200 queries, monitored weekly across Claude, ChatGPT, Perplexity, and Gemini. The queries are a mix of "where do we currently rank organically" and "what do we want to be cited for in the next 12 months."
Metrics we report:
- Citation share per cluster. The fraction of a topical cluster's queries on which the client's domain appears in the cited-sources list. Tracked per engine.
- Citation context score. When cited, does the claim-context frame the client positively, neutrally, or as a cautionary example? Scored by an LLM-as-judge against a rubric.
- Cluster dominance. Across all queries in a cluster, what share of total citations does the client hold vs. top competitors.
- New-query penetration. Queries the client is newly being cited on this month vs. last.
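Given panel results stored as rows with engine, query, cluster, and cited domain, the counting metrics above reduce to a few lines. A sketch with illustrative field names; the real panel rows carry more columns:

```python
from collections import defaultdict

def citation_share(rows, client, cluster, engine):
    """Citation share per cluster: fraction of the cluster's queries
    (for one engine) on which `client` appears among cited domains."""
    queries = defaultdict(set)  # query -> set of cited domains
    for r in rows:
        if r["cluster"] == cluster and r["engine"] == engine:
            queries[r["query"]].add(r["domain"])
    if not queries:
        return 0.0
    hits = sum(1 for domains in queries.values() if client in domains)
    return hits / len(queries)

def cluster_dominance(rows, client, cluster):
    """Share of all citations in the cluster held by `client`."""
    cited = [r["domain"] for r in rows if r["cluster"] == cluster]
    return cited.count(client) / len(cited) if cited else 0.0

def new_query_penetration(this_month, last_month, client):
    """Queries the client is cited on this month but not last month."""
    now = {r["query"] for r in this_month if r["domain"] == client}
    before = {r["query"] for r in last_month if r["domain"] == client}
    return now - before
```

The context score is the one metric that is not pure counting; it needs the LLM-as-judge pass described above.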
The panel data lives in a dashboard format we have published as a sample. Clients get their own view, updated weekly.
The headline to internalize: citation share is the real scoreboard now. Organic rankings still matter for attribution and for the long tail of human searches, but the decisive traffic is increasingly the session that ends in an answer, not a click. If you are not measuring citation share, you are not measuring your discoverability. You are measuring yesterday's proxy for it.
Citations are the new backlinks. The currency has shifted. The tactics have shifted. The measurement has shifted. The firms that figure this out in the next 12 months will own the share-of-answer in their categories for the next five years.
Operator-tone writing on Applied AI, Security, SEO, and Economics.
One essay per week. No hype. No tracking pixels. Unsubscribe in one click.