Write Insight Newsletter · 8 min read

The silent threat of citation hacking to academic integrity

(And how you can fight back)

[Image: Dark ninja facing a spirit]
The ninjas are coming for your honest citations.

I still remember the email like it was yesterday. “Congratulations, your paper has been accepted pending minor revisions.” After months of hard research and writing, the validation felt incredible—until I scrolled down to the reviewer comments.

“Please consider including references to the following papers…”

While I don’t mind including a good pointer to relevant related research that I might have missed, something felt off about this request. Only one of the suggested references was directly related to the research in the paper. The rest of them: Not relevant. And all of them had the same author in the author list. The air was thick with the aroma of scientific shenanigans.

Alas, I had just encountered citation hacking, and I wasn’t alone. Existing evidence points to nearly 16% of scientists engaging in reference list manipulation to some degree. What began as a personal frustration turned out to be part of an epidemic threatening the very foundation of scientific discourse. And I’m not having it.

What exactly is citation hacking?

Citation hacking goes beyond the occasional self-citation. (Hey, we’ve all been the lonely postdoc who knew they were doing good work in an area where not many people are paying attention. It’s okay to give yourself a little lift, as long as the citations to your more obscure work are relevant.) But citation hacking is different, and it’s systematic: the deliberate manipulation of academic citations to artificially inflate impact metrics, which often matter greatly for career advancement. This manipulation takes several forms:

  1. Self-citation inflation. Excessively citing your own work beyond what’s relevant. Germans have a word for what you are trying to be: Platzhirsch. (Meaning: a dominant stag in its territory, or as the Americans say: Top Dog. Woof.)
  2. Citation cartels. Groups of researchers who agree to cite each other’s papers regardless of relevance. Sure, they ain’t quite Sicarios, but still pretty f%$cking scary for junior researchers.
  3. Coercive citation. Reviewers or editors demanding their work be cited as a condition for publication. We’ve all seen the now-famous sentence from a retracted article: “As strongly requested by the reviewers, here we cite some references [35–47] although they are completely irrelevant to the present work.” (I wrote about it on LinkedIn.) This is a real threat that is all too common.
  4. Metadata manipulation. The newest threat—inserting references in metadata that don’t appear in the actual text.

But why does this happen? The academic reward system itself creates perverse incentives. When scientists are evaluated on simple citation counts rather than the quality or impact of their work, some inevitably game the system.

How to handle citation requests

Challenge coercive citation requests

That day when I received those reviewer comments, I panicked. I mean, I was a postdoc. My first instinct was to comply: just add the citations and get it published. And I did. Many people do this every day because it’s frictionless and gets the article published. But that’s exactly what perpetuates the problem. So, the next time I faced such a request, I was a bit more senior, and I took a different approach:

  1. Document everything. Save all communications showing citation demands. I didn’t need it, but you might.
  2. Contact the handling editor (or editor-in-chief) directly. I wrote a detailed explanation of why the suggested citations weren’t relevant, providing specific reasons rather than general objections.
  3. Propose alternatives. I also suggested more relevant citations from a few authors that would strengthen the paper.

The result? The handling editor overrode the reviewer’s demands (and in my heart, I hope they eventually blacklisted them if this kept happening). My experience taught me that editors often stand with authors who provide well-reasoned explanations against citation coercion. So, you don’t have to give in to such requests.

Maintain your citation integrity

When writing your own papers, apply what I call the “stranger test” to every citation: “Would I cite this paper if it were written by someone I didn’t know?”

This simple question cuts through potential bias, especially for self-citations. Self-citation itself isn’t inherently problematic, I believe, because your current research likely builds on your previous work. That’s normal and nothing to be afraid of. The key distinction lies in relevance and proportion. You want to keep things balanced.

I’ve found that keeping self-citations under 20% of total references provides a good rule of thumb. When I go beyond that threshold, I reconsider each citation critically. Is it really necessary or am I just making myself feel good here?
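The 20% rule of thumb is easy to automate. Here is a minimal sketch of such a check; the reference format, author names, and threshold are all illustrative assumptions, not a real tool:

```python
# Hypothetical sketch: estimate the share of self-citations in a reference
# list. References are dicts with a list of author names; the 20% threshold
# is the rule of thumb from the text, not an official standard.

def self_citation_ratio(references, my_names):
    """Fraction of references that include any of the author's own names."""
    if not references:
        return 0.0
    self_cites = sum(
        1 for ref in references
        if any(name in ref["authors"] for name in my_names)
    )
    return self_cites / len(references)

refs = [
    {"title": "Prior work A", "authors": ["J. Doe", "A. Smith"]},
    {"title": "Related work B", "authors": ["B. Jones"]},
    {"title": "Prior work C", "authors": ["J. Doe"]},
    {"title": "Survey D", "authors": ["C. Lee"]},
]

ratio = self_citation_ratio(refs, my_names=["J. Doe"])
print(f"Self-citation ratio: {ratio:.0%}")  # 2 of 4 references → 50%
if ratio > 0.20:
    print("Above the 20% rule of thumb: re-apply the stranger test.")
```

In practice, name matching is messier than exact string comparison (initials, transliterations, ORCID iDs), but even this crude ratio is enough to trigger a critical second look.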

Systemic safeguards for journal editors

Your citation metrics may look great until you realize they are artificially inflated. Don’t wait for a late wake-up call that leaves your entire journal facing serious integrity challenges. Here are some safeguards you can implement right away if you’re the editor-in-chief of a scientific journal:

Effective and explicit policies

Create explicit policies prohibiting citation manipulation. The most effective policies do the following:

  1. Define manipulation clearly. Specify that it includes artificially inflating citations for benefits other than situating the work in the literature. For example, include wording like “practices aimed at artificially boosting citation numbers for individual benefit.”
  2. Differentiate legitimate from illegitimate practices. Acknowledge that self-citation may be necessary to avoid self-plagiarism or provide research context, but establish clear boundaries.
  3. Implement zero-tolerance for coercion. When the editor-in-chief discovers a reviewer demanding citation of their papers, they should immediately blacklist the reviewer. You want to send a powerful message with a decisive action to establish the reputation of your journal. (Realistically, I don’t see this happening often, as we face a crisis of qualified reviewers as it is.)

Practical monitoring techniques

Excellent guidelines for preventing coercive citations exist. I liked the idea of a checkbox that says “I have requested citation(s) to my own research,” which quickly identifies reviewers who request citations to their own work so you can check whether those requests are legitimate. Another idea is a simple algorithm that flags any paper receiving more than three citations from a single reviewer. I think AI could easily help with this, too.
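That flagging algorithm can be sketched in a few lines. This is a hypothetical illustration with made-up reviewer data and the threshold of three from the text, not an existing editorial tool:

```python
from collections import Counter

# Hypothetical sketch: flag authors who appear suspiciously often among
# the references a single reviewer suggests. The threshold of 3 follows
# the rule of thumb mentioned in the text.

def flag_citation_pushers(suggested_refs, threshold=3):
    """Return {author: count} for authors suggested more than `threshold` times."""
    counts = Counter(
        author for ref in suggested_refs for author in ref["authors"]
    )
    return {author: n for author, n in counts.items() if n > threshold}

review_suggestions = [
    {"title": "Paper 1", "authors": ["R. Viewer"]},
    {"title": "Paper 2", "authors": ["R. Viewer"]},
    {"title": "Paper 3", "authors": ["R. Viewer", "X. Coauthor"]},
    {"title": "Paper 4", "authors": ["R. Viewer"]},
    {"title": "Paper 5", "authors": ["Y. Other"]},
]

print(flag_citation_pushers(review_suggestions))  # {'R. Viewer': 4}
```

A flag is only a prompt for human judgment: a reviewer may legitimately dominate a narrow subfield, so the editor still has to check relevance by hand.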

Other effective checks for editors include:

  1. Screen for citation anomalies between submission versions. Look for sudden increases in citations to particular authors or journals after a review.
  2. Cross-check reviewer-suggested citations against the reviewer’s publication history. This simple step reveals potentially self-serving citation requests.
  3. Implement citation analysis tools that detect unusual citation patterns. These tools should flag statistically improbable citation distributions that warrant further investigation. (Given how much money publishers make, these tools should already exist.)
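The first check above, screening for citation anomalies between submission versions, amounts to diffing two reference lists and grouping the additions by author. A minimal sketch, with invented data and a deliberately simple title-based diff:

```python
from collections import Counter

# Hypothetical sketch: compare the reference lists of two submission
# versions and count newly added references per author, to spot a
# post-review spike of citations to one person.

def added_citations_by_author(refs_v1, refs_v2):
    """Count references per author that appear in v2 but not in v1."""
    before = {ref["title"] for ref in refs_v1}
    added = [ref for ref in refs_v2 if ref["title"] not in before]
    return Counter(author for ref in added for author in ref["authors"])

v1 = [{"title": "Original ref", "authors": ["C. Lee"]}]
v2 = v1 + [
    {"title": "New ref 1", "authors": ["R. Viewer"]},
    {"title": "New ref 2", "authors": ["R. Viewer"]},
]

print(added_citations_by_author(v1, v2))  # Counter({'R. Viewer': 2})
```

Cross-checking the flagged names against the reviewer’s publication history (step 2 above) then tells the editor whether the spike is self-serving.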

Verification that can’t be gamed

Require justification for suggested citations during review. When reviewers recommend additional citations, ask them to explain specifically how each citation strengthens the manuscript. This single practice dramatically reduces frivolous citation requests.

Verify that citations added during revisions actually support the claims they’re attached to. Random spot-checking of citations catches many instances of inappropriate addition.

What’s really at stake

The real measure of research impact isn’t citation count. It’s how your work advances knowledge. Citation hacking distorts not only metrics but the very nature of scientific progress.

These practices are clearly a product of the career system we are locked into. But once you stop chasing citations, you’ll realize you have all this new headspace to ask the questions that really matter to your field. Losing that space for more creative solutions is the deeper cost of our citation obsession. And finding those answers, I’m sure, is why you became a scientist in the first place.

We need systemic change in how we evaluate research quality. The current overemphasis on citation metrics creates conditions where citation hacking thrives. Journal editors and tenure committees can lead this change by:

  1. Publishing editorials that publicly condemn citation manipulation
  2. Highlighting exemplary papers with focused, relevant citation practices
  3. Implementing alternative impact metrics beyond raw citation counts

Break the cycle

When I mentor young researchers now, I make the case that citation integrity is essential to our research practice. Education is our most powerful preventative tool. Research from Indonesia found that training in citation practices significantly reduced manipulation among students. The same principles apply to established researchers.

Citation hacking threatens the integrity of academic discourse. But we can combat it through vigilance and systemic change. As researchers, we must stand firm against coercive practices. Journal editors need to implement robust policies and verification mechanisms. And honestly, academic publishers, in their often infinite greed, should be throwing money at this.

Beyond these specific strategies, the academic community must reconsider how it measures and rewards research impact. When tenure committees stop obsessing over citation counts and focus on genuine knowledge advancement, we remove the incentives that drive citation manipulation in the first place. But this is hard work, because there are no easy metrics for assessment, and I know senior academics are overworked as it is.

What citation practices have you witnessed in your field? Have you encountered citation hacking personally? Let me know.

Video and Cheat Sheet

Here is the video of my keynote talk “Who is even a researcher in the post-AI world?” at the Litmaps conference where I talk about the challenges of doing ethical research work in the post-AI world. I am also attaching a PDF cheat sheet as a defense guide for researchers facing citation hacking.
