Hacker News
Elsevier may wish they had checked the revision a bit more carefully (nodebb.org)
84 points by sndean 2 days ago | hide | past | favorite | 29 comments





> Clear beneficiary is Alex V. Trukhanov, publishing around 50 papers per year since 2018. Notice in particular how he has 17,528 citations in only 5,601 citing papers: more than 3 citations to him in each citing paper, on average. This is a clear indication of either a giant and seminal figure in their field, or citation manipulation.

https://x.com/Dr_5GH/status/1855306578293068005

https://pubpeer.com/publications/1924F147DE045B97261004EB238...

Note there's also a Sergei V. Trukhanov on most of the artificially cited papers.
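The arithmetic behind that flag is simple enough to sketch. A minimal illustration (numbers are the ones from the quoted tweet; the ~3 cutoff is just the commenter's rule of thumb, not any standard threshold):

```python
# Citations-per-citing-paper ratio: a crude signal for citation stacking.
# Figures below are from the quoted tweet; the ~3 cutoff is illustrative only.

def citations_per_citing_paper(total_citations: int, citing_papers: int) -> float:
    """How many times, on average, each citing paper cites this author."""
    return total_citations / citing_papers

ratio = citations_per_citing_paper(17_528, 5_601)
print(f"{ratio:.2f} citations per citing paper")  # ~3.13
```

A typical citing paper references a given author once or twice; averaging above 3 across thousands of citing papers is what makes the profile stand out.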


So, am I reading this correctly?

Researchers try to get a paper published, but the publisher asks the authors to add citations that weren't used in the research, presumably to inflate the citation counts of certain 'blessed' papers.

The authors add the citations, but also add a note that the citations were not used and were requested by the publisher.

If this is all correct, I'm curious what the publisher's relationship is to those cited papers. Were the citations paid for? Something else?


The publisher operates a peer-reviewed journal. The process is that the author(s) submit a paper to the journal, which is briefly evaluated by an editor and then undergoes a review process. Journal editors maintain a list of reviewers made up of past reviewers, authors, etc. For example, if you are publishing a paper on topic X, the editor will try to assign it to reviewers who are knowledgeable and specialized in topic X.

Papers are usually reviewed by ~3 reviewers who can ask for revisions. Reviewers are typically anonymous to the authors, although the authors are usually not anonymous to the reviewers. If the reviewers ask for revisions (most common), the authors can revise the paper. This can go back and forth numerous times.

Reviewers can be professors, PhD students, etc. and are paid by the journal for their time. There are many ways to manipulate the system. Reviewers can block or slow the publication of a rival, or they can suggest changes that benefit themselves (e.g. quid pro quo). Often this isn't so blatant and the line can be very blurry.

The publisher and editor typically don't care much about the politics and conflicts of interest.


> Reviewers can be professors, PhD students, etc. and are paid by the journal for their time.

Most journals do not pay for reviewers' time. There are some experiments with paying reviewers, but I would say 99% of the reviews out there are done... "out of the goodness of the reviewer's heart".

There are indirect benefits to being a reviewer, such as early access to unpublished work, "goodwill" with the editor, etc.


Reviewers don't get paid (it's usually only the editor-in-chief who gets some salary).

This is often one of the more blatant ways the review process is de-anonymized. If a reviewer comes back saying you need to cite 3 papers all with the same author, they're probably that author, especially if they're only tenuously connected to the paper.

No, not the publisher. The reviewers: other researchers.

On the other hand, the editor employed (or at least appointed) by Elsevier did pass these on...

That's what makes it fun. Reviewer-coerced citations are almost ubiquitous, at least in their mildest forms, but it's rarely admitted to like it is here. I bet many editors never reread papers and just go by the reviewers' word. They were probably told the paper was fine for publication with some added citations: the authors added them, the editor and reviewers saw the citations in the reference list, and off it went to be published.

LGTM!

Trust the science! Is this peer reviewed?

Lmao


The usual points

- Elsevier is bad

- Reviewers are not paid 99% of the time

- Editors choose reviewers

- Editors are supposed to manage conflicts of interest and avoid this

- The editor-in-chief is probably the only paid position in the peer review process

- The academic publishing industry has higher margins than the tech industry

- Publishers don't have enough incentive to limit the number of publications in favor of quality

- Again, Elsevier is evil


This sort of "citation for acceptance" trading happens everywhere. It's not just Elsevier.

Where did I point out that this is just Elsevier?

any source for "everywhere"? aka "science can't be trusted", hinting at "anywhere"?

https://en.wikipedia.org/wiki/Sealioning

Stop asking for sources to try to prove that someone is wrong. It proves nothing.

Every time I have submitted a paper, I have dealt with this. Every scientist I know has dealt with this.


TIL "Sealioning"

which it wasn't. we're discussing scientific culture.

OP is pointing out a problem.

and now you say always and everywhere which is a strong claim. happens sometimes, ok. happens a lot, okay.

but "everywhere", "always"?

also it would be genuinely interesting to get pointers on meta research, evidence based statements about the trustworthiness of X as in scientists, institutions, papers, publishers, ...

I ask for that and you respond "Sealioning"

shrug

have a grand evening


I've argued on the internet enough to know where the "source?" discussion goes. Nipping bad faith arguments in the bud is a good idea, in general, and the best way to do that is to call the argument out for what it is.

It's also a good idea not to try to start them, but I don't control your behavior. Doubling down on your bad faith arguments is not a good look, either.

If you would like to have a discussion with long-form essays and cited sources, start by bringing some. In other words, if you want other people to put in effort, put some in first.

Here's my anecdata: I have submitted 6 papers for review and had 7 reviewers request a citation of one sort or another.


I get the logic and I get where you come from.

I come from a place where of all "information sources" scholar.google.com is still one of the slightly more trustworthy ones. the review process has its quirks, but it's better than no peer review at all.

in that world, where academic peer review appears to be a last "bastion" of reliability, a statement like the one that started this thread is something I feel the urge to "nip in the bud".

if we were in a room I'd suggest we have beer of sorts and a good laugh about the irony of this thread.

I think you need to be careful not to initiate the very rhetoric game you want to fight. what if your top-level statement had been your anecdata, plus maybe a question about the pervasiveness of this phenomenon and whether there is nuance? like, if the journal asks an expert in the field for a review and they do know relevant papers, maybe indeed some of their own, a citation request may be just fine?

I'd have said nothing. just nothing.

but the thread starter overgeneralized the missing trustworthiness of the scientific peer process as a whole.

and there I dared to ask: is that really so? should we stop trusting published science, in general, and "do our own research" ;-)

do you see, where I come from? do you see the irony of this thread?

and can we have a figurative beer?


Personally I dislike links that force me to do a bunch of work to figure out the gist of the post. In this case, it seems like a link to a link to a comment on an academic paper in which the author is frustrated about irrelevant citation requests? Maybe I misunderstood? I would love for submissions like this to include an explanatory snippet to provide context.

It appears a reviewer for the paper requested that their own works be added as citations to the paper for no justifiable reason beyond elevating their own citation count.

Goodhart's law on citation count... it's too easy to game. It seems many people do this, and it works for them, because even if their peers know who is doing it, they're still making bank; see Didier Raoult's paper mill.

See also: The Retraction Watch Leaderboard. Who has the most retractions?

https://retractionwatch.com/the-retraction-watch-leaderboard...


Almost nobody is “making bank”.

Not in dollars, no, but a surprisingly large number of tenure decisions boil down to "does this researcher have a lot of citations to their work?" as a measure of the quality of their research.

Very true! And the unfortunate ones who get tenure, well, they miss out on making bank like the ones who leave academia for industry positions.

Fine, they are fighting like dogs over scraps. Is that a better idiom? Either way the system is broken.

Relevant context[1] on the underlying meta.

[1] https://technicaeditorial.com/cash-for-citations-the-newest-...


This particular form of citation farming, asking for citations during peer review (in exchange for a favorable review of the paper), is considered "acceptable" despite being scummy. Citation selling, which seems broadly equivalent, is very much not. This method is also a lot older and more widespread.

This example is one of the entries into the rabbit hole called "science".


