The extent to which predatory journals can harm scientific practice increases as the number of such journals expands, in so far as they undermine scientific integrity, quality, and credibility, especially if those journals leak into prestigious databases. Journal Citation Reports (JCRs), a reference for the assessment of researchers and for grant-making decisions, is used as a standard whitelist, since the selectivity of a JCR-indexed journal adds a legitimacy of sorts to the articles that the journal publishes. The Multidisciplinary Digital Publishing Institute (MDPI), once included on Beall’s list of potential, possible, or probable predatory scholarly open-access publishers, had 53 journals ranked in the 2018 JCRs annual report. These journals are analysed not only against the formal criteria for the identification of predatory journals but also, taking a step further, with regard to their self-citations and the sources of those self-citations in 2018 and 2019. The results showed that the self-citation rates increased and were very much higher than those of the leading journals in their JCR categories. In addition, an increasingly high rate of citations from other MDPI-journals was observed. The formal criteria, together with the analysis of the citation patterns of the 53 journals under analysis, all singled them out as predatory journals. Hence, specific recommendations are given to researchers, educational institutions, and prestigious databases, advising them to review their working relations with those sorts of journals.
Introduction
The journal Nature recently published a definition of the predatory journal (Grudniewicz et al. 2019),
a milestone that highlights the increasing concern within academia about these pernicious journals, which exploit the gold open-access publication model to the utmost, generating enormous financial gain ‘which appears to be the main criteria for publication’ (Frandsen 2017). Predatory journals, harmful to academia and science, ‘sow confusion, promote shoddy scholarship and waste resources’ (Grudniewicz et al. 2019)
and therefore jeopardize integrity in science. Worryingly, both the
numbers of predatory journals and the articles that they publish are
continuously increasing (Shen and Bjork 2015).
In
the gold open-access model, reading the publications is free and the
publication costs, collected through the Article Processing Charge
(APC), are borne by the authors, their institutions, and funding bodies. A predatory journal will exploit this model to its own benefit with a non-existent or practically non-existent peer-review process (Beall 2015; Frandsen 2017; Demir 2018),
which permits the rapid publication of academic papers without due
guarantees, with an associated risk of publishing pseudo-science. At the
same time, if there is a lack of awareness of predatory journals among scientists, then they will evaluate those publications as if they were legitimate and may naively submit genuine papers to predatory journals. At worst, however, authors may submit to them intentionally, with the double
effect of ‘polluting the scientific records and perversely advancing the
careers of researchers’ (Cortegiani et al. 2020).
Selective
databases, such as Scopus, PubMed, and Journal Citation Reports (JCRs),
form an index of journals, a sort of whitelist that is used for the
purposes of assessing researchers and taking decisions on grant funding (Cortegiani et al. 2020; Siler 2020). However, some articles from some predatory journals are in fact indexed, both in PubMed (Manca et al. 2017a, b)—an alarmingly high number of them in the opinion of Manca et al. (2020)—and in Scopus (Hedding 2019; Cortegiani et al. 2020b).
Their newfound legitimacy means that any citations will, in consequence, raise the productivity metrics (e.g. the h-index) of their authors, generating ‘inflated curricula and doped academic careers’ (Cortegiani, Manca and Giarratano 2020a).
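For reference, the h-index mentioned above is the largest number h such that an author has h papers with at least h citations each. The minimal Python sketch below, using hypothetical citation counts, illustrates how additional indexed and cited papers lift the metric.

    # h-index: the largest h such that at least h papers have at least
    # h citations each. The citation counts below are hypothetical.
    def h_index(citation_counts):
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))        # -> 4
    print(h_index([10, 8, 6, 5, 5, 4, 3]))  # -> 5, after two further cited papers

Any such computation counts publications in indexed predatory journals exactly like legitimate ones, which is the mechanism behind the ‘doped’ careers described above.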
This
investigation is centred on JCRs, perhaps the most prestigious and best-recognized database in academia and the one most widely used at a global level, in order to analyse the Multidisciplinary Digital Publishing Institute
(MDPI). This mega-publisher appeared on Beall’s list and was
subsequently excluded. Moreover, the Norwegian Register for Scientific
Journals, Series, and Publishers downgraded MDPI to 0 in 2019 and later
upgraded it to 1 again. These facts suggest that MDPI has been open to
question, a dubious publisher that has been moving within a ‘grey zone’.
It deserves further analysis, which will help us to determine
whether it is ‘using a broad range of questionable tactics that are
neither illegal nor easy to detect’ (Manca, Cugusi and Deriu 2019).
Against
that backdrop, the objective of this study is to analyse the behaviour
of 53 MDPI-journals that were indexed in the JCRs in 2019, in order to shed light on how they should be classified and to elucidate whether these journals are in fact predatory. Their characteristics are therefore examined, to see whether they match certain definitions of predatory journals. No longer merely a medium for dissemination, scientific
journals are now a key foundation for appointments and funding in
scientific research (Shu et al. 2018).
The use of JCR has been extended to the evaluation of academics and institutions of all types, legitimizing the journals that are indexed: those who evaluate the publications included in scholarly records when taking decisions on promotion, tenure, grants, and so on use JCR indexing as a proxy for both quality and integrity. This analysis of the practices of MDPI is therefore of relevance to researchers, research institutions, and funding bodies, as well as to JCR itself, which could see its prestige compromised if it were to incorporate predatory journals among those that it indexes.
Predatory journals
Although some have proposed alternative terms, such as pseudo-journals (Laine and Winker 2017; Elmore and Weston 2020), fake journals (Demir 2018), deceptive journals (Elmore and Weston 2020), and opportunistic journals (Bond et al. 2019), the term predatory journal is undoubtedly the most widespread in academia and appropriately describes this malpractice (Manca et al. 2020). The librarian Jeffrey Beall, now retired from the University of Colorado, coined the term to identify journals that, overlooking quality peer-review processes, seek to generate income exclusively through the APCs that authors are expected to pay, and that target those authors with misleading information on citation indexes and with spam-related marketing (Beall 2012; Laine and Winker 2017).
Predatory journals are a global threat to science (Harvey and Weinstein 2017; Grudniewicz et al. 2019; Strong 2019), because they undermine its integrity (Vogel 2017; Abad-García 2019), its quality, and its credibility (Bond et al. 2019).
They are, in short, a threat to society as a whole, because whenever the articles that they publish are indexed in selective databases, as is the case with PubMed, ‘the items achieve global exposure and are interpreted by readers, including patients, as trustworthy’ (Manca et al. 2019), without those articles having undergone an acceptable editorial and peer-review process. Cortegiani et al. (2020b) observed that journals discontinued from Scopus (due to publication concerns) continue to be cited even after their discontinuation, citations that may provide flimsy support for career development. In addition, publication in a predatory journal implies the squandering of valuable resources: people, animals, and money, as Moher et al. (2017) have reminded us. Lastly, predatory journals are a threat to scientists, who may endanger their careers and devalue their curricula by publishing in them.
The
alarming increase in the number of predatory journals (from 1,800 to 8,000 over the period 2010–14) and the exponential growth (from 53,000 to 420,000 between 2010 and 2014) in the number of articles that they publish (Shen and Bjork 2015) have rendered futile any effort to keep whitelists and blacklists updated.
These lists very soon become outdated and incomplete, especially if the
resources to keep them updated are scarce. Even so, the identification
of predatory journals is still a crucial aspect in the maintenance of
quality and scientific integrity. However, the reality is that this
process is by no means simple, as Aromataris and Stern (2020)
accurately indicated, particularly because ‘predatory publishers have
continued to evolve their undesirable art form into sophisticated
operations that appear to be, at face value, legitimate’ to the point
where ‘certain journals and publishers may blatantly exploit “gray”
strategies given that downmarket niches can be lucrative’ (Siler 2020).
The
first attempt at identifying predatory journals was Beall’s list,
although it eventually disappeared in January 2017 (a cached copy with a
new updated section is maintained anonymously at https://beallslist.net/).
Given the immense difficulty of keeping a list of predatory journals updated, the use of one of the many available checklists, such as ‘Think.Check.Submit’ (https://thinkchecksubmit.org/), is encouraged. Likewise, Cabells’ blacklist and whitelist, now referred to as Predatory Reports and Journalytics, respectively (https://blog.cabells.com/2020/06/08/announcement/), listed more than 12,000 predatory journals in October 2019 (https://blog.cabells.com/2019/10/02/the-journal-blacklist-surpasses-the-12000-journals-listed-mark/). Even though it is also behind a paywall, it may serve as an additional resource for identifying predatory journals.
In any case, the first step towards identifying predatory journals is to have a clear definition that allows their definitive identification. The criteria for the identification of a predatory journal and the list of suspicious items are lengthy: journal names may be very similar to those of prestigious journals; the web page may contain spelling errors, questionable grammatical constructions, and/or low-quality images; the language on the journal webpage may resemble a ‘hard sell’ that targets academic authors; the journal may include articles outside its stated scope or may have a very broad scope; submission may be by email instead of through a manuscript management system; the editor-in-chief might also act as the editor-in-chief of another journal with a widely different scope; editorial board members from developing countries may predominate; timelines for publication and fast-track peer-review processes might appear unrealistic; APCs can be low; impact-factor metrics may be unknown; spam emails may invite academics to submit papers; despite the open-access approach, transfer of copyright may be required; and, finally, non-professional or non-journal-affiliated contact information may be given for the editorial office (Manca et al. 2018; Committee on Publication Ethics 2019; Gades and Toth 2019; Kisely 2019; Vakil 2019; Elmore and Weston 2020; Kratochvíl et al. 2020).
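To make that screening logic concrete, the following minimal Python sketch tallies how many of these warning signs a journal exhibits. The criterion names and the example profile are hypothetical illustrations, not part of any formal instrument.

    # Illustrative red-flag tally against the checklist summarised above.
    # The criterion names and the example profile are hypothetical.
    CRITERIA = [
        "title_mimics_prestigious_journal",
        "web_page_spelling_or_grammar_errors",
        "hard_sell_language_targeting_authors",
        "articles_outside_scope_or_very_broad_scope",
        "submission_by_email_only",
        "editor_in_chief_of_unrelated_journal",
        "unrealistic_publication_and_review_times",
        "unknown_impact_factor_metrics",
        "spam_invitations_to_submit",
        "copyright_transfer_despite_open_access",
        "non_professional_editorial_contact_details",
    ]

    def red_flag_count(profile):
        """Count how many checklist criteria a journal profile satisfies."""
        return sum(1 for criterion in CRITERIA if profile.get(criterion, False))

    # A hypothetical journal showing three warning signs.
    example = {
        "hard_sell_language_targeting_authors": True,
        "spam_invitations_to_submit": True,
        "unrealistic_publication_and_review_times": True,
    }
    print(red_flag_count(example))  # -> 3

As the next paragraph argues, no single flag, and no raw count of flags, is decisive on its own; such a tally can only prompt closer scrutiny.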
The problem is that these criteria, above all if taken in isolation, are questionable. For example, the APC can also be higher than 1,000 USD (as happens with OMICS); there is no specific limit on the number of editorial board members from developing countries that properly distinguishes between legitimate and predatory journals; whether the content of a web page appears dubious is a matter of judgement; and titles may inevitably be mimicked when the journal’s specialism is very narrow (Kratochvíl et al. 2020).
It
is therefore essential to define the concept. The Committee on
Publication Ethics (COPE) (2019) clarified that predatory publishing
‘generally refers to the systematic for-profit publication of
purportedly scholarly content (in journals and articles, monographs,
books, or conference proceedings) in a deceptive or fraudulent way and
without any regard for quality assurance [… so] these journals exist
solely for profit without any commitment to publication ethics or
integrity of any kind’.
The COPE definition of predatory journals is no different in essence from the definition of Grudniewicz et al. (2019):
‘predatory journals and publishers are entities that prioritize
self-interest at the expense of scholarship and are characterized by
false or misleading information, deviation from best editorial and
publication practices, a lack of transparency, and/or the use of
aggressive and indiscriminate solicitation practices’. It should be pointed out that, despite the significant advance that the definition proposed by Grudniewicz et al. (2019) represents for recognizing predatory journals (and for not falling prey to them), it nevertheless omits any express reference to the quality of peer review. In spite of its important role in science, that aspect was considered too subjective, partly because, as with journal quality and deceitfulness, it is impossible to assess (Grudniewicz et al. 2019; Cukier et al. 2020), and so it was excluded from an objective definition.
It
is essential that researchers correctly identify predatory journals, so
as to avoid both serious personal setbacks (an at-risk reputation,
disqualifying marks for tenure, responsibility for unethical publishing,
resources wasted on APCs, loss of legitimate data and research results,
and, in relation to medical publishing, even placing patient safety at
risk) and scientific consequences (dilution and distortion of evidence
in systematic reviews, deterioration of scientific credibility and
integrity, doping of academic careers, loss or return of research
funding) (COPE 2019; Gades and Toth 2019; Pearson 2019; Cortegiani et al. 2020; Hayden 2020).
Multidisciplinary Digital Publishing Institute (MDPI)
MDPI, with its headquarters in Basel (Switzerland) and formerly known as Molecular Diversity Preservation International (https://www.mdpi.com/about/history), launched its first two journals (Molecules and Mathematical and Computational Applications) in 1996 and operates a gold open-access framework. In 1996, 47 articles were published in those two journals; since then, the numbers of articles and journals have progressively increased and have undergone exponential growth over recent years. In 2019, 106,152 articles were published in its 218 journals, an increase of 64.1% over 2018, and 137 of those 218 journals were indexed in the Web of Science (WOS) (in the Science Citation Index Expanded, the Emerging Sources Citation Index, and the Social Sciences Citation Index) (MDPI 2020). Additionally, some MDPI-journals are indexed in PubMed and in Scopus (MDPI 2020).
According to the MDPI Annual Report 2019 (MDPI 2020),
these 218 journals are supported by 67,207 editors (an increase of 55.78% over 2018), with a median time from submission to publication of 39 days (a 22% decrease compared with 2018) and APCs ranging from 300 to 2,000 CHF (1 Swiss Franc is approximately equal to 0.92 Euros), with a median of 1,525 CHF. MDPI’s founder and current president is Shu-Kun Lin, PhD (https://www.mdpi.com/about/team).
This mega-publisher was initially included on Beall’s list and was subsequently removed on 28th October ‘as a result of a formal appeal made by MDPI and assessed by four members of Mr Beall's Appeals Board’ (https://www.mdpi.com/about/announcements/534). According to Beall (2017), a massive email campaign from MDPI directed at various managerial staff at the University of Colorado had the aim of having the publisher removed from the list. At present, MDPI is not included as a predatory publisher on Beall’s list (https://beallslist.net/), although the list draws attention to possible ethical problems with the publisher. In addition, the Norwegian Register for Scientific Journals, Series and Publishers, jointly operated by the National Board of Scholarly Publishing and the Norwegian Centre for Research Data (NSD), downgraded MDPI to level 0 for several months in 2019 and later upgraded it to level 1 again (https://dbh.nsd.uib.no/publiseringskanaler/KanalForlagInfo.action?id=26778&bibsys=false).
Recently, Copiello (2019)
focussed attention on the analysis of journal self-citations and
publisher self-citations published in the MDPI-journal Sustainability,
revealing a form of post-production misconduct, due to the manipulation of citations, which affected the impact factor of the journal as well as its visibility and its influence. He demonstrated that the self-citations of Sustainability in 2016 and 2017 to articles published in 2015 in no way corresponded to a uniform probability distribution.
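As a rough illustration of this kind of check, and not of Copiello’s actual method, the minimal Python sketch below tests whether hypothetical per-article self-citation counts are compatible with a uniform distribution, using a chi-square goodness-of-fit test.

    # Sketch: chi-square goodness-of-fit test of per-article self-citation
    # counts against a uniform expectation. The counts are hypothetical,
    # not data from Sustainability.
    from scipy.stats import chisquare

    self_citations = [1, 0, 2, 14, 1, 0, 16, 1, 2, 13]  # ten hypothetical articles
    expected = [sum(self_citations) / len(self_citations)] * len(self_citations)

    statistic, p_value = chisquare(f_obs=self_citations, f_exp=expected)
    print(f"chi2 = {statistic:.1f}, p = {p_value:.2g}")
    # A very small p-value indicates that the self-citations are unlikely to be
    # spread uniformly across articles, i.e. they concentrate on a few papers.

Under this hypothetical pattern the test rejects uniformity decisively, which is the qualitative signal that Copiello (2019) reports for Sustainability.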
It
may therefore be appreciated that the reputation of MDPI as a publisher has undergone ups and downs over the past few years and that it has both its critics and its supporters, which makes it an interesting case study. The aim of
this investigation is to provide objective data, in order to verify
whether MDPI-journals indexed in JCR fit the definitions of a predatory
journal that Grudniewicz et al. (2019) and COPE (2019) have established.
See the rest of the paper here: https://academic.oup.com/rev/advance-article/doi/10.1093/reseval/rvab020/6348133