
Friday, August 19, 2022

Highlights of the 2022 Journal Citation Reports

At the end of June, Clarivate released the 2022 Journal Citation Reports (JCR). JCR is an important annual publication in the scholarly communication industry and is eagerly awaited for the scientometric information it provides about academic journals—especially journal impact factors (JIFs). Since JCR is a publisher-neutral resource, its metrics are often heavily relied on when evaluating journals. Researchers, research organizations, and libraries use them to judge the quality of journals, and publishers use them to promote their journals to the academic community.

Here are some of the key features of the 2022 report:

  • Journal coverage: JCR is based on journals listed in the Web of Science Core Collection. Until last year, JCR covered the natural sciences and social sciences; now it also covers the top journals in the arts and humanities. The 2022 report includes 12,828 science journals, 6,691 social science journals, and 3,092 arts and humanities journals.1
  • Citation coverage: The data in the report covered Web of Science citations from 2021, including 2.7 million citable items (original articles, reviews) and 145 million cited references indexed in the Web of Science Core Collection.2
  • JIF calculation: For 2021, the JIF for a journal was calculated as the ratio between the following (see the worked sketch after this list):
    • Numerator: Total number of citations the journal received in 2021 for articles it published in 2019 and 2020
    • Denominator: Total number of citable items published by the journal in 2019 and 2020
  • New entrants: Over 190 journals have received their first JIFs.3
  • Other metrics: JCR provides metrics other than the JIF, which can be used as supplementary or alternative ways to evaluate journals. Two of these are:
    • Journal citation indicator: This metric was introduced last year to account for discipline-wise differences in publication and citation frequencies. It is essentially a discipline-normalized JIF that can offer a more effective way to compare journal quality across fields than the JIF does.
    • Immediacy index: This is the number of citations received by a journal in a specific year divided by the number of articles it published in that year (also illustrated in the sketch after this list). It can be a useful performance indicator for new journals that have been included in JCR but have not yet received JIFs.
  • Inclusion of early access content: Sometimes, journals make a version of record available early, before the article is finally published in an issue. A policy change this year was to include citations to the early access content as well. If an article had an early access date in one calendar year (say, 2019) and a final publication date in another (2020), the early access date was considered.4
  • JIF suppression: If the citation pattern of a journal is observed to be aberrant, its JIF is suppressed for one year (i.e., the JIF is excluded although the journal title itself may still feature in the report). In the 2022 report, the JIFs of three journals were suppressed. To decide whether a JIF should be suppressed, Clarivate examines different types of citation distortion: journal self-citation (far too many articles published in a journal cite other articles published in the same journal) and citation stacking (articles in journal A cite those in journal B far too often, and vice versa).4 From 2023 onward, JCR will consider one more type of distortion when suppressing a JIF: journal self-stacking (the majority of a journal’s citations within the JIF window are to its own articles).
  • Effect of COVID-19: COVID-19 caused a massive global shift in research focus—a shift reflected in JIF trends as well. Last year’s report already showed an unusual increase in the number of journals whose JIF increased within a single year (2019 to 2020).2 This may have been driven by the amount of COVID-related work being published and cited in that period. This year too, the JIFs of some journals in fields related to medicine and public health are 10 times higher than they were before.2 All of the seven journals that crossed a JIF of 100 for the first time published a large number of COVID-related articles.5 COVID-19 is expected to continue influencing citation and JIF patterns in the near future.
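
To make the arithmetic concrete, here is a minimal sketch, in Python, of the two ratio metrics described above. The function names and all of the counts are invented for illustration; real figures come from the Web of Science Core Collection.

    # Minimal sketch of the JIF and immediacy index ratios described above.
    # All names and counts here are hypothetical illustrations.

    def journal_impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
        """JIF for year Y: citations received in Y to items published in
        Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
        return citations_to_prior_two_years / citable_items_prior_two_years

    def immediacy_index(citations_in_year, items_published_in_year):
        """Citations received in year Y to items the journal published in Y,
        divided by the number of items it published in Y."""
        return citations_in_year / items_published_in_year

    # Toy example: 120 citable items in 2019 and 130 in 2020, cited 750 times
    # during 2021; 140 items published in 2021 drew 95 citations that year.
    print(journal_impact_factor(750, 120 + 130))  # 3.0
    print(immediacy_index(95, 140))               # ~0.68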

References

  1. Clarivate. Web of Science Journal Citation Reports 2022 Infographic. Web of Science Group https://clarivate.com/webofsciencegroup/web-of-science-journal-citation-reports-2022-infographic/.
  2. McVeigh, M. Journal Citation Reports 2022: A preview. Clarivate https://clarivate.com/blog/journal-citation-reports-2022-a-preview/ (2022).
  3. Clarivate. First time Journal Citation Reports inclusion list 2022. Web of Science Group https://clarivate.com/webofsciencegroup/first-time-journal-citation-reports-inclusion-list-2022/.
  4. Clarivate. Journal Citation Reports™: Reference Guide. https://clarivate.com/wp-content/uploads/dlm_uploads/2022/06/JCR-2022-Reference-Guide.pdf (2022).
  5. Quaderi, N. Journal Citation Reports 2022: COVID-19 research continues to drive increased citation impact. Clarivate https://clarivate.com/blog/journal-citation-reports-2022-covid-19-research-continues-to-drive-increased-citation-impact/ (2022).

Tuesday, August 17, 2021

Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI)

M. Ángeles Oviedo-García

Research Evaluation, rvab020, https://doi.org/10.1093/reseval/rvab020
Published: 11 August 2021

Abstract

The extent to which predatory journals can harm scientific practice increases as the numbers of such journals expand, in so far as they undermine scientific integrity, quality, and credibility, especially if those journals leak into prestigious databases. Journal Citation Reports (JCR), a reference for the assessment of researchers and for grant-making decisions, is used as a standard whitelist, in so far as the selectivity of a JCR-indexed journal adds a legitimacy of sorts to the articles that the journal publishes. The Multidisciplinary Digital Publishing Institute (MDPI), once included on Beall’s list of potential, possible, or probable predatory scholarly open-access publishers, had 53 journals ranked in the 2018 JCR annual report. These journals are analysed here not only against the formal criteria for the identification of predatory journals but, taking a step further, also with regard to their self-citations and the sources of those self-citations in 2018 and 2019. The results showed that self-citation rates had increased and were very much higher than those of the leading journals in the corresponding JCR categories. Moreover, an increasingly high rate of citations from other MDPI journals was observed. The formal criteria, together with the analysis of the citation patterns of the 53 journals under analysis, all singled them out as predatory journals. Hence, specific recommendations are given to researchers, educational institutions, and prestigious databases, advising them to review their working relations with these sorts of journals.

Introduction

The journal Nature recently published a definition of the predatory journal (Grudniewicz et al. 2019), a milestone that highlights the increasing concern within academia about these pernicious journals, which exploit the gold open-access publication model to the utmost, generating enormous financial gain ‘which appears to be the main criteria for publication’ (Frandsen 2017). Predatory journals, harmful to academia and science, ‘sow confusion, promote shoddy scholarship and waste resources’ (Grudniewicz et al. 2019) and therefore jeopardize integrity in science. Worryingly, both the number of predatory journals and the number of articles that they publish are continuously increasing (Shen and Bjork 2015).

In the gold open-access model, reading the publications is free, and the publication costs, collected through the Article Processing Charge (APC), are borne by the authors, their institutions, and funding bodies. A predatory journal will exploit this model to its own benefit with a nonexistent or practically nonexistent peer-review process (Beall 2015; Frandsen 2017; Demir 2018), which permits the rapid publication of academic papers without due guarantees and carries an associated risk of publishing pseudo-science. At the same time, if scientists are unaware of predatory journals, they will evaluate those publications as if they were legitimate and may naively submit sound papers to predatory journals. At worst, however, authors may submit papers to them intentionally, with the double effect of ‘polluting the scientific records and perversely advancing the careers of researchers’ (Cortegiani et al. 2020).

Selective databases, such as Scopus, PubMed, and Journal Citation Reports (JCR), form an index of journals, a sort of whitelist that is used for the purposes of assessing researchers and taking decisions on grant funding (Cortegiani et al. 2020; Siler 2020). However, some articles from predatory journals are in fact indexed, both in PubMed (Manca et al. 2017a, b)—an alarmingly high number of them in the opinion of Manca et al. (2020)—and in Scopus (Hedding 2019; Cortegiani et al. 2020b). Their newfound legitimacy means that any citations will, in consequence, raise the productivity metrics (e.g. the h-index) of their authors, generating ‘inflated curricula and doped academic careers’ (Cortegiani, Manca and Giarratano 2020a).

This investigation is centred on JCRs, perhaps the most prestigious and best recognized database in academia with the widest use at a global level, in order to analyse the Multidisciplinary Digital Publishing Institute (MDPI). This mega-publisher appeared on Beall’s list and was subsequently excluded. Moreover, the Norwegian Register for Scientific Journals, Series, and Publishers downgraded MDPI to 0 in 2019 and later upgraded it to 1 again. These facts suggest that MDPI has been open to question, a dubious publisher that has been moving within a ‘grey zone’. It is deserving of further analysis that will help us to determine whether it is ‘using a broad range of questionable tactics that are neither illegal nor easy to detect’ (Manca, Cugusi and Deriu 2019).

Against that backdrop, the objective of this study is to analyse the behaviour of 53 MDPI journals that were JCR indexed in 2019, in order to shed light on their qualification and to elucidate whether these journals are in fact predatory. Their characteristics are therefore examined to see whether they match established definitions of predatory journals. No longer merely a medium for dissemination, scientific journals are now a key foundation for appointments and funding in scientific research (Shu et al. 2018). The use of the JCR has been extended to the evaluation of academics and institutions of all types, legitimizing the journals it indexes: because JCR indexing is used as a proxy for both quality and integrity, institutions rely on it when evaluating the publications in scholarly records for decisions on promotion, tenure, grants, etc. This analysis of the practices of MDPI is of relevance to researchers, research institutions, and funding bodies, as well as to the JCR itself, which could see its prestige compromised if it incorporated predatory journals among its indexed journals.

Predatory journals

Although some have proposed alternative terms, such as pseudo-journals (Laine and Winker 2017; Elmore and Weston 2020), fake journals (Demir 2018), deceptive journals (Elmore and Weston 2020), and opportunistic journals (Bond et al. 2019), the term predatory journal is undoubtedly the most widespread in academia and appropriately describes this malpractice (Manca et al. 2020). The librarian Jeffrey Beall, now retired, coined the term while at the University of Colorado to identify journals that, overlooking quality peer-review processes, seek to generate income exclusively through the APCs that authors are expected to pay, targeting those authors with misleading information on citation indexes and spam marketing (Beall 2012; Laine and Winker 2017).

Predatory journals are a global threat to science (Harvey and Weinstein 2017; Grudniewicz et al. 2019; Strong 2019), because they undermine its integrity (Vogel 2017; Abad-García 2019), its quality, and its credibility (Bond et al. 2019). They are, in all, a threat to society as a whole, because whenever the articles that they publish are indexed in selective databases, as is the case with PubMed, ‘the items achieve global exposure and are interpreted by readers, including patients, as trustworthy’ (Manca et al. 2019), without those articles having undergone an acceptable editorial and peer-review process. Cortegiani et al. (2020b) observed that journals discontinued in Scopus (due to publication concerns) continue to be cited even after their discontinuation, which may lend undue support to career development. In addition, publication in a predatory journal implies the squandering of valuable resources: people, animals, and money, as Moher et al. (2017) have reminded us. Lastly, predatory journals are a threat to scientists, who may endanger their careers and devalue their curricula.

The alarming increase in the number of predatory journals (from 1,800 to 8,000 over the period 2010–14) and the exponential growth of the articles that they publish (from 53,000 to 420,000 between 2010 and 2014) (Shen and Bjork 2015) have rendered futile any effort to keep whitelists and blacklists updated. These lists very soon become outdated and incomplete, especially if the resources to keep them updated are scarce. Even so, the identification of predatory journals is still a crucial aspect of maintaining quality and scientific integrity. However, the reality is that this process is by no means simple, as Aromataris and Stern (2020) accurately indicated, particularly because ‘predatory publishers have continued to evolve their undesirable art form into sophisticated operations that appear to be, at face value, legitimate’, to the point where ‘certain journals and publishers may blatantly exploit “gray” strategies given that downmarket niches can be lucrative’ (Siler 2020).

The first attempt at identifying predatory journals was Beall’s list, although it eventually disappeared in January 2017 (a cached copy with a new updated section is maintained anonymously at https://beallslist.net/). Given the immense difficulty of keeping a list of predatory journals updated, the use of one of the many available checklists, such as ‘Think.Check.Submit’ (https://thinkchecksubmit.org/), is encouraged1. Likewise, Cabells’ blacklist and whitelist, now referred to as Predatory Reports and Journalytics (https://blog.cabells.com/2020/06/08/announcement/), listed more than 12,000 predatory journals in October 2019 (https://blog.cabells.com/2019/10/02/the-journal-blacklist-surpasses-the-12000-journals-listed-mark/). Even though it is behind a paywall, it may be an additional resource for identifying predatory journals.

In any case, the first step towards identifying predatory journals is to have a clear definition that allows their definitive identification. The criteria for the identification of a predatory journal and the list of suspicious signs are lengthy: journal names may be very similar to those of prestigious journals; the web page may contain spelling errors, questionable grammatical constructions, and/or low-quality images; the language on the journal webpage may resemble a ‘hard sell’ that targets academic authors; the journal may include articles outside its stated scope or may have a very broad scope; submission may be by email instead of through a manuscript management system; the editor-in-chief might also act as the editor-in-chief of another journal with a widely different scope; there may be a predominance of editorial board members from developing countries; timelines for publication and fast-track peer-review processes might appear unrealistic; APCs can be low; impact-factor metrics may be unknown; spam emails may invite academics to submit papers; despite the open-access approach, transfer of copyright may be required; and, finally, non-professional or non-journal-affiliated contact information may be given for the editorial office (Manca et al. 2018; Committee on Publication Ethics 2019; Gades and Toth 2019; Kisely 2019; Vakil 2019; Elmore and Weston 2020; Kratochvíl et al. 2020).

The problem is that these criteria, above all if taken in isolation, are questionable. For example, the APC of a predatory journal can be higher than 1,000 USD (as happens with OMICS); there is no specific threshold for the number of editorial board members from developing countries that properly distinguishes between legitimate and predatory journals; judging the content of a web page to be dubious is subjective; and titles may inevitably be similar when the journal’s specialism is very narrow (Kratochvíl et al. 2020).

It is therefore essential to define the concept. The Committee on Publication Ethics (COPE) (2019) clarified that predatory publishing ‘generally refers to the systematic for-profit publication of purportedly scholarly content (in journals and articles, monographs, books, or conference proceedings) in a deceptive or fraudulent way and without any regard for quality assurance [… so] these journals exist solely for profit without any commitment to publication ethics or integrity of any kind’.

The COPE definition of predatory journals is no different in essence from the definition of Grudniewicz et al. (2019): ‘predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices’. It should be pointed out that, despite the significant advance that the definition proposed by Grudniewicz et al. (2019) represents for recognizing predatory journals (and not falling prey to them), it nevertheless omits an express reference to the quality of peer review. In spite of its important role in science, peer review was considered too subjective an aspect for inclusion in an objective definition, partly because, like journal quality and deceitfulness, it is impossible to assess (Grudniewicz et al. 2019; Cukier et al. 2020).

It is essential that researchers correctly identify predatory journals, so as to avoid both serious personal setbacks (reputational risk, disqualifying marks for tenure, responsibility for unethical publishing, resources wasted on APCs, loss of legitimate data and research results, and, in medical publishing, even placing patient safety at risk) and scientific consequences (dilution and distortion of evidence in systematic reviews, deterioration of scientific credibility and integrity, doping of academic careers, and loss or return of research funding) (COPE 2019; Gades and Toth 2019; Pearson 2019; Cortegiani et al. 2020; Hayden 2020).

Multidisciplinary Digital Publishing Institute (MDPI)

The MDPI, headquartered in Basel (Switzerland) and formerly known as Molecular Diversity Preservation International (https://www.mdpi.com/about/history), launched its first two journals (Molecules and Mathematical and Computational Applications) in 1996 and operates a gold open-access framework. In 1996, 47 articles were published in two journals; since then, the numbers of articles and journals have progressively increased, undergoing exponential growth in recent years. In 2019, 106,152 articles were published in its 218 journals, an increase of 64.1% over 2018, and 137 of those 218 journals were indexed in the Web of Science (WOS) (in the Science Citation Index Expanded, the Emerging Sources Citation Index, and the Social Sciences Citation Index) (MDPI 2020). Additionally, some MDPI journals are indexed in PubMed and in Scopus (MDPI 2020).

According to the MDPI Annual Report 2019 (MDPI 2020), these 218 journals are supported by 67,207 editors (an increase of 55.78% over 2018), with a median time from submission to publication of 39 days (a 22% decrease from 2018) and APCs ranging from 300 to 2,000 CHF (1 Swiss franc is approximately equal to 0.92 euros), with a median of 1,525 CHF. MDPI’s founder and current president is Shu-Kun Lin, Ph.D. (https://www.mdpi.com/about/team).

This mega-publisher was initially included on Beall’s list and was subsequently excluded on 28th October ‘as a result of a formal appeal made by MDPI and assessed by four members of Mr Beall's Appeals Board’ (https://www.mdpi.com/about/announcements/534). According to Beall (2017), a massive email campaign from MDPI directed at managerial staff at the University of Colorado had the aim of getting the publisher removed from the list. At present, MDPI is not included as a predatory publisher on Beall’s list (https://beallslist.net/), although the list draws attention to possible ethical problems with the publisher. In addition, the Norwegian Register for Scientific Journals, Series and Publishers—jointly operated by the National Board of Scholarly Publishing and the Norwegian Centre for Research Data (NSD)—downgraded MDPI to 0 for several months in 2019 and later upgraded it to 1 again2 (https://dbh.nsd.uib.no/publiseringskanaler/KanalForlagInfo.action?id=26778&bibsys=false).

Recently, Copiello (2019) focussed attention on the journal self-citations and publisher self-citations in the MDPI journal Sustainability, revealing a form of post-production misconduct due to the manipulation of citations, which affected the journal’s impact factor, visibility, and influence. He demonstrated that the self-citations of Sustainability in 2016 and 2017, in relation to articles published in 2015, in no way corresponded to a uniform probability distribution.
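
As a hypothetical illustration of the kind of citation-pattern screening involved here (and in the JIF-suppression checks described in the 2022 JCR post above), the Python sketch below computes the share of a journal’s incoming citations that come from the journal itself and from other journals of the same publisher. The data, journal names, and helper function are all invented; Copiello’s statistical analysis and Clarivate’s screening are considerably more sophisticated.

    # Hypothetical sketch: share of incoming citations from the journal itself
    # (self-citation) and from sibling journals of the same publisher (a crude
    # proxy for citation stacking). All data below are invented.
    from collections import Counter

    def citation_shares(citing_counts, journal, siblings):
        total = sum(citing_counts.values())
        self_rate = citing_counts[journal] / total
        sibling_rate = sum(citing_counts[j] for j in siblings) / total
        return self_rate, sibling_rate

    incoming = Counter({
        "Journal A": 400,                      # self-citations
        "Journal B (same publisher)": 250,
        "Journal C (same publisher)": 150,
        "All other journals": 200,
    })
    self_rate, sibling_rate = citation_shares(
        incoming, "Journal A",
        {"Journal B (same publisher)", "Journal C (same publisher)"})
    print(f"self-citations: {self_rate:.0%}, same-publisher: {sibling_rate:.0%}")
    # self-citations: 40%, same-publisher: 40% -- rates this high, sustained
    # over time, are the sort of pattern that invites scrutiny.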

It may therefore be appreciated that the reputation of MDPI as a publisher has undergone ups and downs over the past few years; it has both its critics and its supporters, which makes it an interesting case study. The aim of this investigation is to provide objective data, in order to verify whether the MDPI journals indexed in JCR fit the definitions of a predatory journal that Grudniewicz et al. (2019) and COPE (2019) have established.

SEE THE REST OF THE PAPER HERE:

https://academic.oup.com/rev/advance-article/doi/10.1093/reseval/rvab020/6348133

Thursday, August 12, 2021

Journal Citation Indicator

Martin Szomszor

Director, Institute for Scientific Information

Clarivate

This is the second in a series of updates to provide information on the launch of the 2021 Journal Citation Reports release.

In a recent blog post we discussed refinements in this year’s forthcoming release of the Journal Citation Reports (JCR)™, describing the addition of new content and hinting at a new metric for measuring the citation impact of a journal’s recent publications.

I’m now pleased to fully introduce the Journal Citation Indicator. By normalizing for different fields of research and their widely varying rates of publication and citation, the Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines.

The Journal Citation Indicator will be calculated for all journals in the Web of Science Core Collection™ – including those that do not have a Journal Impact Factor (JIF)™ – and published in the 2021 JCR in June.


Beyond mere citation counts

Citations serve as an immediate, valid marker of research influence and significance, reflecting the judgments that researchers themselves make when acknowledging important work. Nevertheless, citations must be considered carefully and in context. For validity in assessing the impact of published research, citation analysis must control for such variables as subject field, document type and year of publication.

The new Journal Citation Indicator meets this requirement for journal evaluation, providing a single number that accounts for the specific characteristics of different fields and their publications. Although the calculations behind the Journal Citation Indicator are complex, requiring considerable computing power, the end result is simple: a single value that is easy to interpret and compare, complementing current journal metrics and further supporting responsible use.

In its calculation for a given journal, the Journal Citation Indicator harnesses another Clarivate measure: Category Normalized Citation Impact (CNCI), a metric found in the analytic and benchmarking tool InCites™. The value of the Journal Citation Indicator is the mean CNCI for all articles and reviews published in a journal in the preceding three years. (For example, for the 2020 Journal Citation Indicator value, the years under analysis are 2017, 2018 and 2019.)

As in the CNCI measurement, the Journal Citation Indicator calculation controls for different fields, document types (articles, reviews, etc.) and year of publication. The resulting number represents the relative citation impact of a given paper as the ratio of citations compared to a global baseline. A value of 1.0 represents world average, with values higher than 1.0 denoting higher-than-average citation impact (2.0 being twice the average) and lower than 1.0 indicating less than average.

In essence, the Journal Citation Indicator provides a field-normalized measure of citation impact where a value of 1.0 means that, across the journal, published papers received a number of citations equal to the average citation count in that subject category.
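
As a rough illustration of that definition, the Python sketch below divides each paper’s citation count by a baseline for its field, document type, and publication year, and averages the resulting ratios. The field names, baseline values, and paper counts are all invented; the actual calculation draws on InCites data and is more involved.

    # Rough sketch of the Journal Citation Indicator as described above:
    # each paper's citations are divided by the expected (baseline) count
    # for its field, document type and year, and the journal's value is
    # the mean of those normalized ratios. All figures are hypothetical.
    from statistics import mean

    BASELINE = {
        ("oncology", "article", 2018): 10.0,
        ("oncology", "review", 2018): 20.0,
        ("oncology", "article", 2019): 6.0,
    }

    def cnci(citations, field, doc_type, year):
        return citations / BASELINE[(field, doc_type, year)]

    # Papers the journal published in the three-year window, with citations.
    papers = [
        (12, "oncology", "article", 2018),  # CNCI = 1.2
        (30, "oncology", "review", 2018),   # CNCI = 1.5
        (3, "oncology", "article", 2019),   # CNCI = 0.5
    ]

    jci = mean(cnci(*p) for p in papers)
    print(f"Journal Citation Indicator: {jci:.2f}")  # ~1.07, just above average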

 

Comparing the Journal Citation Indicator and the Journal Impact Factor

The Journal Citation Indicator is designed to complement the JIF – the original and longstanding metric for journal evaluation – and other metrics currently used in the research community. In addition to the use of normalization, there are several key differences between the Journal Citation Indicator and the JIF.

For example, the Journal Citation Indicator’s calculation over three years of publications contrasts with the two-year window employed for the JIF. This three-year calculation enables the Journal Citation Indicator to be as current as possible, while also allowing more time for publications to accrue citations.

Also, the JIF calculation is based on citations made in the current year, while the Journal Citation Indicator counts citations from any time period following publication, up to the end of the current year.

The table below summarizes how the Journal Citation Indicator compares to the JIF in various measurements.

Table 1 – Comparison of Journal Citation Indicator to JIF

Feature | Journal Impact Factor | Journal Citation Indicator
All Web of Science Core Collection journals | N | Y
Field-normalized citation metric | N | Y
Fixed dataset | Y | Y
Counts citations from the entire Core Collection | Y | Y
Counts citations from the current year only | Y | N
Includes Early Access (EA) content from 2020 onward | Y | Y
Includes unlinked citations | Y | N
Fractional counting | N | N

 

Required: Responsible, informed interpretation

Despite the increased uniformity and comparability afforded by the Journal Citation Indicator, as with any metric, its interpretation must be tempered with judgment. Closely adjacent fields – e.g. those in the physical sciences – can be compared fairly readily. On the other hand, comparing journals in physical-science fields with, say, those in the arts and humanities would not be advisable, as publication output, citation dynamics and other elements tend to differ sharply between those areas.

The Journal Citation Indicator will bring citation impact metrics to the full range of journals indexed in the Web of Science Core Collection, increasing the utility of the JCR as it expands its coverage to more than 21,000 scholarly publications. Providing this information for around 7,000 journals in the Emerging Sources Citation Index (ESCI) will increase exposure to journals from all disciplines, helping users understand how they compare with more established sources of scholarly content. By incorporating field normalization into the calculation, the Journal Citation Indicator will also allow users to compare citation impact between disciplines more easily and fairly. When used responsibly, it can support more nuanced research assessment.

The debut of the Journal Citation Indicator represents only the latest development in the long evolution of the JCR – a continuum that has recently seen the addition of Open Access data, Early Access content and more.

What’s more, the evolution continues: watch this space for details on further refinements in the new release that will transform the JCR user experience.

Read the full white paper for a detailed discussion of the Journal Citation Indicator, its calculation and its implications.

Find out more about the Journal Citation Reports here.

Wednesday, July 14, 2021

Material science journal InfoMat receives first impact factor of 25.405

July 13, 2021 - Hoboken, NJ - Wiley, a global leader in research and education, today announced that its open access journal InfoMat received its first impact factor of 25.405. Launched in 2019 in partnership with a Double First-Class university in China, InfoMat addresses the growing scientific interest in new materials and their applications in the rapid development of information technology.

The Journal Citation Reports (JCR), published annually by Clarivate, evaluates more than 20,000 of the world's highest-quality academic journals using a range of indicators, descriptive data and visualizations. The 2021 report, published in July, recognized InfoMat within the top 5% (11/335) of journals indexed in the JCR's materials science, multidisciplinary category.

With a dedicated global team of editors, InfoMat is internationally broad and welcomes innovative interdisciplinary research. InfoMat provides a platform linking fundamental research with industrial applications across academia and industry.

"We're proud to be a partner to the scientific community to deliver this world-class journal, and are glad to see InfoMat recognized on the global stage for its excellence," said Philip Kisray, Wiley SVP & GM, International Education and Country Leader, China. "InfoMat is one of several new OA partnership journals with China's scientific community, and we look forward to seeing more high-impact titles across a range of subjects and interdisciplinary areas in the future."

InfoMat is among Wiley's more than 400 gold open access journals. Open access is a rapidly growing scholarly publishing model that allows peer-reviewed articles to be read and shared immediately, making important research broadly available. In addition to its open access portfolio, Wiley continues to advance its open access strategy through transitional agreements with libraries and consortia. The 14 such agreements Wiley has signed globally serve more than 1,000 institutions - and result in the publication of more than 20,000 open access articles per year.

For more information, please visit InfoMat on WOL or Wiley.com.

###

About Wiley

Wiley (NYSE: JWA) is a global leader in research and education, unlocking human potential by enabling discovery, powering education, and shaping workforces. For over 200 years, Wiley has fueled the world's knowledge ecosystem. Today, our high-impact content, platforms, and services help researchers, learners, institutions, and corporations achieve their goals in an ever-changing world. Visit us at Wiley.com, Like us on Facebook, and follow us on Twitter and LinkedIn.

Media Contact: Geena De Rose, Wiley / gderose@wiley.com

Sunday, June 20, 2021

Discontinued by Scopus: Journal of Critical Reviews (JCR)

Journal Name: Journal of Critical Reviews

Short name: JCR

Subject Area and Category: They seem to take a wide spectrum of papers.

Country: India

Review date: 2021.04.22 Updated: 2021.06.20

SJR Quartile: Scopus/added to SJR but no quartile yet

ISSN: 2394-5125

Publisher: Innovare Academic Sciences Pvt. Ltd

Contact Email:

APC: The APC section is 224 words long, but not once is the amount of the APC fee stated. Why the secrecy?

Editor(s):

Beall Listed: NO

Scopus Discontinued List: YES

Frequency: bimonthly

Template: Very sloppy author guidelines! Did someone get paid to do this webpage? Papers stated to be limited to 18 pages.

Style: Very complex reference system used.

Copyright:

Similarity threshold:

Submission process:

Journal Web Page Comments:

Handbook comments: We suggest another journal.