Monday, December 27, 2021

China leads world in 4 scientific fields

 

http://www.ecns.cn/news/2021-12-28/detail-ihauemxn3937472.shtml

2021-12-28 09:08:19 · China Daily · Editor: Li Yan

China is leading the world in four scientific fields in terms of the total number of citations generated by the country's international academic papers published over the past decade, according to annual reports published on Monday by the Institute of Scientific and Technical Information of China, an organization affiliated with the Ministry of Science and Technology.

These four fields are materials science, chemistry, engineering technology and computer science, with computer science being added to the list this year, the reports said. China also holds second place in 10 other fields, including agricultural science, biology and biochemistry, and environmental and ecological sciences.

The data for the reports was collected from global scientific literature databases including the Science Citation Index, Scopus and Ei Compendex.

Academic citation is one of the key indicators of a paper's quality and influence, and a country's total number of citations in a scientific field is commonly seen as a reflection of its overall research capability in that discipline.

Last year, China published 1,833 research papers and articles in 15 of the world's top scientific publications, including Nature, Science and Cell. That placed China two spots higher in the ranking than in 2019, putting it in second place behind the United States.

Zhao Zhiyun, director of the institute, said these achievements show that China has made noticeable progress in producing an increasing number of high-quality papers with global influence.

"We hope our reports can provide timely insights for science workers, administrators and policymakers to identify the latest trend in China's scientific literature and publication sector," she said.

Experts said that China climbed in the global ranking partly due to Chinese scientists publishing a large quantity of papers regarding the COVID-19 pandemic, as well as co-authoring and publishing more papers with international peers.

For example, Huang Chaolin, president of Wuhan Jinyintan Hospital, published a paper in the journal The Lancet in January last year describing the clinical features of patients infected with the novel coronavirus in Wuhan, Hubei province. It had been cited over 13,000 times as of September, Zhao said.

Last year, Chinese scientists co-wrote around 144,500 scientific papers, 14,400 more than in 2019, with collaborators from 169 countries and regions. China's total number of co-authored papers had increased by 11.1 percent in 2020 compared with the previous year, with the US being China's biggest partner in scientific literature.

China also published 485 megacollaboration scientific papers last year, each involving at least 100 scientists and 50 research institutions, in fields such as particle physics, astrophysics and nuclear physics.

Guo Tiecheng, deputy director of the institute, said that from 2011 to October this year, China published over 3.36 million international scientific papers, and these papers were cited over 43.32 million times within that period. The US remains the world's top producer of scientific papers and has the largest number of citations.

However, China's average number of citations per paper is around 12.8 within the 10-year period, which is lower than the global average of 13.6, Guo said.
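As a quick consistency check on the figures above (the totals are from Guo's statement; the computation is mine, not from the report):

```python
# Recompute citations per paper from the reported totals:
# 3.36 million papers, 43.32 million citations over the same period.
papers = 3_360_000
citations = 43_320_000
avg = citations / papers
print(round(avg, 1))  # 12.9 from the rounded totals
```

The rounded totals give about 12.9; the report's figure of 12.8 presumably reflects the exact, unrounded counts.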

Thursday, December 23, 2021

Beall's List of potentially predatory publishers and journals

 https://www.sciencedirect.com/science/article/pii/S0099133321001750


Abstract

Although there are at least six dimensions of journal quality, Beall's List identifies predatory Open Access journals based almost entirely on their adherence to procedural norms. The journals identified as predatory by one standard may be regarded as legitimate by other standards. This study examines the scholarly impact of the 58 accounting journals on Beall's List, calculating citations per article and estimating CiteScore percentile using Google Scholar data for more than 13,000 articles published from 2015 through 2018. Most Beall's List accounting journals have only modest citation impact, with an average estimated CiteScore in the 11th percentile among Scopus accounting journals. Some have a substantially greater impact, however. Six journals have estimated CiteScores at or above the 25th percentile, and two have scores at or above the 30th percentile. Moreover, there is considerable variation in citation impact among the articles within each journal, and high-impact articles (cited up to several hundred times) have appeared even in some of the Beall's List accounting journals with low citation rates. Further research is needed to determine how well the citing journals are integrated into the disciplinary citation network—whether the citing journals are themselves reputable or not.

Introduction

Beall's List (Beall et al., 2020) is the oldest and best known list of potentially predatory Open Access (OA) journals. It identifies more than 1300 publishers and nearly 1500 additional journals that authors are encouraged to avoid. (If a publisher appears on the list, all its journals are considered suspect.) The List was intended as a means of identifying predatory journals and publishers—those that pose as peer-reviewed outlets but accept nearly all submissions in order to maximize revenue from the article processing charges (APCs) paid by authors, their institutions, and their funding agencies. However, publishers' intentions, predatory or otherwise, are difficult to gauge, and the compilers of the List have relied on a set of subjective criteria that represent departures from the established norms of scholarly communication. For instance, the 54 warning signs identified by Beall (2015) include unusually rapid peer review, failure to identify editors and board members, boastful language, misleading claims about index coverage or citation impact, lack of transparency in editorial operations, absence of a correction/retraction policy, poor copyediting, assignment of copyright to the publisher rather than the author despite the journal's OA status, and the use of spam e-mail to solicit authors or board members.

The use of these criteria is not unreasonable, given the number of OA journals and the difficulty of evaluating scholarly content or peer review practices across a wide range of academic disciplines. Arguably, however, a comprehensive assessment would require the consideration of at least six distinct dimensions of journal quality:

1.

Editors' and publishers' intentions, legitimate or predatory. Intentions are difficult to evaluate, however, since organizations with limited resources and little publishing experience may be earnest in their efforts to provide for rapid publication in a high-growth field, to support a small but distinctive research community, or to address topics of interest to populations whose needs may be overlooked by the larger commercial publishers, scholarly societies, and university presses.

2.

Adherence to established norms of peer review such as anonymous review, use of expert reviewers, absence of bias in reviewer selection, adequate time for review, a reasonable acceptance rate, and reviews intended to improve the paper through revision. This standard refers to the integrity of the process rather than its outcomes.

3.

Adherence to norms of scholarly publishing other than peer review: e.g., transparency, economic sustainability, provisions for long-term preservation of content, reasonable fees, professionalism in presentation and web design, and a web interface that facilitates discovery and access.

4.

Scholarly quality, as assessed by expert evaluators. The evaluators' assessments may account for factors such as clarity of research questions and goals, data quality, appropriateness of methods, statistical rigor, empirical grounding of interpretations and conclusions, extent to which the results can be generalized to other contexts, uniqueness, innovation, and importance within the framework of previous research.

5.

Impact on subsequent scholarship: e.g., number of times cited, outlets in which the journal is cited, rate at which citations accrue, multidisciplinary impact, and extent to which the theories, perspectives, and methods introduced in the journal are incorporated into later work. Although scholarly impact is related to quality, it is also influenced by other factors (Bornmann et al., 2012; Bornmann & Leydesdorff, 2015; Tahamtan et al., 2016).

6.

Impact on teaching and practice, as shown by citations in textbooks, citations in students' papers, inclusion of articles in course syllabi and reading lists, and influence on professional norms and standards.

The criteria used to evaluate journals for Beall's List focus almost exclusively on dimensions 2 and 3, neither of which directly represents the scholarly quality of the papers published in the journals. For instance, sending spam e-mail to potential authors is unproductive and likely to generate a negative reaction (Kozak et al., 2016; Lund & Wang, 2020), but it does not tell us anything about the journal's quality or impact. Moreover, even journals established solely to generate revenue may publish work that is legitimate and innovative. That is, a publication outlet may be useful in functional terms even if the publishers' intentions are predatory.

This study uses Google Scholar (GS) data to evaluate the scholarly impact of the 58 accounting journals on Beall's List as well as a comparison group of 61 presumably non-predatory accounting journals indexed in Scopus. Citations per article and CiteScore percentile are calculated or estimated for each journal based on data for more than 13,000 articles published from 2015 through 2018. The results focus on four research questions:

1.

Is inclusion in Beall's List necessarily associated with low citation impact? Are quality dimensions 2 and 3 (the criteria used by Beall) good indicators of a journal's status with regard to dimension 5, which is widely accepted in other contexts as an indicator of scholarly merit?

2.

Where would each of the Beall's List accounting journals fall within the hierarchy of Scopus accounting journals if they were included in Scopus? Do some have substantially higher citation rates than others?

3.

How extensive is the variation in citation impact among the articles within the Beall's List journals? Are the more highly cited articles concentrated in the more highly cited journals, or can they be found in the less cited journals as well?

4.

Do Google Scholar citation data provide an effective means of gauging the citation impact of journals not included in Web of Science or Scopus? For the journals included in both GS and Scopus, is citations per article closely related to CiteScore?
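The two journal-level metrics the study relies on can be illustrated in a few lines. This is a rough sketch, not the authors' actual computation, and the citation counts below are invented for the example:

```python
# Two journal-level metrics: mean citations per article, and the journal's
# percentile rank within a comparison group of journals.
# All numbers here are hypothetical, not data from the study.

def citations_per_article(article_citations):
    """Mean citations across a journal's articles."""
    return sum(article_citations) / len(article_citations)

def percentile_rank(value, comparison_values):
    """Share of comparison journals scoring at or below `value` (0-100)."""
    at_or_below = sum(1 for v in comparison_values if v <= value)
    return 100 * at_or_below / len(comparison_values)

# Hypothetical Beall's List journal vs. a hypothetical Scopus comparison group.
journal = citations_per_article([0, 1, 0, 3, 2, 0, 6])   # about 1.7
scopus_group = [0.5, 1.0, 1.5, 2.0, 4.0, 8.0, 16.0, 25.0]
print(round(percentile_rank(journal, scopus_group), 1))  # 37.5
```

The study's CiteScore percentiles are estimated from Google Scholar counts for journals outside Scopus; the sketch only shows the shape of that comparison.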

Context and previous research

The methods and results of this study are informed by research on (1) predatory journals and journal lists, (2) the impact and quality of predatory journals, and (3) journal ratings and rankings in accounting and related fields.

Predatory journals and journal lists

Beall's definition of predatory publishers is grounded in the publisher's motives (Beall, 2012; Cobey et al., 2018; Krawczyk & Kulczycki, 2021). Motives are not always easy to discern, however, and some legitimate journals display idiosyncrasies that might lead potential authors to question their legitimacy (Eriksson & Helgesson, 2018; Grudniewicz et al., 2019; Siler, 2020). For instance, Olivarez et al. (2018) discovered that several well-regarded library and information science (LIS) journals display at least some predatory characteristics, and Shamseer et al. (2017) found that the most distinctive characteristics of predatory journals are attributes unrelated to the content of the published articles, such as poor web site design.

Beall's List and its closest equivalents, such as Cabell's Predatory Reports, are valuable tools in the fight against predatory journals. Nonetheless, these lists have been criticized for imprecise evaluation criteria, inconsistent application of those criteria, lack of transparency, infrequent updates, bias against publishers in developing countries, and the absence of systematic mechanisms for the re-evaluation of publishers and journals (Berger & Cirasella, 2015; Chen, 2019; Davis, 2013; Dony et al., 2020; Esposito, 2013; Kakamad et al., 2020; Kscien Organization, 2021). Moreover, nearly all such lists focus on just one or two dimensions of journal quality. Of the 15 predatory journal lists identified by Koerber et al. (2020) and Strinzel et al. (2019), none evaluate journals based on a systematic examination of their scholarly quality or impact. This is perhaps not surprising, since content-based reviews require the detailed evaluation of individual articles.

Impact and quality of predatory journals

Three recent investigations have examined the citation impact of the articles published in predatory journals. The most thorough is that of Björk et al. (2020), who compared the five-year GS citation counts of articles in 250 predatory journals (from Cabell's watchlist) and 1000 presumably non-predatory journals (from Scopus)—one article from each journal. Both sets of journals covered a range of subject areas. The watchlist articles had an average of 2.6 citations, and 56% were uncited after five years. In contrast, the Scopus articles had an average of 18.1 citations, and only 9% were uncited after five years. Investigating further, Björk and associates found that just 10 predatory journals (10 articles) accounted for nearly half the citations, and that at least 4 of the 10 would not be deemed predatory based on their current characteristics. They concluded that although most predatory journals have “little scientific impact,” some do include articles that could have been placed in much better journals if the authors hadn't “opted for the fast track and easier option of a predatory journal” (Björk et al., 2020: 1, 8). Similar findings have been reported by Bagues et al. (2019), who reviewed the papers published by Italian academics in Beall's List journals from 2002 to 2012, and by Nwagwu and Ojemeni (2015), who compiled information on the papers published in 32 predatory biomedical journals based in Nigeria.

The meaning assigned to these citation statistics is not always straightforward, however. Not all citations have the same purpose, and not all reflect well on the cited work. A paper may be cited to highlight an important finding, to honor the groundbreaking work of early investigators, to point out the flaws in a previous study, to draw support from the dominant perspectives of a field or subfield, to critique a poor research design, or simply to acknowledge the existence of a research project or report (Beed & Beed, 1996; Ha et al., 2015; Nisonger, 2004). Likewise, a predatory journal's relatively high citation rate may be interpreted in multiple ways.

The current analysis was undertaken based on the assumption that relatively high citation counts would reflect favorably on the Beall's List journals—that a higher citation rate for a “predatory” journal demonstrates that it contributes to the literature despite the factors working against it. In terms of evaluators' perceptions, the deck is stacked against Open Access journals and against the developing countries where many Beall's List journals operate (Albu et al., 2015; Berger & Cirasella, 2015; Frandsen, 2017; Nwagwu & Ojemeni, 2015; Shen & Björk, 2015; Xia et al., 2015; Yan & Li, 2018). Articles in Beall's List journals are generally excluded from the foremost bibliographic databases, so they are presumably harder to find, and it is reasonable to assume that authors will be biased against citing any journal that has been publicly labeled as predatory. Within this framework, citations to an article in a Beall's List journal indicate that subsequent authors have identified it as a genuine contribution despite the biases that make them inclined not to cite it.

There is an alternative interpretation, however. Several authors have argued that citations to the papers in predatory journals do not necessarily indicate that the articles are legitimate (Akça & Akbulut, 2021; Anderson, 2019; Frandsen, 2017; Nelson & Huffman, 2015). In fact, they regard these citations as potentially harmful—as indicators that the scholarly literature has been polluted with flawed methods and potentially false results. Viewed from this alternative perspective, the more highly cited Beall's List journals have been successful at claiming legitimacy for papers that may be inaccurate, biased, or otherwise misleading.

The actual scholarly quality of the papers in predatory journals is therefore central to understanding the implications of high or low citation rates. However, just two studies have directly addressed this issue. McCutcheon et al. (2016) compared 25 articles published in Beall's List psychology journals with 25 published in Scopus journals of intermediate citation impact. Each article was blinded, then reviewed by the research team on the basis of five criteria. Although the team found more than four times as many statistical and grammatical errors in the Beall's List articles, the score differentials for the other three criteria (literature review, research methods, and overall contribution to science) were not nearly as pronounced. Overall, McCutcheon and associates were struck by the variations in quality among the Beall's List articles. Some were of uniformly low quality while others received high marks in every area. Likewise, a review of 358 articles in Beall's List nursing journals rated 48% of the papers as poor, 48% as average, and only 4% as excellent, and found that many had numerous errors in writing or presentation. Nonetheless, only 5% reported findings “potentially harmful to patients or others” (Oermann et al., 2018: 8). That analysis may have been biased, however, since all the assessors knew in advance that the papers had been published in predatory journals.

Journal ratings and rankings in accounting and related fields

A clear distinction can be made between the quality of a paper and the quality of the journal in which it appears. The evaluation of every paper is not always feasible, however, so journal ratings and rankings remain important to researchers, evaluation committees, universities, policymakers, and funding agencies.

Citation-based rankings of accounting journals can be found within each of the three large, multidisciplinary citation databases: Scopus, Journal Citation Reports (part of Web of Science), and Google Scholar (Walters, 2017). Additional ratings or rankings of accounting journals have appeared in the scholarly literature. These include at least five investigations based on actual behaviors such as publishing, indexing, and citing (Beattie & Goodacre, 2006; Chan et al., 2009; Chan et al., 2012; Chang & McAleer, 2016; Guthrie et al., 2012) as well as several based on surveys that ask respondents about their opinions, choices, or hypothetical behaviors (Australian Business Deans Council, 2019; Bonner et al., 2006; Chartered Association of Business Schools, 2018; Harzing, 2020, June 24; Lamp, 2010). For scholars interested in the relative standing of the journals on Beall's List, all these information sources have a major disadvantage: they seldom include the journals at the lower end of the prestige hierarchy. Of the 58 active accounting journals on Beall's List, just three are included in Scopus and none are included in Journal Citation Reports. No more than three are included in the journal ratings of the Australian Business Deans Council (2019), the Australian Research Council (Lamp, 2010), or the Chartered Association of Business Schools (2018), and none are included in any of the other publications mentioned here.

Fortunately, Google Scholar does cover most of the journals ranked lower in the hierarchy. It therefore allows us to compare the accounting journals on Beall's List with the presumably non-predatory accounting journals indexed by Scopus. Although GS has been criticized for bibliographic errors that have limited its effectiveness as a citation database (Bar-Ilan, 2006; Bauer & Bakkalbasi, 2005; Jacsó, 2005a; Jacsó, 2005b; Jacsó, 2006), these errors have become less common over time (Doğan et al., 2016). Recent studies have shown that Google Scholar's coverage of the scholarly literature is comprehensive and unbiased (Chen, 2010; Delgado López-Cózar et al., 2019; Harzing, 2013; Harzing, 2014; Martín-Martín et al., 2017; Walters, 2007), and that its citation counts are consistent with those obtained from Scopus and Web of Science (Harzing, 2013; Harzing & Alakangas, 2016; Harzing & van der Wal, 2009; Martín-Martín et al., 2018; Prins et al., 2016). Within the field of accounting, Rosenstreich and Wooliscroft (2009) and Solomon and Eddy (2019) recommend GS for citation analysis due to its comprehensive coverage.

READ THE REST HERE


Wednesday, December 22, 2021

Beware "Design Engineering" - many papers being sent to the fake/hijacked site

According to the SCImago Journal & Country Rank (SJR) entry for this journal (ISSN 0011-9342; SJR-listed publisher: Rogers Media Publishing),
this is the journal's correct URL: https://www.design-engineering.com/digital-edition/

I also suggest authors read the many comments about the problems with this journal at the SJR portal site for DE at:

The following URL is to the fake/hijacked site. Please note that this site lists no ISSN, which is technically impossible for a Scopus-indexed journal, as such journals must display their ISSN on their home page: http://www.thedesignengineering.com/index.php/DE

On the fake site, I also note there are 19 Thai papers as of Dec. 21, 2021.
These authors have been scammed if their intent was to publish their papers in a Scopus/SJR-indexed journal.

Also, in the Scopus source list, there are three listings for journals which have the name "Design Engineering". Two of these are now listed as "inactive".







This email is from the hijacked journal site. This site is not Scopus indexed. 

researchpublication66@gmail.com


Dear Researchers,

 

We feel happy to invite you for our upcoming special issue in Journal of Design Engineering (Toronto) Issues Scopus Index- Q4, Ei Compendex.

 

Design is a huge and multi-faceted activity in Canada, covering virtually every industry and process. As Canada’s leading voice for design and in-plant engineers our mission is to provide an effective forum for the exchange of news, new ideas, product information and real-world applications. 


Design Engineering offers a complete and balanced approach to editorial features and new product information. Our two abiding principles are editorial integrity and publishing excellence. Our readers demand and deserve nothing less. Our readers also demand value-based resources that meet their industry specific needs.
 

Papers will be published in 20 days and the publication fee is 400 euros.

  

Best Regards,

Team Design Engineering (Toronto)

 

Journal website: http://www.thedesignengineering.com/index.php/DE

Scopus ID: https://www.scopus.com/sourceid/28687












 

Tuesday, December 21, 2021

Revealed: The inner workings of a paper mill

 

In 2019, Retraction Watch ran an exclusive story about a Russian paper mill operating under the business name “International Publisher LLC”. Since then, Retraction Watch and other scientific news and blogging sites have continued to report on the activities of research paper mills, including International Publisher and its primary website, 123mi.ru. These mills provide an array of fraudulent services to researchers and academics seeking to publish articles in peer-reviewed journals, including ghostwriting, brokering authorship positions on papers accepted for publication, and falsifying data.

Our project augments this stream of reports about paper mills by focusing on the activities of International Publisher and the papers brokered through 123mi.ru. As part of this project we are curating a database of all the papers and authorship positions that have been advertised on this website. Our database consists of 2,353 unique article titles with 8,928 authorship positions. While the majority of known paper mill activity has been in the biomedical sciences, our work on just this one paper mill demonstrates that paper mill products have infiltrated multiple scientific disciplines in which career advancement is heavily reliant on academic publications.

So far, we have identified nearly 200 published articles that may have been brokered through this paper mill and which cross disciplines including (but not limited to) humanities, social sciences, nursing, and education.  We also observe numerous papers on COVID-19 that have been or currently are advertised for sale.  

Our project is far from complete, but we thought it important to report on our methods and preliminary findings via Retraction Watch.  In doing so, we hope to raise awareness of a serious and potentially widespread problem, along with strategies to help detect and possibly prevent fraudulent activities.  

Overview of International Publisher LLC

International Publisher LLC has a very well-established online presence.  Their main site is http://123mi.ru, with several other mirror sites that use the International Publisher LLC branding, like some type of franchise with outposts in different countries.  

In 2019, Clarivate’s Web of Science Group examined the site and found 344 articles were available for sale.  Clarivate sent a cease-and-desist letter to International Publisher.  Retraction Watch followed up, asking for a comment.  A person who identifies as Ksenia Badziun responded: 

Regarding the quantity of the manuscripts we have published, I want to confirm that it grows every time. It could be a pleasure for me to show all the list of the manuscripts we published, but due to the policy of our company and contracts between the authors/publishers and our company, I simply cannot do it. On the other hand, I want to inform that we have our own system and program with all the records and story on each particular manuscript sent to us from the authors. The access to our program is provided to some of the editors-in-chief, publishers and other our Partners.

Ksenia Badziun has a LinkedIn page that identifies her as the Chief Editor of International Publisher.    

We did not discover this paper mill.  A simple Google search of their website (http://123mi.ru) provides dozens of links to articles and blog posts describing this paper mill.  Using a browser plug-in to translate the text from Russian to English, you can easily review hundreds of papers and authorship positions that are currently available for sale.   

[Screenshots of 123mi.ru, translated from Russian to English]

As papers are sold, their titles are taken offline, and new titles quickly appear. These titles likely come from a range of sources, including direct solicitation of established researchers. For example, an ethics presentation by Ronald Gilman, PhD (an experimental nuclear physicist at Rutgers University), displayed an email inquiry from Badziun, who asked Professor Gilman:

I suppose, that you have some manuscripts that were sent to the journals but not accepted yet.  In this case, you might be interested to add a co-author to such a manuscript to get some profit.

Professor Gilman provided us with an original copy of the email, which confirmed that International Publisher is actively involved in brokering authorship positions.   

Curating data on fraudulent activity

One of the challenges of investigating this paper mill is that, as the authorship positions for a given paper are sold, the paper is removed from the website. Our interest is in identifying the thousands of articles that were previously made available for sale and may now be in the scientific literature, but the website shows only the paper titles currently available for sale. We discovered, however, that the contracts for every authorship position ever advertised on 123mi.ru remain on the site's web server, even though no links make them directly accessible through the website. We therefore scraped the data using a Python script. This process involved crawling through thousands of links and saving copies of the contracts, which are in HTML format. We parsed the HTML to extract the contract details, which were saved in a format that can be easily analyzed. Using these procedures, we obtained contracts for 8,928 authorship positions associated with 2,353 unique papers. These contracts contained the paper titles, target indexing (Scopus or Web of Science), cost of the authorship position, and target publication date. They did not include any additional information, such as availability dates or names.
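A minimal sketch of the parse step might look like the following. The field labels, markup, and values here are hypothetical; the real contract pages are in Russian and structured differently.

```python
# Sketch of parsing one scraped contract page (hypothetical markup).
import re

def parse_contract(html: str) -> dict:
    """Pull labelled fields out of one contract page (simplified).

    Assumes each field is rendered as '<b>Label:</b> value', which is an
    illustrative assumption, not the site's actual markup.
    """
    fields = {}
    for label, value in re.findall(r"<b>([^<]+):</b>\s*([^<]+)", html):
        fields[label.strip().lower()] = value.strip()
    return fields

sample = """
<html><body>
<b>Title:</b> Hypothetical article title
<b>Indexing:</b> Scopus
<b>Price:</b> 43000 RUB
<b>Publication date:</b> 2021-06-01
</body></html>
"""
contract = parse_contract(sample)
print(contract["title"], "|", contract["indexing"])
```

The extracted dictionaries can then be written out as CSV or JSON for analysis, which is the "easily analyzed" format the authors describe.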

Among these paper titles, 961 articles are currently available for sale, suggesting the remaining 1,392 papers may be in the scientific literature or, alternatively, could not be sold.  We sought to locate published papers which we could link back to a sold paper with a reasonable degree of certainty.  We did this by using the paper titles as initial search terms and seeking matches—both exact and close matches, since titles are sometimes edited during the review process and whatever was advertised could have changed before going into print. An obvious problem with this search strategy is coincidental matches; some areas of study are highly nuanced, so titles with a close or even exact match that were not produced by the paper mill are entirely plausible. To solve this problem, we used matching titles as only one of many different indicators in our search for fraudulent papers.  
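The exact-and-close matching step can be approximated with standard-library string similarity. The threshold of 0.85 and the titles below are illustrative assumptions, not the authors' actual parameters:

```python
# Approximate title matching: flag near-identical titles as candidate matches,
# tolerating the small edits journals make during review.
from difflib import SequenceMatcher

def title_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two titles (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical advertised vs. published titles.
advertised = "Digital technologies in vocational education"
published = "Digital Technologies in Vocational Education Systems"

score = title_similarity(advertised, published)
print(score >= 0.85)  # True: treat as a candidate match for further checks
```

As the text notes, a high similarity score is only one indicator; coincidental matches make further checks (authors, affiliations, temporal precedence) essential.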

Once we had a tentative list of published articles, we investigated whether the authors had a background in the paper’s topic area and/or had a history of working together. We also considered whether the collaborations were multi-institutional and/or multi-national and whether the corresponding authors used non-institutional email addresses (e.g., Gmail, Hotmail, Yahoo, QQ).  Non-institutional emails allow the paper mill to impersonate an author to manage submissions and communicate directly with anybody who raises questions.  We also looked at the outlet for publication. Articles published in open-access journals, especially those with high volumes of output, were considered as potential targets of fraud.  In sum, we used professional judgment—particularly what we know from our own experiences in academic publishing as well as our knowledge of paper mill publications—to flag potentially fraudulent papers. However, we also required the presence of one additional strict criterion in our flagging procedures: temporal precedence.

Temporal precedence

To meet a high evidentiary standard of potential fraud, we must establish that the paper in question was made available for purchase before the article was published—that is, we need evidence of temporal precedence. Otherwise, somebody could simply copy the title of a published article and offer it for sale. This seems to be the case with Teziran, another research paper mill reported by Retraction Watch that appears to be reselling published articles. We wanted to ensure our claims are based not only on the indicators described above, but also on temporal precedence. Since the contracts we scraped did not have a timestamp indicating when each paper title was made available for purchase, we relied on the Internet Archive to establish temporal precedence.

The Internet Archive is a nonprofit digital library that takes historical snapshots of the World Wide Web, which are publicly available through its Wayback Machine service. Over 100 snapshots of 123mi.ru are currently available on the Wayback Machine, allowing most of the site's online activity to be reconstructed from 2016 to the present. The records indicate that 123mi.ru did not start listing paper titles until late 2018, when it introduced a new site layout with a menu of options for buying and selling papers.

The Wayback Machine is a key data source for a couple of reasons.  For temporal precedence, we can compare the article’s publication date (and the submission date, if available) against the date the paper was first offered for sale. For example, if a paper was made available for purchase on 123mi.ru before the article first appeared online in a journal, we have evidence of temporal precedence. Additionally, each Wayback Machine snapshot can be scraped in the same manner as the 123mi.ru web server itself.  That means that even if the contracts are removed from the web server, most of the data can be independently reconstructed from the Wayback Machine.  Unfortunately, because the Wayback Machine does not take snapshots at fixed intervals, some data may not be recoverable.
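The core of this check is a simple date comparison. The sketch below illustrates the logic using the 14-digit YYYYMMDDhhmmss timestamp format that appears in Wayback Machine snapshot URLs (snapshot timestamps can also be listed programmatically via the Wayback Machine’s public CDX API). This is a minimal illustration of the reasoning, not the actual code used in our investigation:

```python
from datetime import date

def wayback_date(timestamp: str) -> date:
    """Parse a Wayback Machine timestamp (YYYYMMDDhhmmss) into a date."""
    return date(int(timestamp[:4]), int(timestamp[4:6]), int(timestamp[6:8]))

def predates_publication(snapshot_ts: str, published: date) -> bool:
    """True if the snapshot advertising the title predates the article's
    first appearance at the journal, establishing temporal precedence."""
    return wayback_date(snapshot_ts) < published

# Example: a title advertised in an April 29, 2020 snapshot, for an
# article first received by a journal on Nov. 7, 2020.
print(predates_publication("20200429103000", date(2020, 11, 7)))  # True
```

In practice the comparison would use the earliest snapshot in which the title appears, since a later snapshot only bounds the sale date from above.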

Preliminary findings

So far, we have flagged and are actively investigating nearly 200 published papers that may have been sold by International Publisher.  We plan to release our database of published paper titles that were likely obtained from International Publisher, but we will not make it publicly available until all individual authors have had the opportunity to comment on the evidence we have compiled.  Instead, we provide two case examples to highlight the nature of the potentially fraudulent papers we have identified. The authors and publishers referenced in these examples were given an opportunity to comment.

Case example 1: International Journal of Emerging Technologies in Learning

As we scanned the literature for potential matches, we found a posting on freelancehunt, a Russian freelancing site, seeking a writer for one of the titles we scraped from the 123mi.ru website.  The posting indicated that the “authors of the article” were from Beihang University and Russian State Vocational Pedagogical University.

[Screenshot of the freelancehunt posting]

This discovery is significant.  First, we found an article published in the International Journal of Emerging Technologies in Learning (iJET) with a near-identical title.  Two of the authors were affiliated with the universities noted in the freelancehunt advertisement.  We also established temporal precedence with the Wayback Machine. We therefore have multiple indicators of potentially fraudulent publishing.  Moreover, the freelance post reveals that International Publisher not only brokers authorship positions but also provides ghostwriting services.

We then found more published articles in iJET, which led us to do a more thorough review of this journal.  We identified a total of 29 papers published by iJET within the last two years that have exact or close matches with titles scraped from 123mi.ru.  As we were establishing temporal precedence with the Wayback Machine, we discovered a highly unusual note from a June 29, 2020 snapshot.  The translated version of the advertisement is shown below:

[Screenshot of the translated advertisement]

This advertisement indicates that the article was scheduled for publication as part of a collection of 10 papers.  Because the advertisement includes the IDs of the other paper titles (#1081 – #1090), we were able to easily link them to our collection of contracts obtained from the 123mi.ru web server.  Nine of the 10 articles listed in this advertisement were published in iJET, Volume 16(2), 2021.  As this journal publishes two issues every month, the release of this issue coincides with the advertisement.  Moreover, among the nine published articles, eight had at least one author affiliated with a specific medical institution.  Taken together, these findings raised many questions.  Our next steps involved contacting the authors of the 29 papers we flagged in iJET, along with the editors.

iJET:  Author responses

We received responses from three of the 29 corresponding authors we contacted, each offering different comments and explanations of the evidence.  Because we are unable to verify the identity of the people with whom we were communicating, we are not releasing their names, but we have provided the editors of Retraction Watch with copies of our correspondence for verification.

One author insisted that the paper is, in fact, his own work, that he has no knowledge of the matter, and that the match is likely a coincidence.  After this author was provided with our evidence of a second article, he remarked:

I am very surprised, why the title of my article is the same as this website. It may be because before I submit the article, I will check the repetition rate of this article on the duplicate check website. When I checked the article, the hacker stole my article, so our article title will be the same.  I think this is very possible.

We then discovered a third potentially fraudulent article in an entirely different field with different co-authors.  This person then blocked our emails, preventing us from obtaining any further comments.  The response from another author was longer and described the difficulties of publishing research in an English-language journal and the difficult financial circumstances he is facing.  However, this person vehemently denied any involvement with 123mi.ru:

I really know nothing about the website you said. My first perception is that there may be colleagues or competitors who slander me or want to amuse me. In order to get a promotion faster than me, some competitors will do anything (such as whether they will steal my paper U disk, publish it on the web page and set up a game to frame me?) we can’t know that the hearts of competitors are dangerous. 

The third respondent also denied any involvement with the website:

It seems to me that your newsletter is a fraud and, possibly, an attempt to steal personal data, which I will report to the advising services.

This person also directed us to their research profile to establish that the publication is legitimate.  We followed up with the historical snapshot and a link to the Internet Archive showing that a paper with a near-identical title was made available for sale before the publication appeared online.  We did not receive a response.

iJET:  Editor’s response

On Aug. 3, 2021, we provided the editor-in-chief of iJET, Dominik May, PhD, preliminary evidence of 19 potentially fraudulent publications that were published in the journal.  Dr. May responded on Aug. 5, 2021, and cc’d the executive editor, Michael Auer, PhD, on the communication.  Drs. Auer and May are the president and vice president, respectively, of the International Association of Online Engineering, which manages seven open-access journals, including iJET.    

Dr. May promptly replied, indicating, “we did not know about such fraudulent activities and are pretty much shocked about it.”  We exchanged numerous emails with Dr. May over the next two months discussing additional evidence that emerged.  On Sept. 27, 2021, Dr. May provided the following official comment:  

For over 15 years, the interdisciplinary International Journal of Emerging Technologies in Learning (iJET) aims to focus on the exchange of relevant trends and research results as well as the presentation of practical experiences gained while developing and testing elements of technology-enhanced learning. So it aims to bridge the gap between pure academic research journals and more practical publications. In this field, the journal’s integrity is our focal concern to ensure that both authors and readers can trust and learn from the work published in our journal. When we were contacted by Dr. Perron’s team and learned about the working results concerning a professional paper mill undermining good scientific practice, we were at the same time astonished about the fraudulent procedure and thankful for getting to know about it. The effort and dedication to detail Dr. Perron and his team put into the investigation is invaluable for us as a journal and goes beyond the realms of possibilities in terms of background checks we have as editors. However, each of the manuscripts published in our journal undergoes a double-blind peer-review process and goes through a process of three editorial checks. The fact that it was still possible to place potentially fraudulent papers in the journal, deeply upsets us. We as the journal leadership can only condemn such procedures and dissociate ourselves from any person or entity, which is part of it! However, the revelations have made us rethink our internal process substantially. In reaction to the unveiled procedures, we immediately informed our editorial board members and shared a list with potential hints to identify further fraudulent manuscripts. Furthermore, we checked every submission in the current review or editorial process and rejected manuscripts matching these hints. 
To prevent any further critical submissions and address fraud, the additional steps we are taking include but are not limited to run internal workshops with editors and reviewers to detect fraudulent submissions, develop a blacklist of individuals and institutions that are clearly connected to the paper mill, and put a particular focus on checking authors groups’ credentials. Following these steps, we are very positive to detect further fraudulent submissions and ensure our journal’s integrity.

Case example 2:  Energies

Our second example involves a paper title advertised as “Financial Audit Methodology of Energy Corporations.”  After performing a keyword search, we flagged an article published in the MDPI journal Energies. The title was a close, but not exact, match: “Impact of Non-Financial Factors on the Effectiveness of Audits in Energy Companies.” We observed a few indicators of potential fraud, such as publication in an open-access journal and no history of collaboration between the lead author and co-authors located in other countries. However, the strongest evidence came from the Wayback Machine, which shows that this article was advertised on 123mi.ru as early as April 29, 2020, with an abstract that closely matches that of the article first received by Energies on Nov. 7, 2020.  We reached out to all the authors of this paper on Sept. 20, 2021, asking for an official comment.  To date, we have not received a reply.

[Screenshot of the 123mi.ru advertisement]

In addition to this article, we identified five more MDPI articles that have potential links to International Publisher.  The collection of evidence for these papers was submitted to MDPI on Aug. 27, 2021, for review and comment. 

Damaris Critchlow, the head of publication ethics at MDPI, acknowledged receipt and arranged an online meeting on Sept. 21, 2021, with a member of our research team (BEP). In this meeting, Critchlow acknowledged that MDPI has been working actively to address fraudulent activity by paper mills, specifically in the biomedical sciences, and said she was “surprised and disturbed” to see fraudulent activity in other areas of science.  She stated that looking at a single paper in isolation makes fraudulent activity difficult to detect, but that when looking across papers, the indicators of fraud are more apparent.

Regarding the articles we submitted to MDPI, Critchlow agreed that they have indicators of potential fraud and that the collaborations among the authors were peculiar.  She stated that MDPI is actively investigating the cases submitted by our team and exploring ways to address paper mill activity more broadly.

Lessons learned

Our process of curating data from 123mi.ru and searching for fraudulent paper titles is described here linearly. In practice, our searches were highly iterative and relied on a variety of strategies.  We uncovered unexpected types of evidence of fraud that served as further indicators for locating other potentially fraudulent papers.  Some of the data we extracted from the contracts also included nuggets of information that helped us understand the business of scientific misconduct, which is essential for detection and prevention.
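One such strategy can be sketched in code: fuzzy title matching, which compares normalized scraped titles against published titles and flags pairs above a similarity threshold (this is how close-but-not-exact matches like the Energies example surface). The sketch below uses Python’s standard library; it is an illustration, not the actual matching code used in the investigation, and the 0.85 threshold is an assumed value.

```python
import re
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    """Lowercase and strip punctuation so trivial differences don't count."""
    return re.sub(r"[^a-z0-9 ]+", "", title.lower()).strip()

def title_similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized titles."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def flag_matches(scraped, published, threshold=0.85):
    """Return (scraped_title, published_title, score) pairs worth manual
    review.  The 0.85 threshold is an assumption for illustration."""
    return [(s, p, round(title_similarity(s, p), 2))
            for s in scraped for p in published
            if title_similarity(s, p) >= threshold]
```

Any flagged pair is only a lead: it still requires the indicator checks and temporal-precedence evidence described above before a paper is treated as potentially fraudulent.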

Our evidence reveals a diverse set of strategies that International Publisher uses to promote fraudulent publications, including late-game authorships, ghostwriting services, and fake peer reviews.  We also observed dozens of advertisements for articles sold in association with “special issues,” a potential mechanism of academic fraud that has already reared its head and deserves further investigation.  Our claims are based on a collection of indicators derived from the paper titles obtained from the 123mi.ru web server, in addition to a review of the “services” brazenly described on that website.  We want to emphasize that at this time we have only indirect and circumstantial evidence of potential fraud.  Because we regard scientific knowledge as a public good, we think raising awareness of these instances is critical to ensuring scientific integrity.

Based on our findings thus far, we think the majority of paper mill publications are primarily intended to satisfy a career requirement rather than making a scientific splash. Many of the publications we have flagged as potentially fraudulent are published in open-access journals that have received the predatory journal moniker in the scientific blogosphere. However, some scholars have very specific institutional requirements for promotion, such as publishing with journals included in the Science Citation Index or the Social Science Citation Index.  

Most of the potential fraud in our investigation involves authors at Chinese and Russian institutions. These institutions typically impose onerous requirements to publish in English-language journals for obtaining degrees or promotions, although the Chinese government has recently banned such practices.  Many of these scholars do not have the requisite research preparation, financial resources, time, or English-language skills to meet their institution’s publishing demands.  Thus, these scholars may rely on paper mills as a career survival strategy, or what they consider a reasonable response to an unreasonable requirement.  From this perspective, we consider these institutions culpable in promoting these fraudulent practices.

Understanding paper mill fraud is an important step toward detection and prevention.  Unfortunately, academic publishing is highly decentralized, and efforts to uncover fraud require a significant investment of time, technical skills, and content expertise.  Thus, the most effective responses to this problem are likely to come from collaborative activities and efforts within a scientific discipline.  Journal editors and reviewers are perhaps the first line of defense against paper mill fraud.  Editors should take an active role in educating reviewers on the problem of paper mills while establishing clear and visible policies on the journal’s website, including the following:  

  • Changes in authorship after submission must receive the editor’s approval and be made only under exceptional circumstances; 
  • Corresponding authors who indicate an institutional affiliation must be verified by the author or supervisor using an email address from that institution; 
  • All papers must be reviewed and approved by at least one reviewer who has a formal affiliation with the journal’s reviewer board or pool; 
  • Authors must include a statement of their contributions, which is published with the final article; and
  • Each author must sign a statement that their submission has met all ethical standards of the journal.

Since we commenced our investigation, the ostensibly complete collection of contracts has been removed from the 123mi.ru web server.  The only contracts now accessible from the web server are for articles currently available for sale.  However, the complete collection of data has been shared with Retraction Watch for fact-checking.  We are currently reviewing the historical snapshots on the Internet Archive to obtain persistent links to these data.  We will continue investigating potentially fraudulent publications and anticipate releasing the curated data from the 123mi.ru web server and our computer code in an open-access format.

Brian E. Perron, PhD, is a professor of social work at the University of Michigan. Oliver T. Hiltz-Perron is a student at Community High School in Ann Arbor, Michigan. Bryan G. Victor, PhD, is an assistant professor of social work at Wayne State University.