News Blog
Is MDPI Predatory?
MDPI, a publisher of open-access scientific journals, was spun off from the Molecular Diversity Preservation International organization. It was formally registered by Shu-Kun Lin and Dietrich Rordorf in May 2010, with its official headquarters in Basel, Switzerland. Including Switzerland, MDPI has editorial offices in 11 countries: five offices in China, two each in Romania and Serbia, and offices in the United Kingdom, Canada, Spain, Poland, Japan, Thailand, and Singapore. As of April 2023, MDPI publishes 426 journals.
The number of published papers has grown rapidly over the last decade, with year-over-year growth of over 50% in 2017, 2018, and 2019; 106,200 papers were published in 2019, and MDPI reported publishing 303,200 papers in 2022. As of 2020, MDPI was the largest publisher of open-access papers in the world and the fifth-largest publisher overall in terms of journal paper output.
See the list of all MDPI predatory journals
Controversies
MDPI was included on Jeffrey Beall’s list of predatory open access publishing companies in February 2014. Beall’s concern was that “MDPI’s warehouse journals contain hundreds of lightly-reviewed articles that are mainly written and published for promotion and tenure purposes rather than to communicate science.” Beall also claimed that MDPI used email spam to solicit manuscripts and that the company listed researchers, including Nobel laureates, on their editorial boards without their knowledge.
In August 2018, 10 senior editors (including the editor-in-chief) of the journal Nutrients resigned, alleging that MDPI forced the replacement of the editor-in-chief because of his high editorial standards and for resisting pressure to “accept manuscripts of mediocre quality and importance.”
In December 2020, the Chinese Academy of Sciences published a list of journals that may suffer from issues of scientific quality and other risk characteristics. Of the 65 journals on its initial list, 22 were MDPI journals. MDPI responded by promising to communicate with the academy and to improve its journals' parameters so that the affected journals would be removed from the list as soon as possible. The list was updated in December 2021 and reduced to 41 journals, of which seven were MDPI journals.
In 2021, the executive committee of the Norwegian National Publication Committee announced the creation of a new "level X" for possibly predatory journals and publishers, and linked the creation of the new level specifically to the many expressions of concern regarding MDPI. Currently, two MDPI journals have a rating of "level X" and ten have a rating of "non-academic" (level 0).
MDPI's Exponential Growth
A predatory publisher is one that will publish almost anything... usually in return for money. MDPI's steadily rising acceptance rates make this argument hard to deny.
MDPI uses some of the same techniques as predatory journals. So the question is simple: if you are a scientist, should you work with MDPI?
MDPI has shown impressive growth in recent years, going from 218 journals in 2019 to 426 in 2023, and from publishing 36,000 articles in 2017 to 303,000 in 2022. MDPI follows the APC publishing model, whereby an Article Processing Charge (APC) must be paid for accepted articles before they are published. The APC has increased over time at MDPI and can exceed 2,000 CHF; MDPI reports that the average APC per article in 2020 was 1,180 CHF. Revenue increased from 14 million CHF in 2015 to 191 million CHF in 2020.
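A rough back-of-the-envelope check (a sketch based only on the figures quoted above; real accounting also involves waivers, discounts, and other income streams) shows how the reported revenue and average APC relate to article volume:

```python
# Back-of-the-envelope: how many charged articles would the reported 2020
# revenue imply at the reported average APC? Figures are taken from the text.
revenue_2020_chf = 191_000_000   # reported revenue in 2020
average_apc_chf = 1_180          # reported average APC per article in 2020

implied_articles = revenue_2020_chf / average_apc_chf
print(f"Implied number of charged articles in 2020: {implied_articles:,.0f}")
# ~162,000 articles, consistent with the growth from 36,000 (2017)
# towards 303,000 (2022) described above.
```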
Traditional journals have a fixed number of issues per year and a low to very low number of special issues, which can cover a particular topic, celebrate the work of an important scholar, or collect papers from conferences. MDPI journals nominally follow the same model, except that the number of special issues has grown over time to the point of dwarfing the number of normal issues. Moreover, special issues are usually proposed by a group of scientists to the editors of the journal, who accept or reject the offer. At MDPI, it is the publisher who sends out invitations for Special Issues, and it is unclear what role, if any, the editorial board of the normal issues plays in the process. MDPI even asked Jeffrey Beall, the author of Beall's list of predatory publishers, to edit a Special Issue in a field that is not his own.
The figure below shows the growth in articles for 74 journals with an Impact Factor (IF) published by MDPI, dividing them between articles published in normal issues, special issues, collections and sections.
Special Issues already accounted for the majority of papers in 2017, but grew rapidly from then on. While the number of normal-issue articles increased 2.6 times between 2016 and 2020, the number of Special Issue articles increased 7.5 times. At the same time, the number of articles in Sections increased 9.6 times, while Collections increased 1.4 times. Articles in Special Issues now account for 68.3% of all articles published in these 74 journals with an IF.
Across these 74 journals, there were 388 Special Issues in 2013, about five per journal. In 2020, there were 6,756 Special Issues, almost a hundred per journal! Special Issues increased at all journals, in some cases exponentially. Some have unbelievably high numbers: in March 2021, Sustainability had 3,303 open Special Issues (compared to 24 normal issues). That is 9 Special Issues per day, just for Sustainability. In 2021, 32 MDPI journals had more than one open Special Issue per day, Saturdays and Sundays included. The figure below shows the growth in the number of Special Issues.
Fast Publication Time
Traditional publishers can be extremely sluggish in their turnaround. Scientists share horror stories about papers stuck in review for years. The situation is particularly bad in some fields, but it is generally less than optimal. MDPI prides itself on its very fast turnaround times. In its Annual Report 2022, MDPI reports a median time to first decision of 17 days (in 2020 it was 20 days), and a median time from submission to publication of 38 days! This is extremely fast, with a tendency to decrease further. After submission, a paper must be assigned to an editor; this editor must find an associate editor (or not), and then find referees. It is hard to find the right people to review a paper, and they might not be available. Once the referees have been found and have accepted, they need time to write their reports. Then the editor must read the reports and make a decision. 17 days is really fast!
Searching the MDPI website, it was possible to collect aggregated data on the submission date, the date when the revised version was received, and the acceptance date. The aggregate results are shown in the figure below. Three main takeaways:
There is not much difference between normal issues, special issues, sections, and collections.
MDPI managed to halve the turnaround times from 2016 to 2020.
The variance across journals has gone down at the same time as the mean.
Below is the raincloud plot of the overall distribution (cut at 150 days for the sake of visualization; this leaves out about 3% of the papers in 2016 but, as a further indication of shrinking turnaround times, only 0.3% of papers in 2020). On the left, each point is a paper. On the right, you see the kernel density estimation. There is heterogeneity, but it is rather low, and it is being dramatically reduced. The rather flat distribution of 2016 has been replaced by a very concentrated distribution in 2020. The distributions for normal and Special Issues are similar, with somewhat more variance for Special Issue articles.
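For readers who want to build a similar figure themselves, here is a minimal sketch of a raincloud-style plot (individual papers as points plus a kernel density estimate), assuming a hypothetical CSV with 'year' and 'days_to_accept' columns; this is not the original code behind the figures discussed here:

```python
# Simplified raincloud-style plot: raw points ("rain") plus a kernel density
# estimate ("cloud") of turnaround times, one row per year.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

df = pd.read_csv("mdpi_turnaround.csv")      # hypothetical file and columns
df = df[df["days_to_accept"] <= 150]         # cut at 150 days, as in the text

fig, ax = plt.subplots(figsize=(8, 4))
for i, (year, group) in enumerate(df.groupby("year")):
    days = group["days_to_accept"].to_numpy()
    # "Rain": jittered individual papers below the baseline
    jitter = np.random.uniform(-0.15, 0.15, size=len(days))
    ax.scatter(days, np.full(len(days), i) + jitter - 0.3, s=2, alpha=0.3)
    # "Cloud": kernel density estimate drawn above the baseline
    grid = np.linspace(0, 150, 300)
    density = gaussian_kde(days)(grid)
    ax.fill_between(grid, i, i + density / density.max() * 0.5, alpha=0.5)
    ax.text(152, i, str(year), va="center")

ax.set_xlabel("Days from submission to acceptance")
ax.set_yticks([])
plt.tight_layout()
plt.show()
```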
Are there differences by journal, or field? After all, we are talking about different people from different fields and, thinking back to the Special Issue explosion, about an army of heterogeneous and uncoordinated guest editors. Below you find the distribution of turnaround times for the main MDPI journals (cut at 150 days).
The really striking finding here is that there is virtually no heterogeneity left. The picture from 2016 is as you would expect: fields differ, journals differ. Some journals have faster, others slower turnaround times. Distributions are quite flat, and anything can happen, from a very fast acceptance to a very long time spent in review. Fast forward to 2020, and it is a very different world. Each and every journal's turnaround-time distribution shrinks and hits an upper bound around 60 days. The mean is similar everywhere, and the variance is not far off.
This convergence cannot happen without strong, effective coordination. The data unambiguously suggest that MDPI was very effective at imposing fast turnaround times on every one of its leading journals, despite the fact that, under the Special Issue model, the vast majority of papers are handled by hundreds of different guest editors, which should increase heterogeneity.
Hundreds of different editorial teams, on dozens of different topics, and one common feature: about 35 days from submission to acceptance, including revisions. The fact that revisions are included makes the result even more striking. About 17% of all papers submitted to MDPI in 2020 (25,000 papers) were accepted within 20 days of submission, including revisions. About 45% of submitted papers (66,000 papers) were accepted within 30 days.
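Shares like these are straightforward to compute once submission and acceptance dates are available; a minimal sketch (hypothetical file and column names) could look like this:

```python
# Share of papers accepted within N days of submission, given a table with
# (hypothetical) 'submitted' and 'accepted' date columns.
import pandas as pd

df = pd.read_csv("mdpi_papers_2020.csv", parse_dates=["submitted", "accepted"])
days = (df["accepted"] - df["submitted"]).dt.days

for threshold in (20, 30):
    within = days <= threshold
    print(f"Accepted within {threshold} days: "
          f"{within.mean():.1%} ({int(within.sum()):,} papers)")
```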
MDPI's strategy boils down to exploiting its rent, as fast and as aggressively as it can, through the Special Issue model. The original journals' reputations are used as bait to lure in hundreds of Special Issues, and thousands of papers, in a growth with no limit in sight.
The promotion of questionable special issues is one of the identifying characteristics of predatory journals. Certainly, the number of special issues published or scheduled yearly reveals no quality-related information in itself, but the fact that the number of special issues in MDPI journals is so much higher than the number of ordinary issues per year inevitably raises suspicions of a lucrative business aim. Since APC-based open-access publishing remunerates publishers according to how many articles they publish, it can create perverse incentives to accept as many articles as possible in order to maximize revenue. Predatory journals operate in exactly this manner, eschewing legitimate peer review and other forms of quality control and producing an excessive number of articles, often of inferior quality.
Among these practices are publication in record time, acceptance of papers after the Special Issue deadline, and publication of papers in a Special Issue by its own guest editors. Recently, what has been called a "review mill" at MDPI was also uncovered: a series of dozens of very similar fake reviews spread across several of the publisher's journals.
MDPI Self-Citation Problem
In the gold open-access model, reading the publications is free, and the publication costs, collected through the APC, are borne by the authors, their institutions, and funding bodies. A predatory journal exploits this model to its own benefit with a nonexistent or practically nonexistent peer-review process. Selective databases, such as Scopus, PubMed, and Web of Science, form an index of journals, a sort of whitelist used for assessing researchers and making decisions on grant funding. However, some articles from some predatory journals are in fact indexed, both in PubMed and in Scopus.
This investigation analyses MDPI through Journal Citation Reports (JCR), an annual report on the citation impact of a defined set of journals and perhaps the most prestigious, best recognized, and most widely used index in academia. MDPI has long been open to question, a dubious publisher moving within a "grey zone". It deserves further analysis to determine whether it is using a broad range of questionable tactics that are neither illegal nor easy to detect.
JCR-indexed MDPI journals (2018 edition) were selected for the analysis (53 out of 218). As a control group for comparison, leading journals in each category were selected. The first relevant finding is that some MDPI journals use names very similar to those of journals with established reputations, one of the characteristics of predatory journals, by which they prey upon younger researchers and those less well informed about predatory publishing. This is the case, for example, of the MDPI journals Cells, Cancers, Polymers, Remote Sensing, Animals, and Genes, remarkably similar to longer-established journals published by Elsevier (Cell, Polymer, Gene, Remote Sensing of Environment), Wiley (Cancer), and Cambridge University Press (Animal).
The impact factors of all journals were reduced when self-citations were removed. The drop in the impact factor ranged from 38.96% in the case of Sustainability to 0.68% for Medicina, with an average reduction of 14.8% in the value of the journal impact factor following the removal of self-citations. According to Clarivate, self-citation in the Web of Science ranges from 0% to 15%. Sustainability and Electronics showed exceptionally high self-citation rates (27.69% and 27.46%, respectively), followed by Minerals (26.15%). All journals except the International Journal of Molecular Sciences increased their self-citation rates between 2018 and 2019 (by between 2.7 and 16.9 percentage points).
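The percentage drops reported above follow mechanically from comparing the JIF with and without journal self-citations in the numerator; here is a minimal sketch of that calculation (the citation counts below are made up for illustration, not the actual figures from the study):

```python
# Percentage drop in the Journal Impact Factor (JIF) once self-citations are
# removed. JIF = citations in year Y to items published in Y-1 and Y-2,
# divided by the number of citable items published in Y-1 and Y-2.
def jif_drop(citations: int, self_citations: int, citable_items: int) -> float:
    """Percentage reduction of the JIF after removing self-citations."""
    jif_with = citations / citable_items
    jif_without = (citations - self_citations) / citable_items
    return 100 * (jif_with - jif_without) / jif_with  # = 100 * self_citations / citations

# Illustrative (made-up) numbers: 10,000 JIF-window citations of which 2,769
# are self-citations imply a 27.69% self-citation share and the same
# percentage drop in the JIF.
print(f"{jif_drop(10_000, 2_769, 4_000):.2f}% reduction")
```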
Comparing MDPI with the main scientific publishers, the impact of self-citation on the Journal Impact Factor (JIF) is evident.
The data used in the image above cover all 98 MDPI journals that had a JIF (February 2023). From each of the other publishers, a sample of similar size was selected. The difference in JIF between MDPI and the other publishers, which are known for their excellence, is clear.
Citations serve as an immediate marker of research influence and significance, reflecting the judgments that researchers themselves make when acknowledging important work. Nevertheless, citations must be considered carefully and in context. For validity in assessing the impact of published research, citation analysis must control for such variables as subject field, document type and year of publication.
The new Journal Citation Indicator (JCI) meets this requirement for journal evaluation, providing a single number that accounts for the specific characteristics of different fields and their publications. The JCI calculation controls for different fields, document types (articles, reviews, etc.) and year of publication. The resulting number represents the relative citation impact of a given paper as the ratio of citations compared to a global baseline. A value of 1.0 represents world average, with values higher than 1.0 denoting higher-than-average citation impact (2.0 being twice the average) and lower than 1.0 indicating less than average.
The image below shows the JCI comparison for various publishers, with the average for MDPI well below 1.
To calculate the JCI, the same sample of 98 journals was used. The median JCI for MDPI was 0.76. Elsevier has the highest value, 1.6. The second-lowest value is that of Taylor and Francis, at 0.895; all the others have values greater than 1.0.
In essence, the Journal Citation Indicator provides a field-normalized measure of citation impact where a value of 1.0 means that, across the journal, published papers received a number of citations equal to the average citation count in that subject category.
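A minimal sketch of that normalization logic follows; it illustrates the general idea of a field-normalized indicator, not Clarivate's exact JCI implementation (which averages over a recent three-year window of citable items):

```python
# Field-normalized citation impact: each paper's citation count is divided by
# the baseline (expected citations) for papers of the same field, document
# type, and publication year; the journal-level value is the mean of these
# ratios. A value of 1.0 corresponds to the world average.
def normalized_impact(papers, baselines):
    """papers: list of (citations, field, doc_type, year) tuples.
    baselines: dict mapping (field, doc_type, year) -> expected citations."""
    ratios = [
        citations / baselines[(field, doc_type, year)]
        for citations, field, doc_type, year in papers
    ]
    return sum(ratios) / len(ratios)

# Illustrative (made-up) example: one paper at half its baseline and one at
# 1.5 times its baseline average out to exactly the world average of 1.0.
papers = [(5, "ecology", "article", 2020), (15, "ecology", "article", 2020)]
baselines = {("ecology", "article", 2020): 10}
print(normalized_impact(papers, baselines))  # 1.0
```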
In 2023, the International Journal of Environmental Research and Public Health (IJERPH), one of MDPI's flagship journals, with an IF of 4.614, was removed from Clarivate's JCR, losing its Impact Factor (the Journal of Personalized Medicine, with an IF of 3.508, was also removed). From Clarivate's JCR 2023 we can see that the journal that most cites IJERPH, and is most cited by it, is IJERPH itself: a clear self-citation problem.
A picture is worth a thousand words...
Top 20 journals citing IJERPH
Top 20 journals cited by IJERPH
By way of comparison with a reference journal in the area, The Lancet presents a very different citation pattern.
Remove our publisher from your list
Over the years that I published my blog and its lists, publishers and standalone journals constantly tried various means of getting off the lists. Over time the requests to remove journals and publishers increased in number, as more and more universities recommended the lists or used them as official blacklists. The methods publishers used also became more intense.
Often owners of predatory publishing operations would email me, extolling the virtues of their journals, describing the rigor of their peer review and the credentials of their esteemed editorial boards. Some of them did a self-analysis using the criteria document I used and made available, and without exception these self-analyses found that the publisher didn’t meet any of the criteria – not even close – and deserved to be removed from the list immediately.
Others used more aggressive strategies. Some publishers, especially the publishers of standalone mega-journals, would send me emails making false accusations about my work, my ethics, and my ability to make judgments about journals and publishers. The publishers were driven by money, competition, and greed, and they sought to remove any obstacle standing in the way of increased revenue; my lists were one such obstacle.
Still others tried different strategies. Some tried annoying me with numerous emails and letters, often sent as PDF attachments on fancy letterhead, informing me how I was hurting their reputation. They kept sending these emails, trying to be as annoying as possible in the hope that I would get so tired of them that I would remove the publisher from the list just to make them stop. The publisher MDPI used this strategy.
I was also always surprised at the extent to which researchers who had published in one or more of a predatory publisher's journals became the publisher's biggest defenders. It is as if they felt a sense of loyalty to the publisher. I think this was because most of these defenders had had numerous articles rejected, many times, by legitimate scholarly publishers. Upon finally finding a publisher willing to accept and publish their work, they became elated and did everything possible to protect and defend that publisher, especially against its inclusion on blacklists. Researchers love publishers who accept and publish their papers, especially researchers whose work is consistently rejected by journals from respected publishers, that is to say, journals that carry out rigorous peer review.
Is MDPI a predatory publisher? ChatGPT's answer:
"The publisher MDPI (Multidisciplinary Digital Publishing Institute) has been the subject of criticism and controversy regarding its publishing practices, particularly in the past. Some researchers and academics have accused MDPI of being a predatory publisher due to concerns about its peer review process, editorial standards, and publishing fees.
However, it is worth noting that MDPI has made efforts to address these concerns and improve its practices in recent years. For example, the publisher has introduced more rigorous editorial policies and established partnerships with other reputable publishers. Additionally, MDPI is a member of the Committee on Publication Ethics (COPE) and has been recognized by the Directory of Open Access Journals (DOAJ).
As with any publisher, it is important to carefully evaluate the quality and reputation of individual journals within MDPI's portfolio before submitting your work. You should also consider the fees associated with publishing in these journals and ensure that they are reasonable and transparent. Ultimately, it is up to each individual researcher to decide whether or not they feel comfortable publishing with MDPI." (February, 2023)
Conclusions
The results showed that the MDPI journals under analysis meet some of the criteria for the identification of predatory journals and deviate from best editorial and publication practices, for example by mimicking the names of established journals and advertising rapid publication. The COPE/DOAJ/OASPA/WAME Principles of Transparency and Best Practice in Scholarly Publishing stipulate that journal names should not be easily confused with those of other journals and that journal websites should not guarantee very short peer-review times.
Additionally, the constant and quite exceptional increase in the number of articles published in MDPI journals between 2018 and 2019, reinforced by an exponential increase in the number of special issues (which easily outweigh the number of regular publications) and accompanied by an opportune increase in APC fees, raises serious concerns about the legitimacy of MDPI as a publisher, at the very least because its APC-based business model alters the economic and scientific incentives in academic publishing.
It is well known that the direct relation between income and the number of accepted manuscripts prompts predatory journals to conduct cursory peer reviews, keeping the rejection rate minimal so that ample economic returns are guaranteed. These results showed that the MDPI journals under analysis fit the definition of predatory journals, as their behaviour indicates that they prioritize self-interest and forsake best editorial and publication practices with regard to self-citations and citations from other journals of the same publisher.
You can also access our List of Predatory journals here. We also provide a search tool here.
References
Beall, J. (2012). Predatory publishers are corrupting open access. Nature, 489, 179. https://doi.org/10.1038/489179a
Beall, J. (2017). What I learned from predatory publishers. Biochemia Medica, 27(2), 273-278. https://doi.org/10.11613/BM.2017.029
Crosetto, P. (2021). Is MDPI a predatory publisher? Available at: https://paolocrosetto.wordpress.com/2021/04/12/is-mdpi-a-predatory-publisher/
Oviedo-García, M. A. (2021). Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation, 30(3), 405-419. https://doi.org/10.1093/reseval/rvab020
Shen, C., & Björk, B. C. (2015). 'Predatory' open access: a longitudinal study of article volumes and market characteristics. BMC Medicine, 13, 230. https://doi.org/10.1186/s12916-015-0469-2
Sorokowski, P., Kulczycki, E., Sorokowska, A., et al. (2017). Predatory journals recruit fake editor. Nature, 543, 481-483. https://doi.org/10.1038/543481a
Wellen, R. (2013). Open Access, Megajournals, and MOOCs: On the Political Economy of Academic Unbundling. SAGE Open, 3(4). https://doi.org/10.1177/2158244013507271
Xia, J., Harmon, J. L., Connolly, K. G., et al. (2015). Who publishes in "predatory" journals? Journal of the Association for Information Science and Technology, 66, 1406-1417. https://doi.org/10.1002/asi.23265
See our full list of predatory journals and publishers: