Abstract
BACKGROUND AND PURPOSE: Self-plagiarism is a form of research misconduct that can dilute the credibility and reputation of a scientific journal, as well as the represented specialty. Journal editors are aware of this problem when reviewing submissions and use on-line plagiarism-analysis programs to facilitate detection. The American Journal of Neuroradiology (AJNR) uses iThenticate to screen several submitted original research manuscripts selected for review per issue and retrospectively assesses 3 issues per year. The prevalence of self-plagiarism in AJNR was compared with that in Radiology; the necessity and cost of more extensive screening in AJNR were evaluated.
MATERIALS AND METHODS: The self-duplication rate in AJNR original research articles was compared with that in Radiology, a general imaging journal that screens all submitted original research manuscripts selected for review by using iThenticate. The rate of self-duplication in original research articles from 2 randomly selected 2012 AJNR issues was compared with the rate in the prior year to gauge the need for more extensive screening. A cost analysis of screening all submitted original research manuscripts selected for review by using iThenticate was performed.
RESULTS: Using an empiric 15% single-source duplication threshold, we found that the rate of significant self-plagiarism in original research articles was low for both journals. While AJNR had more articles exceeding this threshold, most instances were judged insignificant. Analyzing 2 randomly chosen 2012 issues of AJNR for single-source duplication of >15% in original research articles yielded no significant differences compared with an entire year. The approximate annual cost of screening all submitted original research manuscripts selected for review would be US $6800.00.
CONCLUSIONS: While the rate of self-plagiarism was low in AJNR and similar to that in Radiology, its potential negative impact on AJNR and the subspecialty of neuroradiology justifies the cost of broader screening.
ABBREVIATIONS:
- AJNR: American Journal of Neuroradiology
- ORA: original research article
- SORMSR: submitted original research manuscript selected for review
The appropriation of previously published ideas or words without proper credit is known as “plagiarism” and is considered a major breach of ethics in scientific publication. In 2005, the National Science Foundation identified plagiarism in nearly 66% of suspected cases of fraud.1 The same institution implemented a plagiarism-checking system and discovered that 1.5% of the nearly 8000 grants accepted in 2011 contained significant plagiarism.2 The recognized significance of plagiarism has merited its inclusion in the federal definition of “research misconduct”: “fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results.”3 While plagiarism may involve the use of the work of others, the use of one's own work without acknowledging its source is known as “redundant publication” or “self-plagiarism” and is no less significant.4 The pressure of academic productivity, as measured by the number of one's publications or how often one is cited, creates an incentive for authors to publish a greater volume of work, which may increase the likelihood of self-plagiarism.5,6
The presence of sporadic phrases reproduced from an author's own prior work, particularly from “Materials and Methods” sections of manuscripts based on similar research methodologies, may be acceptable at the discretion of an editor and not necessarily suspected as intentional duplication.7 However, copying entire sections or illustrations of one's own prior work without appropriate attribution is not acceptable and constitutes self-plagiarism. While there are various classifications of self-plagiarism, a well-accepted one developed by Miguel Roig defines 4 distinct types: recycling of text, copyright infringement, “salami slicing” (dividing 1 study into multiple publications to increase apparent productivity), and duplicate publication.8–10
Editors of scientific journals are aware of the potential for self-plagiarism when reviewing submissions, and the use of electronic plagiarism-analysis software has greatly facilitated its detection. Widely used plagiarism-analysis programs are iThenticate (http://www.ithenticate.com) in scientific publishing and Turnitin (http://turnitin.com) in general education.11 These programs are similar in that they compare selected documents with databases of published or submitted articles, including “CrossCheck” for iThenticate and “OriginalityCheck” for Turnitin.12,13 The iThenticate database enables manuscripts to be compared with >37 billion Web pages, >92 million published works from on-line and off-line research publications and databases, and >37 million scholarly articles through the “CrossCheck” database, the largest comparison database of scientific, technical, and medical publications in the world.14,15 Analysis of a document by one of these programs generates an overall “similarity index” with works in the database as well as an interactive summary of the individual sources with which the document overlaps.16,17 In one study, plagiarism was detected in 3% of assignments submitted by adult learners when manually evaluated for significant overlap, but when the assignments were evaluated with Turnitin, plagiarism was identified in 13%.18
Given the awareness of plagiarism and self-plagiarism as well as the ease of use and availability of electronic plagiarism analysis, it is convenient for scientific journals to use such software for evaluation. At the time of this study, Radiology had used iThenticate to screen all manuscripts for duplication since February 2013 (D. Levine, MD, personal e-mail communication, October 2014), and the American Journal of Neuroradiology (AJNR) had selectively used the iThenticate service for approximately 3 submitted original research manuscripts selected for review (SORMSRs) per issue since 2012. AJNR publishes about 300–350 original articles per year; 3 issues per year are now randomly selected for retrospective review with iThenticate to evaluate the presence of duplication. In addition, all manuscripts originating from authors who have been associated with previous fraudulent behavior are prospectively assessed with iThenticate before undergoing peer review, representing approximately 3 SORMSRs per issue.
When one analyzes manuscript submissions by using electronic plagiarism software, the acceptable percentage of similarity between scientific publications is not well-defined. According to the Turnitin Web site instructions for interpreting the “similarity index,” there are no clear rules that define when plagiarism has taken place. However, the company previously disseminated guidelines to academic institutions suggesting a 15% “similarity index” as a threshold for alerting editors to the possibility of plagiarism.16,19–22 Organizations involved in promoting ethics and best practices in scientific editing (International Committee of Medical Journal Editors, International Society of Managing and Technical Editors, Committee on Publication Ethics) do not define a specific numeric cutoff for the identification of plagiarism.23–25 While AJNR has adopted the 15% guideline, other journals set the threshold as low as 10%.7
In this study, we first sought to evaluate the rate of duplication in published AJNR original research articles to determine whether it is comparable with that in the general radiology literature and whether screening a random sample of SORMSRs, rather than all of them, is sufficient. We also analyzed the cost of screening all SORMSRs for potential duplication with iThenticate to assess the economic impact on AJNR.
Materials and Methods
Part 1
We selected articles published in AJNR and Radiology in 2011 under the heading of “Original Research.” Review articles, editorials, commentaries, vignettes, case reports, clinical reports, and so forth were excluded because these do not constitute most articles in an imaging journal and were thought to be more susceptible to differences in opinion regarding the significance of flagged content. All “Original Research” articles published in Radiology and AJNR in 2011 were uploaded into iThenticate as PDF files, and an analysis was run. iThenticate parameters were set to exclude quotations and bibliographic entries from comparison and to ignore single-source duplications of ≤3%, given the likelihood of random similarity at or below this threshold. A 15% single-source “similarity index” threshold was set for flagging manuscripts needing further evaluation.16,19–22 When an article exceeded this 15% threshold, 2 individuals (Editor-in-Chief, Managing Editor) separately assessed the duplications and rated them as significant or insignificant. The presence of plagiarism and self-plagiarism was evaluated by comparing the authors of the SORMSRs with the authors of the prior publications identified by iThenticate as having >15% total similarity. If any author of the SORMSR was an author of the flagged prior publication, self-plagiarism was suspected and the manuscript was further assessed for significance. The presence of sporadic phrases from prior works was ignored, while the presence of large blocks of text was judged as significant or insignificant.
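For illustration, the screening rules above (ignore single-source overlaps of ≤3%, flag any single-source overlap of >15%, then check for shared authorship to separate suspected self-plagiarism from duplication of other authors' work) can be sketched in a few lines of code. This is a hypothetical sketch of the decision logic only; the iThenticate reports in this study were reviewed manually, and the match structure shown here is an assumption for illustration, not part of any iThenticate interface.

```python
# Hypothetical sketch of the manual screening rules described above; the
# "match" structure is assumed for illustration and is not an iThenticate API.

def screen_manuscript(matches, manuscript_authors,
                      ignore_below=3.0, flag_above=15.0):
    """matches: list of dicts, each with 'percent' (single-source overlap)
    and 'source_authors' (authors of the matched prior publication)."""
    # Ignore small single-source overlaps likely due to chance similarity.
    considered = [m for m in matches if m["percent"] > ignore_below]
    flagged = [m for m in considered if m["percent"] > flag_above]
    if not flagged:
        return "below threshold: no further review"
    # Shared authorship with a flagged source suggests possible self-plagiarism,
    # which two editors would then rate as significant or insignificant.
    for m in flagged:
        if set(manuscript_authors) & set(m["source_authors"]):
            return "suspected self-plagiarism: assess significance"
    return "flagged: possible duplication of other authors' work"

# Example with made-up data
print(screen_manuscript(
    matches=[{"percent": 18.2, "source_authors": ["Author A", "Author B"]}],
    manuscript_authors=["Author B", "Author C"],
))
```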
A significant duplication was characterized as duplicate publication of the same article with no or minor modifications or unreasonable duplication of large or small blocks of text from any portion of the manuscript without appropriate modification or citation of the prior publication. This characterization was consistent with published recommendations for classification of significant (major or minor) duplications.26,27 When differences occurred between the 2 evaluators in the judgment of duplication as significant or insignificant, these were resolved by consensus. Total significant single-source duplications of original research articles (ORAs) in AJNR and Radiology were compared by using the Mann-Whitney U test, with the null hypothesis that there is no difference between the ranks of the significant single-source duplications of ORAs in AJNR and Radiology. Statistical analysis was performed by using MedCalc for Windows, Version 14.10.2, 64-bit (MedCalc Software, Mariakerke, Belgium).
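The Mann-Whitney U test was run in MedCalc; for readers who prefer an open-source route, an equivalent two-sided test can be run in SciPy as sketched below. The per-issue counts are invented for illustration and are not the study data.

```python
# Illustrative only: equivalent two-sided Mann-Whitney U test in SciPy.
# The per-issue duplication counts below are invented, not the study data.
from scipy.stats import mannwhitneyu

ajnr_per_issue = [2, 1, 0, 1, 2, 1, 0, 1, 1, 2, 1]          # 11 issues (hypothetical)
radiology_per_issue = [1, 0, 1, 1, 0, 1, 2, 0, 1, 1, 0, 1]  # 12 issues (hypothetical)

u_stat, p_value = mannwhitneyu(ajnr_per_issue, radiology_per_issue,
                               alternative="two-sided")
print(f"U = {u_stat}, P = {p_value:.4f}")  # P > .05: fail to reject the null hypothesis
```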
Part 2
We assessed the value of analyzing all versus a small sample of submitted manuscripts for the presence of duplication by evaluating all “Original Research” articles in 2 randomly selected issues of AJNR in 2012 (March and September) and comparing the results with those obtained for an entire year (2011).
Part 3
An analysis was performed to assess the total current and potential future annual cost of using iThenticate to evaluate all SORMSRs for self-plagiarism, factoring in the fixed costs (iThenticate annual subscription fee) and variable costs (iThenticate manuscript analysis fee and time of the AJNR staff) of screening.
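The cost model is a simple fixed-plus-variable calculation, sketched below to show its structure. The subscription fee, per-manuscript analysis fee, and staff time values in the example call are placeholders, not the figures from Table 3.

```python
# Structure of the fixed-plus-variable cost model described above.
# All numeric inputs in the example call are placeholders, not Table 3 values.

def annual_screening_cost(n_manuscripts, subscription_fee,
                          analysis_fee_per_ms, staff_minutes_per_ms,
                          staff_hourly_wage):
    """Fixed annual subscription plus per-manuscript analysis and staff costs."""
    staff_cost_per_ms = staff_minutes_per_ms / 60.0 * staff_hourly_wage
    return subscription_fee + n_manuscripts * (analysis_fee_per_ms + staff_cost_per_ms)

# Example with made-up inputs, only to show how the components combine:
print(annual_screening_cost(n_manuscripts=300, subscription_fee=100.0,
                            analysis_fee_per_ms=20.0,
                            staff_minutes_per_ms=10, staff_hourly_wage=25.0))
```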
Results
Part 1
In 2011, AJNR was published 11 times and contained a total of 302 original research articles (x̄ = 27.5 per issue, σ = 2.73), while Radiology was published monthly with a total of 343 original research articles (x̄ = 28.6 per issue, σ = 1.31) (Table 1).
In AJNR, iThenticate found 54 articles (17.9%) with duplication rates of >15%, while 12 (3.5%) such articles were found in Radiology. In AJNR, 12 articles (4%) were found to have duplication rates of >15% from a single source, while in Radiology, only 2 (0.6%) such articles were found. After individual evaluation of these articles, only 2 in AJNR and 1 in Radiology were judged by consensus agreement to contain significant single-source duplications consistent with self-plagiarism. In AJNR, this finding led the editors to contact the authors and request explanations; it was concluded that most of the duplication arose from similarities in the “Materials and Methods” sections.
The total number of identified “single-source” duplications in original research articles in both journals was compared (Table 2).
Table 2 shows the results of the Mann-Whitney U test for the number of significant single-source duplications in original research articles published in AJNR and Radiology in 2011, which showed no statistically significant difference (Z = 0.606; P = .5447). The rank average of the number of single-source duplications in original research articles published in AJNR was 12.59, and in Radiology, it was 11.45. These similar rank averages indicate that the number of significant single-source duplications in original research articles in AJNR and Radiology was nearly equal.
Part 2
Two AJNR issues from 2012, selected at random (March and September), contained 25 and 27 original research articles, respectively. In these 2 issues, 5 and 4 articles surpassed our 15% duplication threshold, respectively, compared with an average of 4.9 (σ = 1.92) articles per issue in 2011. Because these values fell within 1 SD of the annual mean, we believe that the rate of articles surpassing the 15% threshold is likely constant throughout the year. The number of articles in which the duplication rate of >15% originated from a single source was 3 in March and 1 in September (x̄ = 1.09 per issue in 2011, σ = 1.04). Individual evaluation of these 4 articles demonstrated no significant self-plagiarism accounting for the duplication above the 15% threshold.
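The comparison above amounts to checking whether the sampled issues fall within 1 SD of the annual per-issue mean; a minimal sketch using the figures reported in the text follows.

```python
# Do the sampled 2012 issues fall within 1 SD of the 2011 per-issue mean?
# Figures are those reported in the text.
annual_mean, annual_sd = 4.9, 1.92          # articles/issue exceeding the 15% threshold, 2011
sampled = {"March 2012": 5, "September 2012": 4}

for issue, count in sampled.items():
    within_1sd = abs(count - annual_mean) <= annual_sd
    print(f"{issue}: {count} articles; within 1 SD of annual mean: {within_1sd}")
```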
Part 3
We evaluated the cost of analyzing an entire year of submitted manuscripts, factoring in the financial cost of using iThenticate and wage hours spent by AJNR staff to perform this task (Table 3).
In 2013, an average of 3 articles per issue (36 articles per year) were prospectively flagged for evaluation with iThenticate because of previous duplication problems involving their authors, at an annual cost of US $840.72. If AJNR were to evaluate all submitted articles by using iThenticate, the annual cost would rise by US $5963.76 to a total of US $6804.48.
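As a quick check of the reported figures, the projected total is simply the current cost plus the estimated increase:

```python
# Reported figures (US $): current annual cost for the ~36 prospectively flagged
# manuscripts, plus the estimated increase if every SORMSR were screened.
current_cost = 840.72
added_cost = 5963.76
print(f"Projected annual cost: US ${current_cost + added_cost:.2f}")  # 6804.48, ~$6800
```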
Discussion
In comparing AJNR and Radiology, our results indicate that both journals published about the same number of articles under the category of “Original Research” per issue but that the rate of nonsignificant duplication was higher in AJNR. When the origin of the nonsignificant duplications in AJNR was evaluated, it was found to be mostly secondary to similarities in research methodologies used in previously published research by the same authors. The discrepancy between the 2 journals may reflect differences in journal standards regarding screening of SORMSRs for self-duplication and republication of nonsignificant duplication of research methodologies. When these articles were individually studied for significant duplications, only 2 in AJNR and 1 in Radiology were considered to exceed the 15% threshold (0.66% of articles in AJNR and 0.29% of articles in Radiology). As a result of the identified duplications, AJNR editors contacted the authors and requested an explanation (M. Castillo, MD, AJNR e-mail correspondence with manuscript authors regarding duplication, 2012). After careful deliberation by an internal committee, it was concluded that the similarities arose mostly from the “Materials and Methods,” and the matters were not pursued further.
In cases of proven self-plagiarism, the Committee on Publication Ethics guidelines recommend taking further action, which may include informing the author's institution (chairperson, dean of the medical school, institutional review board, and so forth), flagging the authors, scrutinizing their future submissions, and/or potentially prohibiting them from publishing in that journal for a time or indefinitely, and even retracting the article in question from PubMed.28,29 The economic costs of dealing with research misconduct are nonnegligible; while iThenticate has reported an estimate of between $10,000 and $50,000 in capital losses due to incidents of research misconduct among nearly 200 organizational clients surveyed, the direct costs of investigating a single case of research misconduct have been estimated to approach $525,000.30,31 The additional costs of published research misconduct include broader detrimental effects on the credibility and reputation of a scientific journal and the specialty it represents.32 While these effects are difficult to quantify in monetary terms, they may be long-standing, irreversible, and potentially devastating.
Ideally, this sequence of events could be avoided by prospectively checking for duplications when articles are first received. This may, however, delay the review process, may be financially prohibitive for smaller journals, and, in the case of AJNR, would entail checking the approximately 75%–80% of SORMSRs that are ultimately rejected. Because of this, the second question we sought to answer was whether randomly studying a sample of published articles yields results similar to studying all articles published in 1 year.
In 2011, AJNR published an average of 27.45 original articles in each of its 11 issues. The average number of articles per issue surpassing the 15% duplication rate was 4.91, which nearly matched the numbers in the same category in the 2 randomly chosen 2012 issues (March: 5 and September: 4). The average number of articles per issue containing a >15% overlap from a single source (1.09) was also broadly comparable with that found in the 2 individual issues (March: 3 and September: 1). Therefore, in our study, sampling only 2 randomly selected issues yielded results similar to analyzing an entire year.
While the financial and time costs of prospectively evaluating all AJNR submissions (approximately US $6800 per annum) may not seem feasible for smaller journals or justified by the low rate of duplication identified in this study, they are modest compared with the potential economic costs of dealing with incidents of research misconduct and with the broader detrimental effects on journal credibility and reputation and on professional and public perception and trust of the represented subspecialty, effects that may be long-standing and irreversible.
Conclusions
Using the suggested empiric duplication threshold of 15%, which AJNR has adopted, we found that the number of original articles with significant duplicated content was low for both AJNR and Radiology. While AJNR had a greater number of articles showing >15% overlap than Radiology, most duplications were not considered significant. When these articles were individually studied, self-plagiarism was found to account for all of the duplicated content, most of which arose from the “Materials and Methods” and was ultimately considered unimportant. Last, analyzing 2 randomly chosen issues of AJNR yielded no significant difference in the number of articles with duplications of >15% compared with an entire year. However, in light of the potentially enormous economic cost of dealing with incidents of research misconduct and the adverse effects on the journal's credibility and reputation and on perception and trust of its represented subspecialty, more extensive screening of SORMSRs may provide a cost-effective safeguard.
- Received September 20, 2014.
- Accepted after revision November 13, 2014.
- © 2015 by American Journal of Neuroradiology