Good publication practices: declaration, approval, and now enforcement of reporting standards

Abstract

This article introduces a new submission policy for original research manuscripts. Starting next year, such manuscripts will be considered only if they comply with the reporting guidelines recommended by the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network and are approved by an ethics committee during the planning stage. The information to be included in a manuscript upon submission has been expanded. When these rules are fully implemented, prior study registration and submission of primary research data to the editorial office (with subsequent publication upon article acceptance) will become mandatory for manuscript consideration. These changes aim to transition the journal from organic growth to controlled development according to the principles of scientific integrity.

Full Text

Conceptually, there is no difference between simply endorsing and requiring without enforcement.

EQUATOR Network

INTRODUCTION, OR RESULTS OF ORGANIC GROWTH

Since the inaugural issue of Digital Diagnostics in 2020, nearly 350 articles of various categories have been published, with one in every four containing new data: original research, case reports, systematic reviews, technical reports, and datasets.1 These publications have been cited 176 times, accounting for 43% of the journal's total citations (n = 407), which reflects the considerable interest in new findings published in the journal. This is due in part to the journal ranking at the top of the SCIENCE INDEX2 among Russian medical journals in diagnostic radiology as of 2023. This was accomplished almost entirely through organic growth; the one exception was active promotion in the international medical community through translation of articles into English and Chinese. Otherwise, the journal has evolved steadily, balancing editorial capabilities against the scope and quality of research in diagnostic radiology in Russia. Judging by submitted papers, the quality of research has not changed considerably; however, the number of submissions is growing steadily (from 62 in 2022 to 161 in 2023 and 203 in 2024). Under these conditions, it is difficult to maintain a consistently high quality of published content and to ensure its use (citation) by Russian and international researchers. Thus, it is time to shift from organic to controlled growth.

CONTROLLED GROWTH: DO WHAT YOU MUST

Every scientific journal's primary purpose is to ensure that submitted materials are thoroughly reviewed before being published and disseminated, in order to provide quality content. However, achieving this goal is not easy. Numerous publications about insufficient content quality, including in top-rated journals, indicate that no one has yet achieved complete success. There are three main causes behind this:

  • Low quality of submitted research and reports (manuscripts);
  • Low efficacy of peer review (including when the journal's editors are involved);
  • Researchers' misconduct (concealing, distorting, or even fabricating research findings).

We will not go into detail on how these statements are justified. This issue is certainly relevant for today's scientific journals, including Russian medical journals [1–6] and researchers [7, 8]. Therefore, the central question (which has remained for several decades) is: What should we do?

Indeed, how can a journal improve the quality of its publications in the near future while staying on budget? Before going further, let us define the quality of a research paper and identify which of its elements can be influenced by a scientific journal's editorial team. First, the quality of a research paper is determined by the quality of the study and of the study report. Second, the quality of the study and the study report is defined as compliance with existing scientific, ethical, and legal regulations, guidelines, and standards. Third, it is crucial to acknowledge that changing the quality of research in a specific field is beyond the competence of a specialized journal; this requires nationwide collaboration among medical journal editors and publishers, ethics committees, grantmakers, and the management of research institutions. The quality of study reports, however, is within our control. Our goal is to publish papers that contain essential information (sufficient for a comprehensive assessment and replication of the study), while excluding works that violate basic scientific and ethical standards. These include the Declaration of Helsinki3 and the guidelines of the Council for International Organizations of Medical Sciences [9], as well as legal regulations (e.g., rules for the use and disclosure of personal and confidential medical information, copyright regulations, and state-regulated rules for conducting clinical trials).

Today, ethics review (preliminary or, in some cases, continuing) is mandatory for all health-related research involving humans, including human materials and data, as is prior registration of such studies3 (covering not only protocol and safety, but also the expertise of researchers) [9]. Source data must be reviewed as well (including by editorial teams) [10]. Finally, good publication practice requires the use of reporting guidelines when preparing reports on planned and completed studies; these guidelines are collected by the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network.4 Our goal is to ensure that manuscripts contain essential information and that the data used in the study are correct. Therefore, our top priority is to gradually introduce reporting guidelines into the editorial workflow (transition period: until the end of 2025), together with a mandatory conclusion from an ethics committee or other authorized body with similar functions for all submitted manuscripts. Once these changes are implemented, all submitted manuscripts will additionally require preliminary (prior to initiation) registration of the research and submission of source data for subsequent publication (provided that the work is accepted for publication and there are no data access restrictions).

IMPLEMENTING REPORTING GUIDELINES INTO THE EDITORIAL WORKFLOW

Standardization of research content is necessary to prevent the publication of incomplete, incorrect, or misleading information, a common issue in scientific publications. The extent of the problem cannot be overstated [11]. The major concern appears to be the incompleteness of published data, which prevents critical assessment of research and future use of study findings, devaluing the financial and human resources invested in research [12]. To address this particular issue, the international EQUATOR Network initiative was launched in 2008.4 The EQUATOR Network creates, updates, and disseminates reporting guidelines, supports the development of new guidelines, and assists scientific journals and research institutions in implementing adequate research reporting.

The EQUATOR Network4 defines a reporting guideline as a document (a checklist or other structured list) setting out the minimum requirements for a research publication that allows the study to be replicated, is useful for clinical and other professional decision-making, and contains generalizable data. The primary guidelines (CONSORT,5 STROBE,6 STARD,7 TRIPOD-AI,8 PRISMA,9 CARE10) provide checklists along with detailed explanations and examples, produced through the collaborative efforts of researchers, methodologists, editors, and medical writers. The checklists of all these guidelines have been translated into Russian, and some (STROBE6 [13], STARD7 [14], TRIPOD8 [15], CARE10 [16]) have been fully translated and published in our journal. In the near future, we must ensure that Russian translations of the primary guidelines are fully synchronized with the originals and regularly updated; ideally, Russian experts should be involved in their development. The key challenge, however, is to integrate reporting guidelines into the editorial workflow rather than merely translating them.

To date, several approaches to implementing reporting guidelines by scientific journals have naturally emerged:

  • Declaring support for the EQUATOR Network4 initiative (“We hereby declare…” or similar statements);
  • Recommending or endorsing applicable reporting guidelines during manuscript preparation (“authors are recommended…”, “authors should…”, etc.);
  • Enforcing compliance with reporting guidelines (by efforts of editors, reviewers, etc.).

The efficacy of all these approaches to implementing reporting guidelines has been studied. It is now widely recognized that merely declaring support or recommending reporting guidelines is either insufficiently effective [17–22] or completely ineffective [23]. Moreover, even if the authors declare compliance with reporting guidelines, it does not ensure their appropriate use [21]. Enforcing reporting guidelines as a mandatory condition for reviewing a manuscript also has no considerable positive effect on the completeness of the reported data [24]. The reason is that authors incorrectly complete checklist items or leave them blank (see examples for various guidelines in [25–28]); furthermore, some authors (approximately 10%) refuse to comply with the journal’s requirements [24]. Thus, the only viable option to improve content quality is to enforce reporting guidelines through the editorial team's efforts. Large journals employ reporting editors for this purpose [25]. Randomized trials support the advantages of these changes in the editorial workflow over conventional reviewing [29, 30]. Unfortunately, success is difficult to achieve when reporting guidelines are enforced by editors [24, 31] or reviewers who are not adequately trained. Reviewers frequently fail to remind authors that they must comply with reporting guidelines [32]. Furthermore, encouraging reviewers to follow reporting guidelines has little impact on the completeness of data in submitted manuscripts [33]. This is most likely why the quality of research reports published in scientific journals differs only marginally from that of preprints [34]. This indicates that the organic growth model fails to improve the quality of a scientific journal (including ours) when a standard and universally applicable peer-review system and editorial workflow are maintained.

BARRIERS TO IMPLEMENTING REPORTING GUIDELINES

Several factors limit the implementation of reporting guidelines in the editorial workflow. These include the characteristics of guidelines (numerous guidelines even within a single subject area), their limited applicability (individual guidelines do not provide a comprehensive set of necessary reporting items and sometimes contain unnecessary items) [35], and the difficulties of mass, timely translation into national languages. Furthermore, the characteristics of process owners (editors, authors, and reviewers) must be considered. Let us discuss the latter in more detail. In this context, there are two major barriers: low awareness of the existence and importance of guidelines, and low compliance by editors, authors, and reviewers who are already familiar with the guidelines [36, 37].

One would expect modern medical journal editors to be aware of the EQUATOR Network.4 However, just as 10 years ago [38], approximately 40% of medical journal editors have never heard of this initiative [37]. Furthermore, editors rarely prioritize reporting guidelines for inclusion in professional (editorial) educational programs [39], despite the fact that familiarity with reporting guidelines has long been recognized as a key competency of biomedical journal editors [40]. Notably, only approximately a third (29%) of medical journals in a representative sample [41], and approximately every second journal (46%) in a sample of top-rated medical journals [42], encourage reviewers to use reporting guidelines; even the latter merely provide general instructions [42]. Specialty reporting guidelines, such as those for artificial intelligence research, are almost never included in editorial instructions and policies [43, 44].

Another major concern is that authors are only half as likely as editors to be aware of reporting guidelines [38]. Importantly, every third (35%) author learns about reporting guidelines from the journal itself (editorial articles, author guidelines, editorial policy), and approximately 30% learn from colleagues or educational programs [45]. However, even authors who are aware of reporting guidelines (every fourth, according to some research [46]) may struggle to select the relevant ones. Fortunately, the EQUATOR Network4 website provides a search engine for selecting guidelines by research type, research area, and manuscript section/element, as well as a keyword search. In Digital Diagnostics, the use of reporting guidelines will be supported through feedback: within one day of submitting a manuscript, authors will be informed about the applicable guidelines and the availability of a Russian version. Reviewers will receive similar recommendations, along with an option to engage in peer review. The latter is essential for reviewers who may eventually become authors of our journal. Scientific journals rarely include this information in reviewer guidelines [42], and reviewers are typically not expected to assess manuscripts for compliance with reporting guidelines [47].

However, informing those involved in publishing is not the greatest challenge. The primary issue is poor adherence to reporting guidelines when preparing reports on planned and completed studies. As previously stated, this issue can be addressed by enforcing reporting guidelines through the efforts of qualified editors. Authors follow reporting guidelines best when the journal's editorial team enforces them; in contrast, when the editorial team fails to do so, approximately half of informed authors refuse to use them (as illustrated by the STROBE6 guidelines) [45]. Another issue is that editors frequently fail to comply with the rules already adopted by the journal [24]. Only half of editors require authors to review their manuscripts against the applicable reporting guidelines, even if the manuscript meets the journal's criteria for relevance and originality [24]. The main reasons editors fail to enforce reporting guidelines are lack of time [37] and the fear of losing authors to journals with a simpler (if not downright primitive) manuscript submission policy [36, 48], which in turn reflects the fear of losing out to competitors in the subject area [49].

One way to overcome these fears is to introduce reporting guidelines simultaneously in all journals in a given subject area. This appears unachievable in the highly competitive field of academic publishing; however, there have been cases where reporting guidelines were implemented simultaneously in several journals in the same subject area (up to 28 journals at once) [50]. Furthermore, editors should understand that the use of reporting guidelines signals high-quality content. One-third (36%) of all medical journals in the Scopus database incorporate reporting guidelines in their author guidelines [51]. In comparison, this figure was nearly twice as high (61%) for top-rated journals over the same period (2017) and had increased to 73% by 2022 [52]. Comparably high support levels have been reported for medical journals of 11 top-rated publishers [32]. Notably, international radiology journals also show relatively strong support for reporting guidelines (61.5%) [43]; however, the majority merely declare support or endorsement [43]. This is despite the fact that Radiology, the leading journal in this subject area, mandated the use of reporting guideline checklists as early as 2016 [53].

Therefore, it is the journal's responsibility to inform authors, promote and monitor compliance (through reporting guideline selection algorithms, adapted explanatory texts for the most frequently used guidelines, and active feedback channels for authors and reviewers), and track changes by incorporating specialized journal indicators into the editorial workflow. However, there is reason to believe that enforcing reporting guidelines at the manuscript submission stage may be too late (and ineffective). Thus, it is advisable to apply reporting guidelines as early as the study planning stage [54], or at least during preparation of the draft manuscript [55].

BENEFITS OF REPORTING GUIDELINES (FOR AUTHORS AND BEYOND)

Implementing reporting guidelines into the editorial workflow ensures that essential information is provided, thus improving the quality of manuscripts. As previously stated, this essential information is necessary to assess research quality, replicate the findings, use published data in real-world practice, or generalize them (when necessary). Reporting guidelines are designed specifically for this purpose and will surely be effective if all parties involved adhere to them [29, 30].

Furthermore, reporting guidelines improve the readability and structure of publications, making them more useful and understandable [56].

Faster peer review [57] is more than just a benefit; it is a long-held dream of both authors and scientific journal editors. Accelerating peer review will also reduce the workload for everyone involved in the editorial workflow. This is only attainable through compliance with reporting guidelines, at least by authors and editors; moreover, authors must follow the guidelines from the study planning stage onward.

Enforcement of reporting guidelines increases citation rates (on average by 43%) [58]. Citation rates of articles prepared according to quality standards may be further improved by actively promoting the citation of high-quality publications. Unfortunately, citation rates are currently driven by factors other than final publication quality (data completeness) [59, 60] and actual content (research and study report quality), as evidenced by numerous studies on citation patterns.

The expected benefit is that manuscripts prepared according to reporting guidelines are more likely to be accepted for publication [61, 62]. Furthermore, compliance with reporting guidelines from the EQUATOR Network4 may accelerate study report preparation, facilitate data extraction for systematic reviews, and improve the reproducibility and clinical relevance of research.

The expected benefit for a journal is a seamless inclusion into bibliographic databases, which is necessary to improve its international presence. Unfortunately, in 2024, Digital Diagnostics was not included in PubMed Central11 due to the low content quality. We must overcome this barrier within 1–2 years together with you, our authors and reviewers, if we wish to be something more than a regional journal with a narrow professional audience. We, as editors, believe we can do better.

Furthermore, reporting guidelines may help to improve instructions for use of commercial products proposed for clinical practice, which currently contain less than half of the information necessary for reliability, transparency, and objectivity assessment (as evidenced by machine learning-based predictive models) [35]. Given the potential negative impact of low-quality clinical study reports on treatment outcomes [63, 64], researchers have an ethical obligation to ensure that they are sufficiently complete. Therefore, compliance with reporting guidelines is essential for the core purpose (responsibility) of medical science (i.e., patient health), regardless of the impact on citation rates, reader interest, and other secondary efficacy indicators.

OTHER URGENT EDITORIAL INNOVATIONS

As previously stated, the completeness of the information provided (background, methods, and results) is not the sole factor determining the quality of a study report (manuscript). It is equally crucial to provide data ensuring the transparency of the ethical aspects of the research. There are four groups of ethical considerations that facilitate scientific integrity and foster trust in science in general: authorship, human participants and animal subjects in research, research transparency (conflicts of interest, originality of submitted materials, disclosure of the use of generative artificial intelligence in report preparation, open peer review), and data availability. A comprehensive discussion of these issues is beyond the scope of this editorial. As editors, we expect most of these issues to be considered and resolved by ethics committees [65]; in practice, however, we witness the opposite. Therefore, we have decided to expand the information to be included in the Additional Information section of each manuscript (with some items only applicable to specific article types):

  • Author contributions.
  • Ethics approval.
  • Consent for publication.12
  • Funding sources.
  • Disclosure of interests.
  • Statement of originality.
  • Data availability statement.
  • Generative AI.
  • Provenance and peer review.

Detailed explanations with examples of author statements can be found in the Author Guidelines. Providing relevant information will become mandatory for manuscript acceptance immediately after this editorial article has been published.

CONCLUSION

In the last decade, requirements for manuscripts submitted to scientific journals have changed considerably all over the world. This primarily refers to the implementation of reporting guidelines that are recognized and supported by top-rated journals. This is today's reality. In this article, we attempted to demonstrate that reporting guidelines, when applied correctly, are a “life vest” rather than a “straitjacket” for authors. We encourage authors submitting manuscripts to Digital Diagnostics and other Russian medical journals to follow the EQUATOR Network's reporting guidelines. This will undoubtedly improve the quality of research and facilitate publication in top-rated Russian and international journals.

ADDITIONAL INFORMATION

Author contributions: R.T. Saygitov: conceptualization, formal analysis, writing—original draft; V.E. Sinitsyn: conceptualization, writing—review & editing. All the authors approved the version of the manuscript to be published and agreed to be accountable for all aspects of the work, ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Funding sources: No funding.

Disclosure of interests: R.T. Saygitov is a scientific editor at Digital Diagnostics and Consortium Psychiatricum journals, and a scientific advisor at the publishing house Pediatr (Russia). V.E. Sinitsyn is the Editor-in-Chief at Digital Diagnostics.

Statement of originality: No previously published material (text, images, or data) was used in this work.

Data availability statement: The editorial policy regarding data sharing does not apply to this work.

Generative AI: No generative artificial intelligence technologies were used to prepare this article.

Provenance and peer review: This paper was commissioned by the journal’s Editorial Board and underwent prioritized internal peer review.

 

1 Hereinafter: adapted from eLIBRARY.RU. Accessed on: February 17, 2025.

2 SCIENCE INDEX is an information analysis system based on the Russian Science Citation Index. It is designed to provide a comprehensive analysis and statistical assessment of the publication activities of Russian researchers and scientific organizations.

3 WMA Declaration of Helsinki — Ethical Principles for Medical Research Involving Human Participants; [approximately 5 pages]. In: World Medical Association [Internet]. Ferney-Voltaire: World Medical Association, 2024–2025. Available at: https://www.wma.net/policies-post/wma-declaration-of-helsinki/ Accessed on: May 6, 2025.

4 Enhancing the QUAlity and Transparency Of health Research Network [Internet]. Oxford: Centre for Statistics in Medicine; 2006–. Available at: www.equator-network.org. Accessed on: May 6, 2025.

5 CONsolidated Standards Of Reporting Trials (CONSORT).

6 Strengthening the Reporting of Observational Studies in Epidemiology (STROBE).

7 Standards for Reporting Diagnostic Accuracy Studies (STARD).

8 Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis — Artificial Intelligence (TRIPOD-AI).

9 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA).

10 Case Report Guidelines (CARE).

11 PubMed Central is an open-access database of full-text biomedical publications maintained by the United States National Library of Medicine.

12 For case reports only.


About the authors

Ruslan T. Saygitov

Eco-Vector IP

Author for correspondence.
Email: saygitov@yandex.ru
ORCID iD: 0000-0002-8915-6153
SPIN-code: 8641-2334

MD, Dr. Sci. (Medicine)

Russian Federation, Moscow

Valentin E. Sinitsyn

Moscow State University named after M.V. Lomonosov

Email: vsini@mail.ru
ORCID iD: 0000-0002-5649-2193
SPIN-code: 8449-6590

MD, Dr. Sci. (Medicine), Professor

Russian Federation, Moscow

References

  1. Leonov VP. Application of statistical methods in cardiology (based on materials from the Journal “Cardiology” for 1993–1995). Kardiologiia. 1998;38(1):55–58. (In Russ.)
  2. Rebrova OYu. Trend in the quality of presenting the results of statistical analysis in the original papers in this journal in 1999 to 2006. Problems of Endocrinology. 2007;53(5):31–33. doi: 10.14341/probl200753531-33 EDN: ZTFHFB
  3. Vlassov VV. Is content of medical journals related to advertisements? Case-control study. Croatian Medical Journal. 2007;48(6):786–790. doi: 10.3325/cmj.2007.6.786 EDN: LKKGDT
  4. Musatov MI. Statistical methods quality and evaluation of results: a study of publications in Russian “Immunology” and Journal of Immunology. Problemy standartizacii v zdravoohranenii. 2009;(2):30–34. EDN: KXFPZV
  5. Dombrovskiy VS, Rakina EA, Rebrova OYu. Assessment of the methodological quality of randomized controlled trials published in “Russian Allergology Journal” in 2009–2013 (Part 2). Russian Allergology Journal. 2014;(4):28–34. EDN: SMGWAL
  6. Golenkov AV, Kuznetsova-Moreva EA, Mendelevich VD, et al. The quality of research publications in psychiatry. Zhurnal nevrologii i psikhiatrii im. S.S. Korsakova. 2017;117(11):108–113. doi: 10.17116/jnevro2017117111108-113 EDN: XHXQBO
  7. Chekhovich YuV, Khazov AV. Analysis of duplicated publications in Russian journals. Journal of Informetrics. 2022;16(1):101246. doi: 10.1016/j.joi.2021.101246
  8. Talantov P, Niyazov R, Viryasova G, et al. Unapproved clinical trials in Russia: exception or norm? BMC Medical Ethics. 2021;22(1):46. doi: 10.1186/s12910-021-00617-3 EDN: CEFXTQ
  9. International ethical guidelines for health-related research involving humans. Geneva: The Council for International Organizations of Medical Sciences (CIOMS); 2016. ISBN: 978-92-9036-088-9 doi: 10.56759/rgxl7405
  10. Aleksic J, Alexa A, Attwood TK, et al; as part of the AllBio: Open Science & Reproducibility Best Practice Workshop. An open science peer review oath. F1000Research. 2015;3:271. doi: 10.12688/f1000research.5686.2
  11. Jin Y, Sanger N, Shams I, et al. Does the medical literature remain inadequately described despite having reporting guidelines for 21 years? — A systematic review of reviews: an update. Journal of Multidisciplinary Healthcare. 2018;11:495–510. doi: 10.2147/JMDH.S155103
  12. Glasziou P, Altman DG, Bossuyt P, et al. Reducing waste from incomplete or unusable reports of biomedical research. The Lancet. 2014;383(9913):267–276. doi: 10.1016/S0140-6736(13)62228-X
  13. Vandenbroucke JP, Von Elm E, Altman DG, et al. Strengthening the reporting of observational studies in epidemiology (STROBE): explanation and elaboration. Translation into Russian. Digital Diagnostics. 2021;2(2):119–169. doi: 10.17816/DD70821 EDN: FKQJKL
  14. Cohen JF, Korevaar DA, Altman DG, et al. STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration. Translation to Russian. Digital Diagnostics. 2021;2(3):313–342. doi: 10.17816/DD71031 EDN: OCMMPU
  15. Moons KGM, Altman DG, Reitsma JB, et al. Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD): explanation and elaboration. Translation into Russian. Digital Diagnostics. 2022;3(3):232–322. doi: 10.17816/DD110794 EDN: VPRPPQ
  16. Barber MS, Aronson JK, Von Schoen-Angerer T, et al. CARE guidelines for case reports: explanation and elaboration document. Translation into Russian. Digital Diagnostics. 2022;3(1):16–42. doi: 10.17816/DD105291 EDN: WHTQFL
  17. Stevens A, Shamseer L, Weinstein E, et al. Relation of completeness of reporting of health research to journals' endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804. doi: 10.1136/bmj.g3804
  18. Struthers C, Harwood J, de Beyer JA, et al. There is no reliable evidence that providing authors with customized article templates including items from reporting guidelines improves completeness of reporting: the GoodReports randomized trial (GRReaT). BMC Medical Research Methodology. 2025;25(1):71. doi: 10.1186/s12874-025-02518-0 EDN: OGWYOH
  19. Page MJ, Moher D. Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement and extensions: a scoping review. Systematic Reviews. 2017;6(1):1–14. doi: 10.1186/s13643-017-0663-8 EDN: IFOTOO
  20. Leung V, Rousseau-Blass F, Beauchamp G, Pang DSJ. ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia. PLOS ONE. 2018;13(5):e0197882. doi: 10.1371/journal.pone.0197882
  21. Innocenti T, Salvioli S, Giagio S, et al. Declaration of use and appropriate use of reporting guidelines in high-impact rehabilitation journals is limited: a meta-research study. Journal of Clinical Epidemiology. 2021;131:43–50. doi: 10.1016/j.jclinepi.2020.11.010 EDN: RCYGJA
  22. Kumar S, Mohammad H, Vora H, Kar K. Reporting Quality of Randomized Controlled Trials of Periodontal Diseases in Journal Abstracts—A Cross-sectional Survey and Bibliometric Analysis. Journal of Evidence Based Dental Practice. 2018;18(2):130–141.e22. doi: 10.1016/j.jebdp.2017.08.005
  23. Turner L, Shamseer L, Altman DG, et al. Consolidated standards of reporting trials (CONSORT) and the completeness of reporting of randomised controlled trials (RCTs) published in medical journals. Cochrane Database of Systematic Reviews. 2012;2013(1):MR000030. doi: 10.1002/14651858.MR000030.pub2
  24. Hair K, Macleod MR, Sena ES; on behalf of the IICARus Collaboration. A randomised controlled trial of an Intervention to Improve Compliance with the ARRIVE guidelines (IICARus). Research Integrity and Peer Review. 2019;4(1):12. doi: 10.1186/s41073-019-0069-3 EDN: SKWVBB
  25. Qureshi R, Gough A, Loudon K. The SPIRIT Checklist—lessons from the experience of SPIRIT protocol editors. Trials. 2022;23(1):359. doi: 10.1186/s13063-022-06316-7 EDN: LLYYBY
  26. Blanco D, Biggane AM, Cobo E; MiRoR network. Are CONSORT checklists submitted by authors adequately reflecting what information is actually reported in published papers? Trials. 2018;19(1):1–4. doi: 10.1186/s13063-018-2475-0 EDN: NZKYYF
  27. Agha RA, Fowler AJ, Limb C, et al. Impact of the mandatory implementation of reporting guidelines on reporting quality in a surgical journal: a before and after study. International Journal of Surgery. 2016;30:169–172. doi: 10.1016/j.ijsu.2016.04.032
  28. Stahl AC, Tietz AS, Dewey M, Kendziora B. Has the quality of reporting improved since it became mandatory to use the Standards for Reporting Diagnostic Accuracy? Insights into Imaging. 2023;14(1):85. doi: 10.1186/s13244-023-01432-7 EDN: GNTEET
  29. Cobo E, Cortes J, Ribera JM, et al. Effect of using reporting guidelines during peer review on quality of final manuscripts submitted to a biomedical journal: masked randomised trial. BMJ. 2011;343:d6783. doi: 10.1136/bmj.d6783
  30. Blanco D, Schroter S, Aldcroft A, et al. Effect of an editorial intervention to improve the completeness of reporting of randomised trials: a randomised controlled trial. BMJ Open. 2020;10(5):e036799. doi: 10.1136/bmjopen-2020-036799 EDN: QLYBIW
  31. Cobo E, Selva-O'Callagham A, Ribera JM, et al. Statistical reviewers improve reporting in biomedical articles: a randomized trial. PLoS ONE. 2007;2(3):e332. doi: 10.1371/journal.pone.0000332
  32. Wang P, Wolfram D, Gilbert E. Endorsements of five reporting guidelines for biomedical research by journals of prominent publishers. PLOS ONE. 2024;19(2):e0299806. doi: 10.1371/journal.pone.0299806 EDN: BQOFAY
  33. Speich B, Mann E, Schönenberger CM, et al. Reminding peer reviewers of reporting guideline items to improve completeness in published articles. JAMA Network Open. 2023;6(6):e2317651. doi: 10.1001/jamanetworkopen.2023.17651 EDN: EUOCKZ
  34. Carneiro CFD, Queiroz VGS, Moulin TC, et al. Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature. Research Integrity and Peer Review. 2020;5(1):1–19. doi: 10.1186/s41073-020-00101-3 EDN: FECPOG
  35. Lu JH, Callahan A, Patel BS, et al. Assessment of adherence to reporting guidelines by commonly used clinical prediction models from a single vendor. JAMA Network Open. 2022;5(8):e2227779. doi: 10.1001/jamanetworkopen.2022.27779 EDN: PJDCML
  36. Heus P. Maximizing research value: adequate reporting and effective (de-)implementation strategies. 2020. ISBN: 978-94-6375-925-0. Available from: https://dspace.library.uu.nl/bitstream/handle/1874/397221/5ee2b2d9a461d.pdf
  37. Innocenti T, Ostelo R, Verhagen A, et al. Rehabilitation journal editors recognize the need for interventions targeted to improve the completeness of reporting, but there is heterogeneity in terms of strategies actually adopted: a cross-sectional web-based survey. Journal of Evidence-Based Medicine. 2023;16(2):111–115. doi: 10.1111/jebm.12527 EDN: ZQPAPY
  38. Fuller T, Pearson M, Peters J, Anderson R. What affects authors’ and editors’ use of reporting guidelines? Findings from an online survey and qualitative interviews. PLOS ONE. 2015;10(4):e0121585. doi: 10.1371/journal.pone.0121585
  39. Galipeau J, Cobey KD, Barbour V, et al. An international survey and modified Delphi process revealed editors’ perceptions, training needs, and ratings of competency-related statements for the development of core competencies for scientific editors of biomedical journals. F1000Research. 2017;6:1634. doi: 10.12688/f1000research.12400.1
  40. Moher D, Galipeau J, Alam S, et al. Core competencies for scientific editors of biomedical journals: consensus statement. BMC Medicine. 2017;15(1):1–10. doi: 10.1186/s12916-017-0927-0 EDN: UYBDTQ
  41. Chauvin A, Ravaud P, Baron G, et al. The most important tasks for peer reviewers evaluating a randomized controlled trial are not congruent with the tasks most often requested by journal editors. BMC Medicine. 2015;13(1):1–10. doi: 10.1186/s12916-015-0395-3 EDN: JFJLIW
  42. Hirst A, Altman DG. Are peer reviewers encouraged to use reporting guidelines? A Survey of 116 health research journals. PLoS ONE. 2012;7(4):e35621. doi: 10.1371/journal.pone.0035621
  43. Zhong J, Xing Y, Lu J, et al. The endorsement of general and artificial intelligence reporting guidelines in radiological journals: a meta-research study. BMC Medical Research Methodology. 2023;23(1):292. doi: 10.1186/s12874-023-02117-x EDN: NWIQBP
  44. Koçak B, Keleş A, Köse F. Meta-research on reporting guidelines for artificial intelligence: are authors and reviewers encouraged enough in radiology, nuclear medicine, and medical imaging journals? Diagnostic and Interventional Radiology. 2024;30(5):291–298. doi: 10.4274/dir.2024.232604
  45. Sharp MK, Bertizzolo L, Rius R, et al. Using the STROBE statement: survey findings emphasized the role of journals in enforcing reporting guidelines. Journal of Clinical Epidemiology. 2019;116:26–35. doi: 10.1016/j.jclinepi.2019.07.019
  46. Shanahan DR, Lopes de Sousa I, Marshall DM. Simple decision-tree tool to facilitate author identification of reporting guidelines during submission: a before–after study. Research Integrity and Peer Review. 2017;2(1):1–6. doi: 10.1186/s41073-017-0044-9 EDN: IZCRWE
  47. Glonti K, Cauchi D, Cobo E, et al. A scoping review on the roles and tasks of peer reviewers in the manuscript review process in biomedical journals. BMC Medicine. 2019;17(1):1–14. doi: 10.1186/s12916-019-1347-0 EDN: ACKUKY
  48. Grindlay DJC, Dean RS, Christopher MM, Brennan ML. A survey of the awareness, knowledge, policies and views of veterinary journal Editors-in-Chief on reporting guidelines for publication of research. BMC Veterinary Research. 2014;10(1):1–10. doi: 10.1186/1746-6148-10-10 EDN: ZVKVCJ
  49. Wager E, Williams P. “Hardly worth the effort”? Medical journals' policies and their editors' and publishers' views on trial registration and publication bias: quantitative and qualitative study. BMJ. 2013;347:f5248. doi: 10.1136/bmj.f5248
  50. Chan L, Heinemann AW, Roberts J. Elevating the quality of disability and rehabilitation research: mandatory use of the reporting guidelines. Annals of Physical and Rehabilitation Medicine. 2014;57(9-10):558–560. doi: 10.1016/j.rehab.2014.09.011
  51. Malički M, Aalbersberg IJJ, Bouter L, ter Riet G. Journals’ instructions to authors: a cross-sectional study across scientific disciplines. PLOS ONE. 2019;14(9):e0222157. doi: 10.1371/journal.pone.0222157 EDN: BTPZBM
  52. Heus P, Idema DL, Kruithof E, et al. Increased endorsement of TRIPOD and other reporting guidelines by high impact factor journals: survey of instructions to authors. Journal of Clinical Epidemiology. 2024;165:111188. doi: 10.1016/j.jclinepi.2023.10.004 EDN: ZABVLJ
  53. Levine D, Kressel HY. Radiology 2016: the care and scientific rigor used to process and evaluate original research manuscripts for publication. Radiology. 2016;278(1):6–10. doi: 10.1148/radiol.2015152256
  54. Vogt L, Reichlin TS, Nathues C, Würbel H. Authorization of animal experiments is based on confidence rather than evidence of scientific rigor. PLOS Biology. 2016;14(12):e2000598. doi: 10.1371/journal.pbio.2000598
  55. Dewey M, Levine D, Bossuyt PM, Kressel HY. Impact and perceived value of journal reporting guidelines among Radiology authors and reviewers. European Radiology. 2019;29(8):3986–3995. doi: 10.1007/s00330-018-5980-3 EDN: DODDKY
  56. Hartley J. Current findings from research on structured abstracts: an update. Journal of the Medical Library Association: JMLA. 2014;102(3):146–148. doi: 10.3163/1536-5050.102.3.002
  57. El Emam K, Leung TI, Malin B, et al. Consolidated reporting guidelines for prognostic and diagnostic machine learning models (CREMLS). Journal of Medical Internet Research. 2024;26:e52508. doi: 10.2196/52508 EDN: BVOUQD
  58. Vilaró M, Cortés J, Selva-O’Callaghan A, et al. Adherence to reporting guidelines increases the number of citations: the argument for including a methodologist in the editorial process and peer-review. BMC Medical Research Methodology. 2019;19(1):1–7. doi: 10.1186/s12874-019-0746-4 EDN: RIQDUM
  59. Choi YJ, Chung MS, Koo HJ, et al. Does the reporting quality of diagnostic test accuracy studies, as defined by STARD 2015, affect citation? Korean Journal of Radiology. 2016;17(5):706–714. doi: 10.3348/kjr.2016.17.5.706
  60. Dilauro M, McInnes MDF, Korevaar DA, et al. Is There an association between STARD Statement adherence and citation rate? Radiology. 2016;280(1):62–67. doi: 10.1148/radiol.2016151384
  61. Botos J. Reported use of reporting guidelines among JNCI: Journal of the National Cancer Institute authors, editorial outcomes, and reviewer ratings related to adherence to guidelines and clarity of presentation. Research Integrity and Peer Review. 2018;3(1):7. doi: 10.1186/s41073-018-0052-4 EDN: ZORMUW
  62. Stevanovic A, Schmitz S, Rossaint R, et al. CONSORT Item Reporting Quality in the Top Ten Ranked Journals of Critical Care Medicine in 2011: a retrospective analysis. PLOS ONE. 2015;10(5):e0128061. doi: 10.1371/journal.pone.0128061
  63. Duff JM, Leather H, Walden EO, et al. Adequacy of published oncology randomized controlled trials to provide therapeutic details needed for clinical application. JNCI: Journal of the National Cancer Institute. 2010;102(10):702–705. doi: 10.1093/jnci/djq117 EDN: NYTZUN
  64. Dancey JE. From Quality of Publication to Quality of Care: Translating Trials to Practice. JNCI: Journal of the National Cancer Institute. 2010;102(10):670–671. doi: 10.1093/jnci/djq142
  65. Pchelintseva OI, Omelyanskaya OV. Features of conducting ethical review of research on artificial intelligence systems on the basis of the research and practical clinical center for diagnostics and telemedicine technologies of the Moscow Health Care Department, Moscow, Russian Federation. Digital Diagnostics. 2022;3(2):156–161. doi: 10.17816/DD107983 EDN: GHDTJX

Supplementary files
1. JATS XML

Copyright (c) 2025 Eco-Vector

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

The media outlet is registered by the Federal Service for Supervision of Communications, Information Technology and Mass Media (Roskomnadzor).
Registration number and date of the decision on media registration: series ПИ № ФС 77 - 79539, dated November 9, 2020.