Published in Cancer Treatment and Research Communications on August 12, 2016

    Quality of systematic review and meta-analysis abstracts in oncology journals


    Purpose: The purpose of this study was to evaluate the quality of reporting in the abstracts of oncology systematic reviews using PRISMA guidelines for abstract writing.

    Methods: Oncology systematic reviews and meta-analyses from four journals - The Lancet Oncology, Clinical Cancer Research, Cancer Research, and Journal of Clinical Oncology - were selected using a PubMed search. The search identified 337 abstracts, which were screened for eligibility; the 182 eligible abstracts comprised the final sample and were coded based on a standardized abstraction manual constructed from the PRISMA criteria. Each abstract was coded independently and later verified by a second coder, with disagreements resolved by consensus.

    Results: The number of included studies, information regarding main outcomes, and general interpretation of results were described in the majority of abstracts. In contrast, risk of bias or methodological quality appraisals, the strengths and limitations of evidence, funding sources, and registration information were rarely reported. By journal, the most notable difference was a higher percentage of funding sources reported in Lancet Oncology. No detectable upward trend was observed on mean abstract scores after publication of the PRISMA extension for abstracts.

    Conclusion: Overall, the reporting of essential information in oncology systematic review and meta-analysis abstracts is suboptimal and could be greatly improved.

    Keywords: Review, Systematic; Meta-Analysis; Cancer; Medical Oncology; Abstracting as Topic; Funding


    Scanning journal abstracts allows clinicians to quickly determine the relevance of a particular article to their clinical practice (Fleming 2012). The abstract should be written clearly and in sufficient detail that clinicians can decide whether to read on if the article is in hand or to download an electronic version for further reading (Hopewell 2008). A recent study found that biomedical literature users who searched PubMed predominantly viewed abstracts after reviewing the titles returned by their searches; abstract views were more than twice as frequent as full-text views (Islamaj 2009). However, despite the importance of abstracts in conveying essential information to users of research, clear and comprehensive reporting of core study aspects remains an issue. In an effort to address concerns about the quality and clarity of abstract reporting in clinical trials, the Consolidated Standards of Reporting Trials (CONSORT) group developed a minimum set of essential information for inclusion in an abstract (Hopewell 2008). Since the CONSORT abstract extension was published in 2008, some improvement in abstract reporting has been noted, but reporting quality remains suboptimal (Can 2011).

    More recently, systematic reviews have played a growing role in decision making for clinical practice. While giving biomedical literature users access to higher-quality evidence, systematic reviews are still hampered by issues in the quality of abstract reporting (Beller 2013). This prompted the release of an extension to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement that detailed a checklist of essential items to include in a systematic review abstract.

    Since the PRISMA extension for abstracts was published in 2013, little formal evaluation of guideline adherence in medical journals has been conducted; only Kiriakou et al.'s investigation of systematic review abstracts in oral implantology has appeared to date (Kiriakou 2014). We, therefore, analyzed the extent to which systematic review authors reported this information in abstracts from a sample of leading oncology journals. We examined how well authors published in these journals adhered to the PRISMA extension guidelines for abstracts and whether this adherence had changed since the release of the extension.