There is much to celebrate in recent developments in scholarly communication and research assessment. The scholarly communication landscape features many promising initiatives, including the growing use of preprint servers, open access repositories, and open peer review platforms, as well as increasing interest in more equitable publishing models such as diamond open access and Subscribe to Open. Likewise, the research assessment landscape is transitioning toward more responsible approaches. A growing number of organizations evaluate researchers and research units on a broad range of outputs and activities rather than focusing primarily on articles in a small number of selective journals, and evaluation increasingly relies on a healthy mix of expert judgment and data-driven analytics rather than being almost exclusively metrics-driven.
Innovations in scholarly communication and research assessment need to go hand in hand. Successful innovation in scholarly communication is possible only if research assessment incentivizes the uptake of better alternatives that ultimately improve research quality and practice. One of the many elements required to support such a shift in evaluation practices is that evaluators have access to high-quality data on all relevant research outputs. Bibliographic databases, which aim to provide structured metadata on the research outputs of researchers, institutions, countries, and disciplines, play a crucial role here. Unfortunately, recent developments around the indexing of eLife in the Web of Science and Scopus databases show that these databases too often hinder innovation rather than support it.
Indexing of eLife in Web of Science and Scopus
eLife, a prominent journal in the life sciences, has championed an innovative approach to publishing and peer review that focuses on what an article says and what others say about it, rather than treating labels such as “published” or “peer-reviewed” as synonyms for “vetted” or “high-quality.” However, rather than supporting such innovation, Web of Science and Scopus are hindering it by discontinuing the indexing of articles peer reviewed by eLife in their core databases.
Web of Science will index eLife articles in its Emerging Sources Citation Index, but only a subset of them. Moreover, the Emerging Sources Citation Index is often seen as less prestigious than the Science Citation Index Expanded, in which eLife was previously indexed, and is therefore often excluded from research analytics. Likewise, Scopus will index eLife articles in its preprints database, which is not considered in the research analytics Scopus provides. These decisions make it challenging for research evaluators relying on Web of Science or Scopus data to consider eLife articles in their evaluations. We see these decisions as a huge missed opportunity for Web of Science and Scopus to show the role they can play in supporting innovation in scholarly communication and research assessment. Instead, these databases have chosen to protect and sustain a traditional model of scholarly communication and research assessment that is rapidly losing its relevance as a result of ongoing global transitions in publishing and assessment practices.
Bibliographic Databases and the Need for Experimentation
Researchers might expect bibliographic databases to faithfully catalog the scholarly corpus as it is published, but in reality many databases play an active role in choosing what to index and what to exclude. Some databases, notably Web of Science and Scopus, have built their brands on being highly selective, using bespoke (and often opaque) selection processes to determine what research “counts” and is included. For a journal, inclusion means discoverability in the database. It can also mean the provision of metrics (e.g., the Journal Impact Factor), which remain markers of prestige in some communities, despite the concerns and criticisms that are gradually reducing their influence. Other parts of the scholarly communication ecosystem have latched onto this signaling value, using the presence of a journal in a database as a cue to the legitimacy of the research it publishes. For these reasons, many journals are highly motivated to be indexed in major databases. But whether these databases, as they currently function, provide net positive value to the research community remains questionable.
As individuals working in various ways to improve the research ecosystem, we are concerned that bibliographic databases such as Web of Science and Scopus are perpetuating a system that stymies innovation in scholarly communication and research assessment. Among other solutions, we advocate for experimentation that allows an evidence-based understanding of the pros and cons of new approaches to scholarly communication to develop. eLife’s detailed expert evaluations and rich editorial commentary are an example of this type of experimentation. However, meaningful experimentation is hard to perform if bibliographic databases dissuade researchers and other stakeholders from participating in experiments because their outputs would no longer be visible and recognized in assessments that rely on these databases.
Criteria for Innovation-Friendly Bibliographic Databases
We urge the global research community to throw its weight behind bibliographic databases that support, rather than hinder, innovation in scholarly communication and research assessment. In the box below, we present four criteria for innovation-friendly bibliographic databases. These criteria complement principles for bibliographic databases developed in other contexts, such as principles for openness that are currently being discussed in the context of the Barcelona Declaration on Open Research Information.
Criteria for Innovation-Friendly Bibliographic Databases
Mainstream scholarly communication practices suffer from major problems, including slow dissemination of research outcomes, lack of openness and transparency, an overburdened peer review system, and a research integrity crisis. Innovation and experimentation are essential to improve scholarly communication practices. Bibliographic databases should support and facilitate this, not hinder it, whether directly or indirectly. Innovation-friendly bibliographic databases meet the following criteria:
- Recognize the broad variety of use cases for bibliographic databases and the need to offer flexibility to address different use cases in appropriate ways. Popular databases like Web of Science and Scopus have a narrow perspective on which journals deserve to be indexed. Although this might be appropriate for some use cases, it is deeply problematic for many others. Research isn’t one-size-fits-all, and bibliographic databases shouldn’t be either.
- Accommodate diverse approaches to scholarly communication. Indexing only articles in peer-reviewed journals provides an incomplete and biased picture of the scholarly record. Scholarly communication is happening on a variety of platforms, including preprint servers, peer review platforms, data repositories, and more. Bibliographic databases need to accommodate these important platforms.
- Acknowledge the importance of open peer review reports and other markers of trust. Open peer review reports and other trust markers play an increasingly important role in helping consumers of scholarly outputs assess the quality and trustworthiness of these outputs. Bibliographic databases need to index these reports and trust markers.
- Ensure involvement of the research community in the governance of bibliographic databases. Operating a bibliographic database that serves the diverse needs of the research community is a significant challenge. To address this challenge, the research community must be meaningfully involved in key decisions in the design and operation of a bibliographic database. Community governance builds trust.
Web of Science and Scopus perform relatively poorly on our criteria for innovation-friendly bibliographic databases. Although the perfect database doesn’t exist, there are important examples of databases that are more supportive of innovation in scholarly communication and research assessment. For instance, national bibliographic databases, often based on current research information systems (CRIS), offer broad coverage of journals and other publication channels, and are governed by research community stakeholders, more in line with our criteria 1 and 4. Unlike Web of Science and Scopus, Dimensions and OpenAlex are global databases that aim to have a comprehensive rather than selective scope (and that therefore offer full indexing of eLife articles), in the spirit of our criteria 1 and 2. Likewise, in the biomedical domain, Europe PMC has made a significant effort to index preprints and open peer review reports, in line with our criteria 2 and 3.
We call on users of bibliographic databases to evaluate such databases against the criteria we outline here, deeply considering whether a given database is helpful to the research enterprise before continuing to use it. We call on research evaluators to employ bibliographic databases that support rather than hinder innovation and experimentation in scholarly communication and the associated reforms in research assessment, as promoted by initiatives like CoARA and DORA. And we call on bibliographic databases to engage with the community to make changes that bring them closer to the criteria we have outlined here. The future success of a bibliographic database will depend on its ability to work with the community to support innovation in scholarly communication.
Additional Recommended Reading
- Statement by DORA: https://sfdora.org/2024/11/25/clarivates-actions-regarding-elife-doras-response/
- Statement by Bodo Stern (HHMI): https://www.coalition-s.org/blog/how-the-web-of-science-takes-a-step-back/
- News coverage by Science:
- News coverage by Nature:
- Updates from eLife:
- https://elifesciences.org/inside-elife/16afe6ec/update-on-elife-s-indexing-status-at-web-of-science
- https://elifesciences.org/inside-elife/c11c6101/the-elife-model-an-update-on-progress-following-changes-in-web-of-science-indexing-status
- https://elifesciences.org/inside-elife/ae620829/changes-to-elife-s-indexing-status-in-web-of-science-and-scopus
Note
The authors of this piece lead and contribute to various reform initiatives in scholarly communication and research assessment, including the Coalition for Advancing Research Assessment (CoARA), the Declaration on Research Assessment (DORA), the Higher Education Leadership Initiative for Open Scholarship (HELIOS Open), cOAlition S, the Barcelona Declaration on Open Research Information, the Helsinki Initiative on Multilingualism in Scholarly Communication, INORMS research evaluation group, and ASAPbio. The views expressed in this piece are meant to reflect our individual views and not necessarily those of the initiatives to which we contribute. Correspondence concerning the piece is welcomed in the comments here or may be directed to Katie Corker.
References
Sīle, L., Guns, R., Ivanović, D., Pölönen, J., & Engels, T. (2019). Creating and maintaining a national bibliographic database for research output: Manual of good practices. figshare. https://doi.org/10.6084/M9.FIGSHARE.9989204.V1
Levchenko, M. (2024, November 25). How we built a database of preprints. Europe PMC News Blog. https://doi.org/10.59350/6be58-c7z28
Barcelona Declaration on Open Research Information, Kramer, B., Neylon, C., & Waltman, L. (2024). Barcelona Declaration on Open Research Information. https://doi.org/10.5281/ZENODO.10958522
Copyright © 2025 Ginny Barbour, Caitlin Carter, Jonny Coates, Kelly D. Cobey, Katherine S. Corker, Elizabeth Gadd, Bianca Kramer, Rebecca Lawrence, Eva Méndez, Cameron Neylon, Janne Pölönen, Bodo Stern, Ludo Waltman. Distributed under the terms of the Creative Commons Attribution 4.0 License.