SCOAP3 Open Science Elements

SCOAP3 centrally pays publishers from a common fund at CERN, to which libraries, library consortia, research institutions and funding agencies jointly contribute. To control costs, SCOAP3 and the publishers have agreed to a fixed maximum yearly payment for relevant content in each journal, commensurate with an estimated number of articles.

For the contractual period 2025-2027, a new mechanism was established to incentivise SCOAP3 publishers to enhance Open Science practices in their publishing by comparing their individual performance on pre-defined aspects with each other. The SCOAP3 partnership will assess all publishers’ performance on the Open Science Elements annually and adjust the financial compensation accordingly.

The adjustment is calculated as follows:

Adjustment = Publisher’s score / SCOAP3 score + Bonus

  • Publisher’s score:
    • The journal gets points for every element, depending on the level of fulfilment of the requirements (= the journal’s score)
    • If a publisher has more than one Journal in the SCOAP3 program, the publisher’s score is the average across its Journals’ scores, weighted by the number of SCOAP3 Articles per Journal in the last 12 months
  • SCOAP3 score is the average of the Publisher’s Scores for all publishers participating in SCOAP3 on the date the calculation is made, weighted by the number of SCOAP3 Articles per Journal in the last 12 months. 
  • Bonus: if a publisher scores 16 points or above, it gains a bonus of 5% on the adjustment.

The adjustment is capped at ±10%: without the bonus it can never fall below -10% or rise above +10%, and with the bonus it can reach at most +15%.
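As a rough illustration of how the weighted score and the adjustment are combined, the sketch below computes a publisher's score and the resulting adjustment. The reading of the score ratio as a deviation from the SCOAP3 average, the order in which the cap and the bonus are applied, and all numbers in the usage lines are assumptions for illustration only; only the ±10% cap, the 16-point bonus threshold and the 5% bonus come from the rules above.

def publisher_score(journal_scores, articles_per_journal):
    """Average of a publisher's journal scores, weighted by the number of
    SCOAP3 Articles each journal published in the last 12 months."""
    total = sum(articles_per_journal)
    return sum(s * n for s, n in zip(journal_scores, articles_per_journal)) / total

def adjustment(pub_score, scoap3_score):
    """Illustrative reading of the adjustment rule: the score ratio is taken
    as a deviation from the SCOAP3 average, capped at +/-10%, and a 5% bonus
    is added for publishers scoring 16 points or above."""
    base = pub_score / scoap3_score - 1.0      # assumption: a ratio of 1 means no adjustment
    base = max(-0.10, min(0.10, base))         # cap at -10% / +10% before the bonus
    bonus = 0.05 if pub_score >= 16 else 0.0   # 5% bonus at 16 points or above
    return base + bonus                        # at most +15% with the bonus

# Hypothetical publisher with two journals (invented numbers)
score = publisher_score(journal_scores=[20.0, 18.0], articles_per_journal=[300, 100])
print(round(score, 2))                              # 19.5
print(round(adjustment(score, 15.41), 2))           # 0.10 (capped) + 0.05 bonus = 0.15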

Evaluation process

The CERN team has analysed the metadata in the SCOAP3 repository and Crossref. In addition, publishers were required to declare specific elements, such as public peer review or community values, through online forms. Once the metadata and declarations had been reviewed, CERN conducted an evaluation and compiled the corresponding data. This evaluation was then sent to the publishers, allowing them to provide feedback and confirm the assessment.
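As an illustration of what such a metadata analysis can look like, the sketch below queries the public Crossref REST API for a single DOI and reports on a few of the elements described further down (ORCiD iDs, affiliations, license, abstract, references, funding). The field names follow the public Crossref works schema; the choice of fields and the ROR heuristic are illustrative assumptions, not the actual SCOAP3 evaluation code.

import json
import requests

CROSSREF_API = "https://api.crossref.org/works/{doi}"

def metadata_report(doi: str) -> dict:
    """Summarise which Open Science related fields a Crossref record carries
    (an illustrative check, not the SCOAP3 tooling)."""
    record = requests.get(CROSSREF_API.format(doi=doi), timeout=30).json()["message"]

    authors = record.get("author", [])
    affiliations = json.dumps([a.get("affiliation", []) for a in authors])

    return {
        "authors": len(authors),
        "authors_with_orcid": sum(1 for a in authors if a.get("ORCID")),
        "has_ror_affiliation": "ror.org" in affiliations,   # rough heuristic
        "has_license": bool(record.get("license")),
        "has_abstract": bool(record.get("abstract")),
        "has_references": bool(record.get("reference")),
        "has_funder": bool(record.get("funder")),
    }

# Hypothetical usage; replace the placeholder with a real article DOI.
# print(metadata_report("10.1234/example-doi"))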

List of Open Science elements

For each element, the maximum number of points in 2025 and the SCOAP3 average for 2025 are given, followed by the description and evaluation criteria.
ORCiD integration (max. points: 5; SCOAP3 average: 2.63)
ORCiD is a persistent identifier for researchers and contributors in science, created to facilitate the attribution of all kinds of scholarly output to the contributing researcher. Sharing the ORCiD of authors directly contributes to improving transparency and traceability in science.
Publishers ensure that ORCiD registration for all co-authors is properly supported, with the relevant information carried through the entire editorial and production process, including the article metadata. For the submitting author, publishers validate the correctness of the provided ORCiD through standard API procedures (a basic illustration of such a check is sketched after this list).
Publishers can earn extra points depending on the proportion of ORCiDs associated with authors in the evaluated set of papers.
ROR integration (max. points: 2; SCOAP3 average: 1.68)
Similar to ORCiD, ROR is a persistent identifier for research organisations. Comparable to other organisational identifiers like Ringgold or GRID, it assigns a unique alphanumeric code to every registered organisation. Its aim is to facilitate the traceability of affiliations and create connections between research, researchers and their research organisations.
Publishers must include ROR as the persistent identifier for institutional affiliations in the article metadata of SCOAP3 articles.
Public Peer Review (max. points: 1; SCOAP3 average: 0)
There are various definitions of the concept of public/open peer review, but they all have in common that they bring a level of transparency to this traditionally closed process. Transparent peer review enhances Open Science by fostering greater accountability, trust, and collaboration through openly sharing the peer-review process and feedback.
Publishers give authors the option to select any kind of transparent peer-review process that results in peer-review reports being accessible and citable (through DOIs), should the authors decide to do so.
Dataset linking (max. points: 2; SCOAP3 average: 1.97)
Citing datasets used in research is important to ensure proper credit, facilitate reproducibility, and enable others to verify and build upon the work.
Publishers have a policy in place that requires authors to make their data publicly available, e.g. by depositing it in a repository, or at least to provide a data availability statement to be transparent about their data-sharing practices.
In 2026, publishers will have the possibility to gain additional points for increasing the share of their SCOAP3 Articles that have publicly available linked datasets, clearly identified or labelled as such either in the article’s references or its metadata. (This part will only be evaluated in year 2, comparing the numbers of the current and the previous year.)
Software linking (max. points: 2; SCOAP3 average: 1.97)
Citing software used in research is important to ensure proper credit, facilitate reproducibility, and enable others to verify and build upon the research work.
Publishers have a policy in place that requires authors to make their software publicly available, e.g. by depositing it in a repository, or at least to provide a software availability statement.
In 2026, publishers will have the possibility to gain additional points for increasing the share of their SCOAP3 Articles that have publicly available linked software, clearly identified or labelled as such either in the article’s references or its metadata. (This part will only be evaluated in year 2, comparing the numbers of the current and the previous year.)
Depositing of detailed metadata in Crossref (max. points: 6; SCOAP3 average: 2.17)
Crossref is a non-profit organisation that acts as an official digital object identifier (DOI) Registration Agency of the International DOI Foundation. It lets publishers deposit a publication’s metadata and makes it publicly available.
Publishers should deposit a set of metadata (“mandatory”) together with the articles.
Publishers gain additional points if they deposit more metadata (“additional”).

More information:
List of mandatory metadata (when applicable):
- Copyright statement
- Author information (Name, ORCiD)
- DOI
- Affiliation information (including identifiers)
- Journal information (Name, ISSN, Issue, Volume,...)
- License information
- Dates of reception, acceptance and publication
- List of references
- Title & Subtitle
- Preprint information (arXiv ID, DOI, ...)
- Dataset & Software information (Links, DOI)
- Page information

List of additional metadata (when applicable):
- Abstract
- Author role
- Award number
- Funder information (Name & Identifier)
- DOI of Peer Review Report
Excellence in Accessibility (max. points: 4; SCOAP3 average: 1.98)
The World Wide Web Consortium (W3C) defined three levels of accessibility for online content in its “Web Content Accessibility Guidelines” (WCAG) to lower the barriers for people with impairments of any kind to take part in the online world.
To measure the level of accessibility, the Information Technology Industry Council created the so-called Voluntary Product Accessibility Template (VPAT), which website creators can use to evaluate the accessibility of their web content.
The journal’s website and the articles themselves that provide access to SCOAP3 Articles consistently satisfy the relevant Level A and Level AA success criteria of the WCAG.
Publishers gain additional points if the journal’s website and the articles providing access to SCOAP3 Articles also consistently satisfy the relevant Level AAA success criteria of the WCAG.
Disclosure on SCOAP3 Community Values (max. points: 3; SCOAP3 average: 3)
The SCOAP3 community has defined a list of values related to qualitative elements in scholarly publishing. The aim is to encourage transparency from publishers on those values.
Publishers openly disclose statements on the values. They are not penalised for non-compliance with these specific values, but full transparency is encouraged and incentivised.

More information:
List of values:
- Diversity, Equity & Inclusion
- Sustainability
- Data Privacy
- Financial Transparency
- Referee Recognition/Compensation
- Publication Transparency
Total (max. points: 25; SCOAP3 average: 15.41)
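The ORCiD element above refers to validating the submitting author’s iD through standard API procedures, which are not detailed here. As one basic illustration, the sketch below checks the format and the ISO 7064 mod 11-2 check digit that every ORCID iD carries. A structurally valid iD is not necessarily registered; confirming registration would require querying the public ORCID API (https://pub.orcid.org/).

import re

def orcid_checksum_ok(orcid: str) -> bool:
    """Validate the structure and ISO 7064 mod 11-2 check digit of an ORCID iD,
    e.g. '0000-0002-1825-0097'. This only checks internal consistency."""
    if not re.fullmatch(r"\d{4}-\d{4}-\d{4}-\d{3}[\dX]", orcid):
        return False
    digits = orcid.replace("-", "")
    total = 0
    for ch in digits[:-1]:                 # all characters except the check digit
        total = (total + int(ch)) * 2
    remainder = total % 11
    check = (12 - remainder) % 11
    expected = "X" if check == 10 else str(check)
    return digits[-1] == expected

print(orcid_checksum_ok("0000-0002-1825-0097"))   # True: ORCID's documented example iD
print(orcid_checksum_ok("0000-0002-1825-0098"))   # False: wrong check digit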

Evaluation results per publisher

Publisher, SCOAP3 journals, and score 2025:

- APS: Physical Review C (PRC), Physical Review D (PRD), Physical Review Letters (PRL); score 2025: 20.18
- Elsevier: Nuclear Physics B (NPB), Physics Letters B (PLB); score 2025: 11.70
- IOP: Chinese Physics C (CPC); score 2025: 8.00
- Jagiellonian University: Acta Physica Polonica B (APPB); score 2025: 3.00
- Oxford University Press: Progress of Theoretical and Experimental Physics (PTEP); score 2025: 5.00
- Springer Nature: The European Physical Journal C (EPJC), The Journal of High Energy Physics (JHEP); score 2025: 13.33
- Wiley: Advances in High Energy Physics (AHEP); score 2025: 7.00
- SCOAP3 average: 15.41

The publishers’ statements on community values are available here.