Documentation

Detailed Guidance on Our Methods and Metrics

Context

The use of bibliometric indicators for research evaluation has long been constrained by the limited access researchers and funding bodies have to the underlying data. This limitation has prompted the search for alternative metrics, often leading to the exclusion of key indicators or the use of more accessible but less accurate substitutes. This situation has sometimes led to the misuse of metrics, as seen in the widespread but often criticized use of the Journal Impact Factor (JIF).

Mission

The ONE Research Community aims to revolutionize research assessment by providing all researchers with access to a broad spectrum of indicators and collaboration and benchmarking tools. These tools are designed to empower researchers to make informed decisions to advance their careers and to foster a deeper understanding of the available metrics.

Item- and group-oriented indicators

Our bibliometric indicators are categorized into two types: item-oriented and group-oriented. Item-oriented indicators are calculated for individual publications, such as citation counts or the nature of the publication (international, lead-authored, etc.). These reflect direct attributes of a publication.

In contrast, group-oriented indicators require aggregating data across multiple publications by an author, such as total publication counts. These indicators provide a broader view of a researcher’s output and influence, and they require grouping publications according to specific criteria to ensure an accurate assessment.

Item-oriented indicators

Among the item-oriented indicators, citation counts are particularly complex due to the varying citation practices across disciplines, document types, and publication ages. To fairly compare citation counts, we use a detailed method:

Number of citations

Citations are a fundamental measure of a publication’s influence. The number of citations a publication receives can vary significantly based on:

  • Research field: Publications in fields like Chemistry typically garner more citations than those in Mathematics, but fewer than those in High Energy Physics.
  • Type of document: Reviews tend to accumulate more citations than research articles due to their comprehensive nature.
  • Publication age: Older publications have had more time to accumulate citations.

To account for these variables, we rank publications by their citation count within their respective categories (type, year, discipline) and calculate percentiles to assess their relative impact. This approach ensures fair comparisons by adjusting for field-specific and temporal factors.

Ranking Publications by Their Number of Citations

We create hundreds of rankings by grouping publications by discipline, document type, and publication year. Each publication is placed in a percentile within its cohort, allowing us to identify Highly Cited Papers (HCPs) and Outstanding Papers (OPs) based on their percentile rank (illustrated in the sketch after this list):

  • Highly Cited Papers (HCP): Publications at or above the 90th percentile but below the 99th.
  • Outstanding Papers (OP): Publications at or above the 99th percentile.
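
The following minimal Python sketch illustrates this percentile-based ranking. The records, field names, and the exact percentile convention (share of publications in the cohort with a citation count less than or equal to the publication’s own) are assumptions made for the example, not our production pipeline.

# Illustrative sketch: rank publications by citations within each
# (discipline, document type, publication year) cohort, convert ranks
# to percentiles, and flag HCPs and OPs.
from collections import defaultdict

publications = [
    # hypothetical records; real data would come from the bibliographic database
    {"id": "p1", "discipline": "Chemistry", "doc_type": "article", "year": 2019, "citations": 120},
    {"id": "p2", "discipline": "Chemistry", "doc_type": "article", "year": 2019, "citations": 15},
    {"id": "p3", "discipline": "Chemistry", "doc_type": "article", "year": 2019, "citations": 3},
    {"id": "p4", "discipline": "Mathematics", "doc_type": "article", "year": 2019, "citations": 8},
]

# Group publications into cohorts sharing discipline, document type and year.
cohorts = defaultdict(list)
for pub in publications:
    key = (pub["discipline"], pub["doc_type"], pub["year"])
    cohorts[key].append(pub)

# Within each cohort, the percentile is the share of publications with a
# citation count lower than or equal to the publication's own count.
for cohort in cohorts.values():
    counts = sorted(p["citations"] for p in cohort)
    n = len(counts)
    for pub in cohort:
        rank = sum(1 for c in counts if c <= pub["citations"])
        pub["percentile"] = 100.0 * rank / n
        pub["is_OP"] = pub["percentile"] >= 99
        pub["is_HCP"] = 90 <= pub["percentile"] < 99

for pub in publications:
    print(pub["id"], round(pub["percentile"], 1), pub["is_HCP"], pub["is_OP"])

With realistic cohort sizes (thousands of publications per discipline, type and year), these percentiles become meaningful; the tiny cohorts above are only there to keep the sketch self-contained.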

Expected Values: Reference Values

In bibliometrics, reference values help determine whether an indicator for a publication or author stands above or below average. These values are crucial due to the asymmetric distribution typical of bibliometric data:

  • Mean (average): Often used as a reference, but it is strongly affected by the extreme values that are common in citation counts.
  • 90th Percentile: Used internationally to define Highly Cited Papers; it represents the top 10% of cited papers by field, publication year and document type.
  • 99th Percentile: Used internationally to define what we call Outstanding Papers; it represents the top 1% of cited papers by field, publication year and document type.

These reference values enable us to set benchmarks for what constitutes significant academic impact, aiding in the fair evaluation of research output.
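
As a minimal numeric illustration of why the mean is a fragile reference value, the toy cohort below (invented values) shows how a single extreme citation count pulls the mean far above a typical paper while the median barely moves:

# Hypothetical citation counts for a small cohort, with one outlier.
import statistics

citations = [0, 1, 1, 2, 3, 4, 5, 8, 12, 400]

print(statistics.mean(citations))    # 43.6 -- dominated by the single outlier
print(statistics.median(citations))  # 3.5  -- close to a "typical" paper

This sensitivity to outliers is the reason percentile-based reference values are preferred for flagging Highly Cited and Outstanding Papers.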

Authors' indicators

Group-oriented indicators aggregate an author's bibliometric data to provide insights into their overall research output and influence. These indicators require the compilation of all publications attributed to an author, which involves complex identification processes:

Identifying the Actual Researchers

Correctly attributing publications to their authors is challenging because of common issues such as name synonyms and homonyms. With the introduction of systems like ORCID, which provide unique identifiers, and our own sophisticated algorithms, we have significantly improved the accuracy of author identification. Despite these advancements, minor discrepancies can occur, and they are often only rectifiable by the authors themselves.

Counting Method

Each publication attributed to an author counts fully towards their total output. This total count method ensures that all recognized contributions are acknowledged equally, reflecting the collaborative nature of scientific work.

Main Field and Years of Scientific Activity

We assign authors to fields based on the most frequent classification of their publications, with allowances for secondary disciplines if they constitute a significant portion of an author's work. The years of scientific activity are calculated from the first to the last publication, providing a timeline of the author's active research period.
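
The sketch below walks through these three author-level derivations on a handful of hypothetical publication records; the field labels and the inclusive year count are assumptions made for the example.

# Illustrative author-level derivations: total output (total count method),
# main field, and years of scientific activity.
from collections import Counter

author_publications = [
    {"year": 2008, "field": "Chemistry"},
    {"year": 2012, "field": "Chemistry"},
    {"year": 2015, "field": "Materials Science"},
    {"year": 2021, "field": "Chemistry"},
]

# Total count method: every attributed publication counts once,
# regardless of the number of co-authors.
total_output = len(author_publications)

# Main field: the most frequent classification among the author's publications.
field_counts = Counter(p["field"] for p in author_publications)
main_field = field_counts.most_common(1)[0][0]

# Years of scientific activity: from the first to the last publication
# (counted inclusively here; an assumption for this example).
years = [p["year"] for p in author_publications]
years_of_activity = max(years) - min(years) + 1

print(total_output, main_field, years_of_activity)  # 4 Chemistry 14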

Indicators & Dimensions

The indicators within the ONE Index are carefully organized into distinct dimensions, each designed to capture different facets of a researcher’s academic and societal contributions. Within each dimension, indicators are assigned specific weights, allowing for tailored emphasis on aspects deemed most relevant to different research areas. This weighted system facilitates a nuanced evaluation, enabling a more detailed and area-specific assessment of research impact.

List of indicators, updated December 2024

Invitation for Community Engagement

We acknowledge that the current system, while comprehensive, is not without its imperfections. In our pursuit of a fair and universally accepted assessment tool, we invite the research community to engage in ongoing dialogue and contribute to the refinement of the ONE Index.

This platform is meant to evolve through community input, reflecting a broad spectrum of academic activities and values. Our current framework is a humble contribution, intended as a foundation for a community-driven, equitable research assessment tool.

Comparing authors' indicators

The process of comparing authors within the ONE Index is designed to ensure a fair and contextually relevant assessment by considering similar cohorts of researchers. This methodology allows for meaningful comparisons by aligning researchers with their peers who share similar backgrounds and research trajectories.

Cohort Definition

Researchers are grouped into cohorts based on two factors (a small grouping sketch follows the list):

  • Research field: Authors are grouped with peers in the same academic field and subfield, based on the classification available in OpenAlex.
  • Years of Activity: Authors are also grouped according to their active years in the research field, ensuring that career stage is appropriately considered.
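
A minimal grouping sketch, assuming hypothetical author records and a placeholder banding function for years of activity (the real windows are those described under Dynamic Adjustment below):

# Illustrative cohort formation: group authors by field, subfield and a
# band of years of activity. The banding used here is a placeholder.
from collections import defaultdict

def band_of_activity(years_active):
    # Placeholder banding: 0-4, 5-9, 10-14, ... (not the ONE Index's actual windows).
    return years_active // 5 * 5

authors = [
    {"name": "A", "field": "Chemistry", "subfield": "Electrochemistry", "years_active": 6},
    {"name": "B", "field": "Chemistry", "subfield": "Electrochemistry", "years_active": 8},
    {"name": "C", "field": "Mathematics", "subfield": "Topology", "years_active": 7},
]

cohorts = defaultdict(list)
for a in authors:
    key = (a["field"], a["subfield"], band_of_activity(a["years_active"]))
    cohorts[key].append(a["name"])

print(dict(cohorts))  # authors A and B end up in the same cohort; C does not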

Scoring System

Authors are assessed using a range of indicators, as previously described. The scores for each indicator are calculated based on percentiles within their respective cohorts:

  • Percentile Ranking: Each indicator is assessed by ranking the author's value within their cohort for that specific indicator. This ranking is then translated into a percentile score, providing a clear metric of how an author compares to their peers.

Weighted Average

  • Final Score: The author's final score in the ONE Index is a weighted average of these percentile scores across all indicators. This composite score reflects the overall performance of the author relative to their peers (see the sketch below).
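
The sketch below shows the two steps together, with made-up cohort values, indicator names, and weights; it only illustrates the percentile-then-weighted-average mechanics, not the actual ONE Index indicators or weights.

# Illustrative scoring: percentile within the cohort per indicator,
# then a weighted average across indicators.
def percentile_in_cohort(value, cohort_values):
    # Share of peers with a value lower than or equal to the author's value.
    n = len(cohort_values)
    return 100.0 * sum(1 for v in cohort_values if v <= value) / n

# Hypothetical cohort values for two indicators, and weights that sum to 1.
cohort = {
    "publications": [4, 7, 9, 12, 20, 25, 31],
    "highly_cited_papers": [0, 0, 1, 1, 2, 3, 5],
}
weights = {"publications": 0.4, "highly_cited_papers": 0.6}

author = {"publications": 20, "highly_cited_papers": 2}

# Percentile score per indicator, then weighted average across indicators.
percentiles = {
    ind: percentile_in_cohort(author[ind], cohort[ind]) for ind in weights
}
final_score = sum(weights[ind] * percentiles[ind] for ind in weights)

print(percentiles, round(final_score, 1))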

Dynamic Adjustment

  • Adaptive Weights: Weights assigned to each indicator can be adjusted to reflect changes in disciplinary standards or the evolving nature of research fields, allowing the ONE Index to remain relevant and accurately reflective of current academic values.
  • Flexible Cohort Windows: For cohort formation based on years of activity, younger researchers are grouped within narrow windows to ensure that comparisons are made among individuals at similar career stages. Conversely, for senior researchers with extensive years of activity, such as those over 25 years, the cohort windows are less restrictive. This flexibility recognizes that the impact of a few years more or less of activity diminishes as career length increases, allowing for fairer comparisons among more established researchers (an illustrative sketch follows this list).
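
A hypothetical illustration of such flexible windows follows; the narrow and wide band widths are invented for the example and are not the ONE Index’s actual values.

# Illustrative activity windows: narrow bands early in a career, wider
# bands later, and a single open-ended band beyond 25 years of activity.
def activity_window(years_active):
    if years_active <= 10:
        width = 2          # early career: narrow 2-year bands (example value)
    elif years_active <= 25:
        width = 5          # mid career: 5-year bands (example value)
    else:
        return "25+"       # senior: a single open-ended band
    lower = (years_active // width) * width
    return f"{lower}-{lower + width - 1}"

for y in (3, 9, 18, 30):
    print(y, "->", activity_window(y))  # 2-3, 8-9, 15-19, 25+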

Invitation for Community Engagement

At the ONE Research Community, we believe in the power of collective wisdom to create tools that are both fair and effective. We invite all stakeholders in the academic community—researchers, administrators, funders, and policymakers—to participate in an ongoing dialogue to refine and enhance the ONE Index. Your insights, critiques, and suggestions are invaluable to ensuring that our tool evolves in ways that are most beneficial and relevant to the diverse needs of the global research community.

Together, we can build an assessment system that not only measures but also meaningfully contributes to the advancement of knowledge and innovation. Join us in this collaborative effort to shape the future of research assessment.

Structure of the ONE Index

The ONE Index is a weighted measure that captures the breadth and depth of a researcher’s contributions across ten dimensions of academic and societal impact. Designed for clarity and precision, it empowers researchers, institutions, and decision-makers to assess and communicate research achievements effectively.

At the heart of the ONE Index are three supra-dimensions: Performance, Interaction within the Scientific Community, and Interaction with Society. Together, they encompass the diverse roles of researchers and the many ways their work influences the world.

  • The Performance dimension reflects the quality, influence, and originality of a researcher’s work. Indicators such as normalized impact, highly cited publications, and sustained productivity paint a vivid picture of scholarly contributions. While still under development, the Innovation and Originality metrics will soon quantify a researcher’s ability to open new paths in science.
  • The Interaction within the Scientific Community dimension showcases a researcher’s role within the academic ecosystem. Metrics like international collaborations, leadership roles, and contributions to community support, such as peer review and mentoring, capture their influence beyond individual research outputs.
  • Lastly, the Interaction with Society dimension highlights a researcher’s ability to translate academic knowledge into real-world applications. Patents, industry collaborations, public outreach, and successful fundraising efforts demonstrate the broader societal and economic impact of their work.

Each of these dimensions includes specific, actionable indicators tailored for easy interpretation. By combining these into a single, weighted score, the ONE Index offers a comprehensive view of a researcher’s impact, both within academia and beyond.
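
To make the structure concrete, the sketch below arranges indicators named in this section under the three supra-dimensions and combines hypothetical dimension scores with illustrative weights; neither the weights nor the scores are official ONE Index values.

# Illustrative roll-up of indicators into supra-dimensions and a single
# weighted score. Weights and scores are examples only.
one_index_structure = {
    "Performance": {
        "weight": 0.5,
        "indicators": ["normalized impact", "highly cited publications", "sustained productivity"],
    },
    "Interaction within the Scientific Community": {
        "weight": 0.3,
        "indicators": ["international collaborations", "leadership roles", "peer review and mentoring"],
    },
    "Interaction with Society": {
        "weight": 0.2,
        "indicators": ["patents", "industry collaborations", "public outreach", "fundraising"],
    },
}

# Given one percentile score per supra-dimension (hypothetical numbers),
# the final index is their weighted average.
dimension_scores = {
    "Performance": 72.0,
    "Interaction within the Scientific Community": 64.0,
    "Interaction with Society": 55.0,
}
final_index = sum(
    spec["weight"] * dimension_scores[name]
    for name, spec in one_index_structure.items()
)
print(round(final_index, 1))  # 66.2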

Figure: Structure of the ONE Index. (* = currently in development.)