
Responsible Metrics Statement

York St John University is committed to the responsible and informed use of research metrics, applied fairly, transparently, and inclusively.

While metrics can provide valuable insights, they must be applied transparently, with care, and in conjunction with expert judgement. Metrics should never be used as standalone indicators of research quality, but rather as complementary tools that provide context to qualitative assessments.

This statement is underpinned by internationally recognised frameworks for responsible research assessment, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto, and The Metric Tide report. In particular, we draw on the principles of The Metric Tide, which advocate for transparency, contextualisation, and the responsible use of quantitative indicators in evaluating research.

Strategic context

This statement supports the University’s strategic priorities as outlined in The University for Social Impact Strategy, reinforcing our commitment to:

  • Inclusive and equitable research assessment, ensuring fairness and contextual sensitivity.
  • Transformational partnerships, through transparent and credible evaluation of collaborative research.
  • Impactful research and knowledge exchange, by recognising diverse outputs and societal contributions.
  • Demonstrating social impact, locally and globally, through meaningful and responsible metrics.

It also aligns with the Strategic Framework for Research and Innovation, which promotes integrity, excellence, and inclusivity in research culture. By embedding responsible metrics practices, York St John is well-positioned to meet the expectations of evolving national frameworks such as REF 2029, which emphasise research culture, diversity, and broader contributions to knowledge and impact.

Implementation

As a diverse and inclusive institution, York St John University recognises the importance of applying research metrics in ways that are sensitive to disciplinary norms and local contexts. This statement provides a flexible framework of guiding principles and recommendations, supporting thoughtful and context-aware decision-making.

The University intends to become a signatory to DORA in 2025, affirming our commitment to improving research assessment practices. This aligns with our institutional goals of promoting inclusive, equitable, and context-sensitive evaluation. By reducing reliance on journal-based metrics such as Impact Factors, we support diverse publishing practices and recognise a broader range of research contributions.

To embed DORA principles, the University Research Committee will lead a review of assessment practices in 2026, with recommendations published in early 2027. This process will include consultation with School Research Leads and UoA Leads to ensure integration with REF 2029 preparations.

Guiding principles

Our approach to metrics is guided by the following sector-recognised principles:

  • Robustness: Use accurate, validated, and contextually appropriate data.
  • Humility: Recognise that metrics are partial indicators, not definitive measures of quality.
  • Transparency: Clearly communicate data sources, methodologies, and limitations.
  • Diversity: Value a wide range of outputs, disciplines, languages, and career paths.
  • Reflexivity: Regularly review practices to avoid unintended consequences.

Use of metrics

Metrics may be used to inform:

  • Research assessment (for example, REF environment statements, benchmarking).
  • Staff evaluation (for example, promotion, appraisal).
  • Strategic planning and institutional reporting.

However, metrics will never be used in isolation for decisions related to recruitment, promotion, or funding.

In all cases, metrics must be:

  • Transparent: Clear in origin, calculation, and limitations
  • Contextualised: Interpreted within disciplinary and career norms
  • Complementary: Used alongside, not in place of, peer review
  • Developmental: Used to support learning and improvement

Types of research metrics

Metrics should support, not replace, expert judgement. When used, they must be applied transparently and in context. The University recognises five broad categories:

Output-level metrics

  • Examples: citation counts, field-normalised impact, usage statistics
  • Use: indicates scholarly reach of outputs
  • Caution: varies by discipline and over time; not standalone indicators

Author-level metrics

  • Examples: h-index, citation counts, altmetrics
  • Use: provides partial insights into researcher profiles
  • Caution: may disadvantage early-career researchers or interdisciplinary work

Journal-level metrics

  • Examples: journal Impact Factor, SCImago Journal Rank
  • Use: contextualises journal reach
  • Caution: not proxies for article or researcher quality (see DORA for further information)

Institutional and benchmarking metrics

  • Examples: field-weighted citation impact, collaboration indices, research income
  • Use: identifies trends and strengths
  • Caution: requires contextualisation

Qualitative and alternative indicators

  • Examples: case studies, policy influence, open research contributions
  • Use: captures societal impact and research culture
  • Caution: should meet standards of evidence and peer review

Equity, diversity and inclusion

Metrics will be interpreted fairly, with consideration for:

  • Disciplinary norms and publication practices
  • Career stage and career breaks
  • Contributions in languages other than English

Governance and review

  • Oversight is provided by the University Research Committee
  • Concerns about metric use can be raised with the Research Office
  • This statement will be reviewed biennially in line with sector developments

For questions or feedback, please contact: ResearchOffice@yorksj.ac.uk.

Further information

This statement complements the following national and international initiatives: