Bibliometrics & Measuring Research Output: Recommendations

Recommended practices for working with bibliometric data

The following recommendations, originally offered in the University's White Paper, "Measuring Research Output Through Bibliometrics", are intended for researchers, administrators, and others who use bibliometrics or assess the relevance of bibliometric results. Recommendations are offered to:

  1. Researchers
  2. All users of bibliometric data

Note that these recommendations speak only to the methodological reliability of bibliometric measures, as indicated in the relevant literature, and that University policies (such as Waterloo's Policy 77 on Tenure and Promotion) may direct the use of these measures. 

Recommended practices specific to researchers

Define your identity as an author

  • Define your identity convention as an author early, and use that convention consistently throughout your career.

  • Give appropriate affiliation and acknowledgement to the University of Waterloo so that your research output and productivity can be tracked accurately. 

  • Proactively determining how your name will appear in published form will increase the likelihood that your works will be accurately attributed to you within citation-tracking databases.

  • Creating an author profile, such as an Open Researcher and Contributor ID (ORCID), is the most reliable way to ensure that your publications are listed as your own.

Recommended practices for all users of bibliometric data

Analyze research outputs in the same way that you would conduct good research

  • Develop a strong research question with the scope and clarity appropriate to the discipline and issue under consideration.
  • Assess whether bibliometric measures can appropriately provide the information required to answer the research question. If not, it may be necessary to revise the research question or use other measures.
  • If bibliometric measures are indicated, select appropriate tools and measures to investigate the research question.
  • Be explicit about other non-bibliometric data sources that should also be considered.
  • Understand the research and comparison context, including discipline-specific effects and the implications of sample size.

Consider bibliometrics as one measure among a set of others for understanding research output and impact

[Figure: Relationship between measures of research productivity and impact, shown by bibliometric measures and research metrics.]

  • Best practice is to work from a basket of measures, with bibliometrics used to complement, not replace, other research assessment measures.
  • Bibliometric measures are one data point among many for understanding elements of research output and impact. It is impossible for any bibliometric analysis to present a complete picture.

Account for varying research publication cultures

  • Understand and account for variations in how disciplines produce and use research publications.
  • Use measures relevant for each discipline and recognize that meaningful comparisons across those measures may not be possible.

Involve those being evaluated in the process

  • Because context plays a significant role in the use of bibliometrics for indicating research productivity and impact, researchers in the field or discipline under investigation are often best placed to understand how measures capture research outcomes in their discipline. 

Understand the distinctions among bibliometric measures

  • Be aware of the methodology, purpose, and limitations of bibliometric databases and of individual bibliometric measures.
  • For example, normalized measures have value over whole or raw counts, but they can be vulnerable to outliers: a single highly cited paper can artificially inflate the average citation count.
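The outlier effect described above is easy to see with a small sketch. The citation counts below are hypothetical, chosen only to illustrate how one highly cited paper distorts a mean-based measure while a median is more robust:

```python
# Illustrative sketch with hypothetical citation counts: a single highly
# cited paper dominates the mean, while the median is largely unaffected.
from statistics import mean, median

# Nine papers with modest citation counts, plus one outlier.
citations = [3, 5, 2, 4, 6, 3, 5, 4, 2, 250]

print(mean(citations))       # 28.4 -- pulled up by the single outlier
print(median(citations))     # 4.0  -- robust to the outlier
print(mean(citations[:-1]))  # ~3.78 without the outlier
```

This is why reporting a median alongside (or instead of) a mean, or inspecting the underlying citation distribution, is often recommended when a sample is small enough for one paper to dominate.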

Exercise caution when using journal impact rankings

  • Journal impact rankings, which capture the relative importance of a journal, such as the Journal Impact Factor (JIF) or SCImago Journal Rank (SJR), should not be broadly used as a surrogate measure of the quality of individual research articles or of an individual’s overall performance when opportunities exist for an in-depth evaluation of individual publications.
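The JIF mentioned above is, by its standard two-year definition, a simple ratio: citations received in a year to a journal's items from the previous two years, divided by the number of citable items the journal published in those two years. The sketch below uses hypothetical counts for illustration:

```python
# Sketch of the standard two-year Journal Impact Factor calculation.
# The counts below are hypothetical, for illustration only.

def journal_impact_factor(citations_prev_two_years: int,
                          citable_items_prev_two_years: int) -> float:
    """JIF for year Y = citations received in year Y to items published
    in years Y-1 and Y-2, divided by the number of citable items the
    journal published in Y-1 and Y-2."""
    return citations_prev_two_years / citable_items_prev_two_years

# e.g. 1200 citations in 2024 to articles from 2022-2023, across
# 400 citable items published in those two years:
print(journal_impact_factor(1200, 400))  # 3.0
```

Note that the result describes the journal's average, not any individual article: citation distributions within a journal are typically highly skewed, which is precisely why the recommendation above cautions against using journal-level rankings as a surrogate for article-level quality.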