
Bibliometrics & Measuring Research Output: Discipline data

Bibliometric analysis at the discipline level

The University's White Paper, "Measuring Research Output Through Bibliometrics", identifies how bibliometric data can be used to support a bibliometric analysis at the discipline level, and offers context for the objectives, appropriate and inappropriate uses, possible measures, and considerations outlined below.

Objectives

Possible objectives of a discipline-level bibliometric analysis include:

  1. Understand the group’s research activity in terms of production of publication outputs and associated citations.
     
  2. Understand the group's performance over time, or compared with selected peers.
     
  3. Validate discipline-level data that external agencies have published, for example, the data generated by a university rankings organization.

It is also important to be mindful of how the level of detail influences a discipline-level bibliometric analysis: this type of analysis provides some detail, but it is restricted to comparisons within fields and disciplines.


Inappropriate uses

Inappropriate uses of a discipline-level bibliometric analysis include:

  • Comparing across different fields or disciplines based on h-index data without appropriate normalization (see the sketch after this list).
     
  • Measuring fields or disciplines that are not well captured in citation-tracking databases, for example, regional or interdisciplinary fields, or fields where books or conference proceedings are the primary forms of research output.
     
  • Measuring performance between different faculties or institutes. For example, comparing the Waterloo Institute for Nanotechnology with the National Institute for Nanotechnology, a joint initiative of the Government of Alberta, University of Alberta and the National Research Council.
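
Why normalization matters can be shown with a small worked example. The sketch below is illustrative only and not a method from the White Paper; all citation counts and field averages are hypothetical.

```python
# Minimal sketch, not the White Paper's method: why raw h-index values should
# not be compared across disciplines. All citation counts and field averages
# below are hypothetical.

def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical researchers with the same h-index (both 9)...
cell_biologist = [120, 95, 60, 44, 30, 22, 18, 15, 12, 9]
mathematician = [30, 22, 18, 15, 12, 11, 10, 9, 9, 8]
print(h_index(cell_biologist), h_index(mathematician))  # 9 9

# ...but very different positions relative to their fields: dividing average
# citations per paper by an assumed field average tells a different story.
field_average = {"cell biology": 40.0, "mathematics": 6.0}
rel_bio = (sum(cell_biologist) / len(cell_biologist)) / field_average["cell biology"]
rel_math = (sum(mathematician) / len(mathematician)) / field_average["mathematics"]
print(round(rel_bio, 2), round(rel_math, 2))  # about 1.06 vs 2.4
```

On raw h-index alone the two researchers look the same; once an (assumed) field baseline is applied, the mathematician sits well above the norm for their field while the cell biologist is close to average.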

Appropriate uses

Appropriate uses of a discipline-level bibliometric analysis include:

  • To capture research publications and citations in fields or disciplines that are well covered in citation-tracking databases.
     
  • To better understand performance measurement and reporting analysis at the discipline level, by examining total documents in a citation-tracking database within selected journal classifications where faculty members are most active.

    • For example, Waterloo’s Faculty of Engineering has used bibliometric data from InCites along with other measures (research funding and research chairs, honours and awards) to support its annual strategic plan evaluation process.

      In 2014/15, Engineering used: total Web of Science publications in three major journal classifications in which faculty members are most active; category-normalized citation impact of those publications; and the percentage of those publications in the top 10%, based on citations (a sketch of the category-normalized calculation appears after this list).

      The faculty has also used bibliometric measures internally to test the validity of data that other institutions or agencies have published about Waterloo Engineering, and to inform internal exercises comparing Waterloo Engineering with key peers.
  • Internal reporting to examine the validity of data that other institutions or agencies have published.
     
    • For example, examining Waterloo’s performance in a specific subject area as captured by a journal subject category. Note: This can only be completed at the journal classification level, and will not map directly onto the work of the faculty members within the department, faculty or centre under consideration.
  • Performance measurement relative to a similar faculty, department, centre, or institute at another university, only with acknowledgement and consideration of differing contexts that would affect the comparison, including: different missions, program mixes or sub-disciplines within the overall unit, regional or international foci, age of the unit, dominant language for publishing, and the administrative or funding environment.
     
    • For example, a comparison of the Faculty of Engineering at Waterloo with that at the University of Toronto must consider the types and numbers of researchers in each sub-discipline, the age and stage of researchers, and the age of the faculty.
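
As a rough illustration of the category-normalized citation impact used in the Engineering example above, the sketch below divides each publication's citations by a baseline for publications of the same subject category, publication year, and document type, then averages the ratios. The baseline values and publications are hypothetical; in a tool such as InCites the baselines are derived from the underlying database.

```python
# Minimal sketch of the idea behind category-normalized citation impact (CNCI).
# Baselines and publications below are hypothetical, for illustration only.

# (category, year, doc_type) -> hypothetical world-average citations
baseline = {
    ("Engineering, Chemical", 2014, "article"): 8.0,
    ("Nanoscience & Nanotechnology", 2014, "article"): 12.0,
}

# Hypothetical publications for one unit: (citations, category, year, doc_type)
papers = [
    (16, "Engineering, Chemical", 2014, "article"),
    (6,  "Engineering, Chemical", 2014, "article"),
    (24, "Nanoscience & Nanotechnology", 2014, "article"),
]

ratios = [c / baseline[(cat, yr, dt)] for c, cat, yr, dt in papers]
cnci = sum(ratios) / len(ratios)
print(round(cnci, 2))  # 1.0 means performance at the world average; here about 1.58
```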

Possible Measures

  • Citation counts over time.
     
  • Normalized citation impact at the journal subject category level.
     
  • Top percentiles (see the sketch after this list).
     
  • For more information, see Measures.
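
The "top percentiles" measure can be sketched in a similar spirit: the share of a unit's publications whose citation counts reach the top 10% threshold for their field and year. The distribution and the simple nearest-rank threshold used below are assumptions for illustration only; citation-tracking databases compute these thresholds from their own data.

```python
# Minimal sketch of a "top percentile" measure: the share of a unit's papers
# whose citation counts fall at or above the 90th percentile for their field
# and year. All numbers below are hypothetical.

def percentile_threshold(values, pct):
    """Citation count at the given percentile (simple nearest-rank method)."""
    ordered = sorted(values)
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

# Hypothetical field-year citation distribution (all papers in the field).
field_citations = [0, 0, 1, 1, 2, 3, 3, 4, 5, 6, 7, 8, 10, 12, 15, 18, 22, 30, 45, 80]
threshold = percentile_threshold(field_citations, 90)

# Hypothetical citation counts for the unit being examined.
unit_citations = [2, 5, 9, 14, 31, 52]
top_share = sum(c >= threshold for c in unit_citations) / len(unit_citations)
print(threshold, round(100 * top_share, 1))  # threshold 30; 33.3% of papers in the top 10%
```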

Considerations

  • Analysis may also be possible with groups of authors if the data is properly cleaned and validated, and if it is analyzed over time and not in comparison with others.
     
  • Measures or analysis should be provided with contextual information, e.g., number of researchers, areas of specialty, career stage.

  • Measures or analysis should be provided as part of a data package along with other measures, e.g., research funding, awards and honours.

  • Measures should be provided with appropriate definitions, such as those based on journal classification, e.g., all research output from the University of Waterloo in journals classified as "area of interest" by Thomson Reuters.