
Publishing Guide

Metrics for Scholarly Output

The world of scholarly publications is a multifaceted enterprise, defined differently depending on the discipline of the scholars involved.

Throughout their careers, faculty members at all levels of academic institutions engage in the creation, development, analysis, and publication of their research and findings. They may be documenting, analyzing, and sharing original research, surveying or investigating topics researched by others, serving as reviewers for scholarly publications, presenting findings at conferences, and/or participating on topic-specific panels, among other pursuits.

Metrics are the means by which the world of scholarly publishing seeks to evaluate and rank academic journals as a whole, individual articles, and the output of researchers themselves.

Scholarly metrics include the following:

  • Journal metrics
  • Article metrics
  • Author metrics
  • Altmetrics

The sections below describe each metric type in more detail.


Journal Metrics

Metrics associated with specific journal titles are those that attempt to quantify the impact of the articles published in a given journal overall.

This method can help indicate the extent to which scholars prefer to publish in a particular journal.

However, to arrive at this metric, a journal averages the impact of all articles published within it, thereby masking particular articles or authors that may fall at either end of the metric scale. Another potential downside of this method is that journal metrics are less useful in cross-disciplinary evaluations.

 

Journal Citation Reports (JCR) 

Published in the Web of Science database, JCR Impact Factors are calculated by dividing the number of citations a journal receives in a given year to the articles it published in the two previous years by the total number of citable items it published in those two years. Each year, the JCR releases a summary of the scholarly citations from the prior year of Web of Science coverage. As mentioned elsewhere, the information is proprietary and requires a subscription to the database to access.
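
As a rough illustration of the arithmetic described above, the sketch below computes a two-year Impact Factor from hypothetical figures; the function name and the numbers are ours for illustration only and are not drawn from JCR data.

    # Sketch of the two-year Impact Factor arithmetic described above.
    # All names and numbers here are hypothetical, not taken from JCR.

    def impact_factor(citations_this_year: int, citable_items_prev_two_years: int) -> float:
        """Citations received this year to articles from the two previous years,
        divided by the number of citable items published in those two years."""
        return citations_this_year / citable_items_prev_two_years

    # Hypothetical journal: 150 citations in 2022 to articles it published in
    # 2020 and 2021, which together contained 100 citable items.
    print(impact_factor(150, 100))  # 1.5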

The citation reports help researchers identify leading journals in a range of disciplinary areas and understand citation impact trends, with an aim to support a researcher's publishing strategy. The 2022 release of this product includes 12,828 scientific journals, 6,691 social science journals, 3,092 journals in the arts and humanities, and 5,300 Gold Open Access journals. The arts and humanities journals are newly included in this most recent release.

 

SCImago Journal & Country Rank (SJR)

The SCImago Journal & Country Rank is a publicly available portal that includes journal and country scientific indicators based on the information housed in the Scopus database published by Elsevier. The resource was developed by a research group affiliated with the Consejo Superior de Investigaciones Científicas (CSIC) and the University of Granada in Spain.

Within SJR, journals can be compared or analyzed separately, as can country rankings. Journals can be grouped by subject area (27 major thematic areas divided into 309 specific subject categories) or by country. Citation data is drawn from more than 34,100 titles from more than 5,000 international publishers; country performance metrics cover 239 countries around the world.

SCImago. (n.d.). SJR — SCImago Journal & Country Rank [Portal]. Retrieved October 26, 2022, from http://www.scimagojr.com

 

Scopus

Published by Elsevier, Scopus provides access to science, technology, and medical journal articles and the references included in those articles. The database can be used for collection development as well as for research. Scopus is an abstract and indexing database with full-text links. The name of the database references the hamerkop (Scopus umbretta), a bird known for its excellent navigation skills.

Scopus indexes content from more than 25,000 active titles and 7,000 publishers—all rigorously vetted and selected by an independent review board. Users get access to thousands of titles, millions of author profiles and 1.7 billion cited references.

Scopus CiteScore metrics provide insight into the impact of research. Calculated using data from Scopus, CiteScore metrics cover journals, book series, conference proceedings, and trade journals. CiteScore 2021 is the most recent set of published data. CiteScore metrics are freely available at www.scopus.com/sources.
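
CiteScore is broadly described as citations received over a four-year window to documents published in that same window, divided by the number of those documents; the sketch below uses invented figures to illustrate the shape of that calculation and should not be read as Elsevier's authoritative definition.

    # Hedged sketch of a CiteScore-style calculation: citations received over a
    # four-year window to documents published in that window, divided by the
    # number of those documents. The figures below are invented for illustration.

    def citescore(citations_in_window: int, documents_in_window: int) -> float:
        return citations_in_window / documents_in_window

    # Hypothetical title: 4,000 citations in 2018-2021 to 1,000 documents
    # published in 2018-2021 yields a CiteScore of 4.0.
    print(citescore(4_000, 1_000))  # 4.0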

Bowman Library no longer has a current subscription to Scopus.

Article Metrics

These metrics aim to determine the impact of an individual article, most often by directly counting the number of times it is cited in other published articles.

 

Google Scholar

Searching an article on Google Scholar allows researchers to discover the number of times a given article has been cited.

 
SciVal

SciVal is an Elsevier product that helps researchers and institutions evaluate their research landscape and overall impact. Access to the site requires a subscription; the product includes information from more than 22,000 institutions and their associated researchers in 224 nations around the world.

 
Web of Science

In Web of Science, use the Cited Reference Search to locate the number of times a given article has been cited. As mentioned elsewhere, this information is proprietary: access to it requires a subscription to the database.

Author Metrics

Metrics based on an individual author's scholarly output place value on the articles that author has published and the citations they have received. The best known of these is the h-index, a citation metric that measures the bibliometric impact of an individual author: an author has an h-index of h when h of their publications have each been cited at least h times. These metrics can provide an overall evaluation of a single person's work; however, they tend to favor more established scholars and do not readily tease out each article's relative merit within the whole of that author's set of publications.

The h-index is named for J. E. Hirsch, who proposed it in 2005. This type of assessment has gained popularity among researchers, and scholars of bibliometrics have proposed variants that address some of its weaknesses; the g-index, which includes in its calculation additional credit for the most highly cited papers in a data set, and the m-index are good examples.
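
To make the definition concrete, the minimal sketch below (the citation counts are invented) shows how an h-index, and the g-index variant mentioned above, could be computed from a list of per-paper citation counts.

    # Minimal sketch of the h-index calculation described above, plus the
    # g-index variant, applied to a hypothetical list of citation counts.

    def h_index(citations):
        """Largest h such that h papers each have at least h citations."""
        counts = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(counts, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    def g_index(citations):
        """Largest g such that the top g papers have at least g**2 citations in total."""
        counts = sorted(citations, reverse=True)
        total, g = 0, 0
        for i, c in enumerate(counts, start=1):
            total += c
            if total >= i * i:
                g = i
        return g

    papers = [25, 8, 5, 3, 3, 1, 0]   # hypothetical per-paper citation counts
    print(h_index(papers))            # 3 -- three papers have at least 3 citations each
    print(g_index(papers))            # 6 -- top 6 papers have 45 >= 36 citations in total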

There are a number of sources where you can find your h-index. The value of the index may vary depending on the source of information, the time span, the number of indexed publications, and other factors. For example, h-index values derived from Google Scholar tend to be higher than those based on Web of Science or Scopus data.

 

Google Scholar

Google Scholar provides citation counts by author for authors with a profile. The site also includes the following features:

  • Authors can track their own publications
  • Count types in use on Google Scholar include a basic citation count, h-index, and i10-index
    • h-index, or Hirsch Index, evaluates the individual scientist rather than the journal
      • For the most part, this index is used for publications in the sciences
    • i10-index counts articles with a minimum of 10 citations (see the short sketch after this list)
  • Users can set up automatic updates to the citation metrics
  • Authors can manually update their profiles
  • Information on citations for other authors can be retrieved with an author search; metrics are included under the citation information
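
The simpler counts in the list above are easy to reproduce by hand; the short sketch below, again with invented figures, shows a basic citation count and an i10-index computed side by side.

    # Sketch of the simpler Google Scholar-style counts listed above, applied
    # to a hypothetical list of per-paper citation counts.

    papers = [25, 12, 8, 5, 3, 1, 0]                 # hypothetical citation counts
    total_citations = sum(papers)                    # basic citation count: 54
    i10_index = sum(1 for c in papers if c >= 10)    # papers with 10+ citations: 2
    print(total_citations, i10_index)                # 54 2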

 

Web of Science

The Web of Science database provides citation counts by author. However, the information provided by the database is proprietary and requires an institutional subscription to enable access. Included in the subscription is Essential Science Indicators, a tool that helps researchers identify authors who, within the last 10 years, have received enough citations in their respective disciplines to place them in the top 1% of all authors in Web of Science. Currently, Menlo College's Bowman Library does not provide access to this database. NB: The database is available at Stanford University Libraries; every person has seven free visits a year to Stanford's Green Library.

 

Clarivate Analytics also publishes an annual list of Highly Cited Researchers. These are individuals who, over the last 10-year period, have the highest cumulative number of highly cited papers (papers placing in the top 1% of the citation distribution) across 21 broad subject categories. The 2022 edition lists approximately 6,938 researchers.

Other metrics originally developed for academic journals can be reported at the researcher level: author-level eigenfactor and the author impact factor (AIF) are among such examples. 

Altmetrics

The rise of the social web and its increasing use by scholars has led to the creation of altmetrics: social web metrics for academic publications. In theory, these newer metrics can be used in an evaluative role to give early, often more immediate, estimates of a publication's impact among readers. They are also useful for assessing non-traditional types of impact.

Altmetrics can also be used as an information-seeking tool by bringing published works that are tagged or mentioned on various social media sites to the attention of readers and users in those spaces. These altmetric techniques may require some follow-up evaluation if they are to be trusted by a wider audience: a determination should be made as to whether any given published article is worthy of the attention it receives, and whether the positive reviews hold up over time.

Altmetric evaluation strategies include correlation tests, content analyses, interviews and pragmatic analyses. 

According to Piwowar (2013), altmetrics offer four potential advantages:

  • A more nuanced understanding of impact, showing us which scholarly products are read, discussed, saved, and recommended as well as cited.
  • Often more timely data, showing evidence of impact in days instead of years.
  • A window on the impact of web-native scholarly products like datasets, software, blog posts, videos, and more.
  • Indications of impacts on diverse audiences, including scholars but also practitioners, clinicians, educators, and the general public.

Piwowar, H. (2013). Introduction altmetrics: What, why, & where? ASIS&T Bulletin, 39(4), 8-9.