The Wiley Network

How to Navigate the World of Citation Metrics


Jenny Neophytou, Senior Manager, Academic Market & Impact Analysis, Wiley

May 15, 2014

Since 2009, the number of journal research papers has increased at an average of 4% per year.1 Within this growing (and rapidly changing) research environment, citation metrics continue (rightly or wrongly) to be used to benchmark the performance of journals, institutions, and even individual academics.

The core principle of citation metrics is the assumption that when an article is cited by another academic, it has had an impact on their research. The validity of this assumption is a matter of debate – after all, there are many reasons why an academic might choose to cite another person’s work, and those reasons do not always reflect the ‘quality’ of the cited work. Nevertheless, citations provide a way to measure the extent to which the published academic community has engaged with a given piece of research.

Another feature common to most citation metrics is that they are journal-level metrics and, for the most part, average counts of citations per paper (within set parameters). This means that most citation metrics do not tell us anything about individual papers, or individual authors, within a given journal.
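To make the ‘average citations per paper’ shape of these metrics concrete, the short Python sketch below runs a two-year calculation of the kind described under Impact Factor further down. The article records, field names, and numbers are invented for illustration; real metrics are computed from a specific database under its own rules about which document types count, so this is a sketch of the shape of the calculation, not a reimplementation of any published metric.

```python
# Minimal sketch of a journal-level "average citations per paper" calculation,
# shaped like the two-year Impact Factor. All data below are hypothetical.
from dataclasses import dataclass

@dataclass
class Article:
    year: int                      # publication year
    substantive: bool              # article/review/proceedings paper (counts in denominator)
    citations_in_metric_year: int  # citations received in the metric (JCR) year

def average_citations_per_paper(articles, metric_year, window=2):
    """Citations in `metric_year` to items published in the previous `window` years,
    divided by the number of substantive papers published in that window."""
    in_window = [a for a in articles if metric_year - window <= a.year <= metric_year - 1]
    numerator = sum(a.citations_in_metric_year for a in in_window)
    denominator = sum(1 for a in in_window if a.substantive)
    return numerator / denominator if denominator else 0.0

# A tiny hypothetical journal: four substantive papers and one editorial.
articles = [
    Article(2012, True, 10), Article(2012, True, 2), Article(2012, False, 5),
    Article(2013, True, 7),  Article(2013, True, 0),
]
print(average_citations_per_paper(articles, metric_year=2014))  # (10 + 2 + 5 + 7 + 0) / 4 = 6.0
```

Roughly speaking, the 5-Year Impact Factor, Immediacy Index, and SNIP described below are variations on this same shape, differing mainly in the publication window used and in how citations are weighted or normalized.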

There are many factors beyond academic quality that can influence the rate at which an article is cited. The purpose of this post is to provide guidance on the various types of citation metrics available, including their background, what they tell us, and, crucially, what they do not tell us about academic behavior. The following factors should be considered before metrics are used in any decision-making process:

  • Discipline. Social science and humanities disciplines in particular tend to cite more slowly, and cite a larger proportion of books (as opposed to journals), compared with scientific disciplines. Citation metrics should not be compared across disciplines unless this is accounted for (e.g. by the SNIP metric, described below).
  • Document type. Review papers tend to attract the most citations; case studies tend to attract the fewest. This is not necessarily a comment on research quality, but on the type of research produced. Usually, ‘non-substantive’ papers, such as meeting abstracts and editorials, are excluded from the denominator of citation metrics.
  • Age of research cited. Older articles have had more time to accumulate citations. If using a metric based on total citation counts, keep in mind that it will be skewed towards older papers, or towards academics who are further along in their careers.
  • The data source. There are many sources of citation information (e.g. Web of Science, Scopus, Google Scholar), and the citation counts for a single article are likely to be highest in the largest database (Google Scholar). Most citation metrics are tied to a single database, but not all are, so it is important to note which data source a metric is based on.

Overview of Key Metrics:

  • 5-Year Impact Factor. Data source: Web of Science. Published in the annual Journal Citation Reports. Average citations in the JCR year to substantive papers (articles, proceedings papers, reviews) published in the previous 5 years.
  • Altmetrics. Metrics based on a broad spectrum of indicators, such as tweets, blog mentions, social bookmarking, etc. For more details see this blog post on Wiley Exchanges: http://exchanges.wiley.com/blog/2013/05/20/article-level-metrics-painting-a-fuller-picture/.
  • Eigenfactor. Data source: Web of Science. Published in the annual Journal Citation Reports. Based on weighted citations in the JCR year (excluding journal self-citations) to papers published within the previous 5 years. Citations are weighted according to the prestige of the citing journal (i.e. citations from top journals ‘mean more’ than citations from lesser journals). The calculation is akin to the PageRank algorithm that Google uses in its rankings; a simplified illustration of this weighting idea appears after this list.
  • Google Scholar Metrics. Data source: Google Scholar. These are ‘rolling’ metrics, i.e. they are based on a continually changing dataset. The main Google Scholar journal metric is the h5-index. This is very similar to the H-index (explained below) but limited to papers published within the past 5 years.
  • H-index. Data source: any. A measure based on article-level citation counts, originally designed to evaluate individual authors but applicable to any set of papers. The H-index is the number of papers, H, that have been cited at least H times each; e.g. an H-index of 15 means that 15 papers have each been cited at least 15 times. The metric does not control for the age of documents or citations and can be calculated from any citation database; caution is advised, as the same group of articles will yield a different H-index in different databases. A short computational sketch appears after this list.
  • Immediacy Index. Data source: Web of Science. Published in the annual Journal Citation Reports. Average citations in the JCR year to substantive papers published in the same year. This is primarily an indication of how rapidly research is cited; journals with a high Immediacy Index usually represent fast-paced research environments.
  • Impact Factor. Data source: Web of Science. Published in the annual Journal Citation Reports. Average citations in the JCR year to substantive papers published in the previous two years.
  • SJR. Data source: Scopus. Published in the SCImago Journal & Country Rank reports. The SCImago Journal Rank (SJR) is based on weighted citations in Year X to papers published in the previous 3 years. Citations are weighted by the ‘prestige’ of the citing journal, so that a citation from a top journal ‘means more’ than a citation from a low-ranked journal. As with the Eigenfactor, the calculation is broadly similar to the Google PageRank algorithm.
  • SNIP. Data source: Scopus. Published twice yearly on CWTS Journal Indicators. The Source Normalized Impact per Paper (SNIP) measures average citations in Year X to papers published in the previous 3 years. Citations are weighted by the ‘citation potential’ of the journal’s subject category, making the metric more comparable across disciplines.
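As a concrete illustration of the H-index definition above, here is a short Python sketch. The citation counts are hypothetical, and, as noted, the same set of papers can give a different value depending on which database supplies the counts.

```python
def h_index(citation_counts):
    """Return the largest H such that at least H papers have been cited at least H times.
    `citation_counts` holds per-paper citation counts (hypothetical data)."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):  # rank 1 = most-cited paper
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Six papers: the 4th-most-cited has 4 citations, the 5th only 2, so the H-index is 4.
print(h_index([22, 9, 4, 4, 2, 0]))  # -> 4
```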
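The ‘prestige weighting’ behind the Eigenfactor and SJR can be pictured as a PageRank-style calculation over the journal citation network: prestige flows from citing journals to cited journals, so a citation from a highly cited journal carries more weight. The sketch below runs a deliberately simplified power iteration over a made-up three-journal citation matrix; the published Eigenfactor and SJR algorithms add further rules (self-citation handling, window lengths, normalization, specific damping parameters) that are not reproduced here.

```python
# Simplified PageRank-style prestige weighting over a hypothetical journal network.
# cites[i][j] = citations FROM journal i TO journal j (invented numbers).
cites = [
    [0, 30, 10],  # journal A cites B and C
    [20, 0, 5],   # journal B cites A and C
    [5, 15, 0],   # journal C cites A and B
]
n = len(cites)
out_totals = [sum(row) for row in cites]

def transition(i, j):
    # Share of journal i's outgoing citations that go to journal j.
    return cites[i][j] / out_totals[i] if out_totals[i] else 1.0 / n

damping = 0.85            # damping factor as in PageRank (assumed value)
prestige = [1.0 / n] * n  # start from a uniform prestige vector

for _ in range(100):      # power iteration until (approximate) convergence
    prestige = [
        (1 - damping) / n + damping * sum(prestige[i] * transition(i, j) for i in range(n))
        for j in range(n)
    ]

# Journals cited more heavily by prestigious journals end up with higher scores.
for name, score in zip("ABC", prestige):
    print(name, round(score, 3))
```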

Useful Links
Altmetrics: http://www.altmetrics.org
CWTS Journal Indicators: http://www.journalindicators.com/indicators
Google Scholar: http://scholar.google.com
Google Scholar Metrics: http://www.google.com/intl/en/scholar/metrics.html#metrics
Journal Citation Reports: http://www.isiknowledge.com/jcr
Scopus: http://www.scopus.com
SCImago Journal & Country Rank: http://scimagojr.com
Web of Science: http://www.isiknowledge.com

1 ©Thomson Reuters (2014) – Web of Science. Includes articles and reviews indexed in the Science Citation Index – Expanded, Social Science Citation Index and Arts & Humanities Citation Index. Retrieved May 6th, 2014, from www.isiknowledge.com.

