This post is the third in a series entitled Show Me the Evidence. It is about the evidence gained from bibliometric data and journal impact factor analysis.
Let’s start with an excerpt from a 2008 article:
“ The assumption that Impact Factor (IF) is a number absolutely proportional to science quality has led to misuses beyond the index’s original scope, even in the opinion of its devisor*. When the IF is inappropriately attributed to all articles within a single journal, it leads to false applications regarding the evaluation of individual scientists or research groups. This is, unfortunately, a common practice, especially among governmental funding boards and academic institutions entitled to judge scientists for positioning and grant allocation. The IF has thus accumulated huge strength and importance, mainly implied by its, at least to a degree, undue application as an index of overall scientific quality“.
Excerpt on page 1 from “The Top-Ten in Journal Impact Factor Manipulation” by ME Falagas and VG Alexiou, published in Arch Immunol Ther Exp (Warsz). 2008 Jul-Aug;56(4):223-6 – All rights reserved – Copyright 2010
This post was sparked by a recent reference question from a retired professor who needed some assistance on how to find and search the Journal Citation Reports® database*, which is described by its producer, Thomson Reuters, in this way:
“ Journal Citation Reports® (JCR) offers a systematic, objective means to critically evaluate the world’s leading journals, with quantifiable, statistical information based on citation data. By compiling articles’ cited references, JCR® helps to measure research influence and impact at the journal and category levels, and shows the relationship between citing and cited journals. “
So let’s take a look at ways to find current evidence about publication patterns in the biomedical literature.
There were two things about JCR® that needed explanation for the professor. First, the latest annual edition of JCR® was released in June 2010 and indexes journal citation data for the 2009 calendar year only (not 2010). Second, only those journal titles indexed in the Web of Science database* are searchable in JCR®.
As one example: Let’s say that you’re a scientist working on stem cell research and you subscribe to ten international journals that are critical to your continuing professional knowledge, lab work and research. It’s a good idea to check the list of 6,600+ journals that are included in the Web of Science database in order to determine if your “best” journals are searchable in Journal Citation Reports®. If those titles are not covered in JCR®, you’ll be missing essential facts for comparing bibliometric data.
Here is a screenshot of a search done on the 2009 JCR® database for journals indexed under Cell and Tissue Engineering:
Image source: Thomson-Reuters – All rights reserved – Copyright 2010
Some folks assume that “every journal in the world” is included in Journal Citation Reports, but that’s not the case. 9,100 journal titles were indexed in the 2009 edition.
Another thing to know is that there are six subsets available for annual subscription from JCR®, and UConn Libraries subscribes to only these two: Science Citation Index Expanded (indexing 7,100 major journals across 150 disciplines) and Social Sciences Citation Index (2,474 journals across 50 social science disciplines).
Below is a screenshot from an online tutorial about ways to search Journal Citations Reports® (with my added comment in the upper left-hand corner):
Next, the professor asked: “I have a manuscript to submit for publication. Is this the only place I should use to look at statistics about specific journal titles? “.
While JCR® is an important reference resource, it’s neither free nor the only one available worldwide for researchers to search. Below are sites which provide evidence that there are other ways to do citation analysis in the year 2010 (some are free, some are via subscription).
SCImago Journal & Country Rank Indicator (SJR). I like this site – it is easy to learn to use. There are many ways to search their datasets (ranked by country, by journal title, by countries grouped by continent, etc.). I also found their Map generators intriguing; they show comparative relationships between discipline- or subject-specific citation patterns.
Below is a screenshot of the SCImago Journal & Country Rank page showing a search done on Year “2008”, “Medicine” as a general category, “Emergency Medicine” as a specialty, and “USA” as the country, with a limit for displaying journals that had a minimum of 12 citable documents over 8 years:
Link here to a 2007 paper written by the creators of SCImago which describes the process by which journals are ranked on their site.
Those who have access to the Scopus database* through their library may have already discovered the Scopus Journal Analyzer, which allows one to select a discipline (shown below as “Biochemistry, Genetics and Molecular Biology”) and a journal title (“Cell” was searched below) and then choose a method of analysis to determine journal impact. Elsevier is the producer of the Scopus database. 15,000 journal titles are indexed for inclusion in Scopus analytics.
A screenshot below shows results of a search performed in Scopus Journal Analyzer recently for the broad topic of Biochemistry, Genetics and Molecular Biology. The journal Cell holds the most-cited place in the list (no surprise there):
The journal analyzer results can also be sorted by two other criteria: SJR and SNIP. I found out that four years of data are necessary for sorting results using these filters. Below, see a different screenshot: rankings by SJR and SNIP for the same subject area:
Explanations for SJR and SNIP were easily found in the Scopus Help section (screenshots shown below):
Credit for all Scopus Images shown above: Elsevier B.V. – All rights reserved – Copyright 2010
Want a different way to search Scopus analytics for evidence? Use the search feature in Journal Analyzer to select and compare up to ten Scopus sources on number of citations, documents, and percentage not cited.
A 12-page PDF white paper (from 2006) is available to download from Scopus, entitled “Using Scopus for Bibliometric Analysis: A Users’ Guide“. Following is an excerpt from that document:
” Introduced in January 2006, the Scopus Citation Tracker enables users to easily evaluate research by using citation data. This tool offers at-a-glance intelligence about the influence of a set of articles, an author or group of authors over time, so users can quickly spot trends using a visual table of citations broken down by article and chronology. “
Text Source: Courtesy of Elsevier B.V. – All rights reserved – Copyright 2010
Two other Scopus pages which I found useful were the Scopus Top Cited page and Scopus Journal Metrics Factsheet.
No use trying to make this post pithy. It would be an error not to mention the following means of assessing the scientific literature: Eigenfactor, h-index and JANE.
University of Washington biology professor Carl Bergstrom and colleagues created the Eigenfactor Project™. The main webpage is http://www.eigenfactor.org.
Give the interactive map a try: click here. Here is an example for Molecular & Cell Biology Map:
Following is an excerpt from a May 2007 article that Dr. Bergstrom wrote for the Association of College and Research Libraries publication, College & Research Library News:
” We can view the Eigenfactor score of a journal as a rough estimate of how often a journal will be used by scholars. The Eigenfactor algorithm corresponds to a simple model of research in which readers follow citations as they move from journal to journal. The algorithm effectively calculates the trajectory of a hypothetical “random researcher” who behaves as follows: Our random researcher begins by going to the library and selecting a journal article at random. After reading the article, she selects at random one of the citations from the article. She then proceeds to the cited work and reads a random article there. She selects a new citation from this article, and follows that citation to her next journal volume. The researcher does this ad infinitum.
” Since we lack the time to carry out this experiment in practice, Eigenfactor uses mathematics to simulate this process.
” Because our random researcher moves among journals according to the citation network that connects them, the frequency with which she visits each journal gives us a measure of that journal’s importance within network of academic citations. Moreover, if real researchers find a sizable fraction of the articles that they read by following citation chains, the amount of time that our random researcher spends with each journal may give us a reasonable estimate of the amount of time that real researchers spend with each journal “.
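The “random researcher” model Dr. Bergstrom describes is, at its core, a random walk on the journal citation network, and the long-run visit frequencies can be approximated by power iteration. Below is a minimal sketch of that idea, using a hypothetical three-journal citation matrix; note this is not Eigenfactor’s actual algorithm, which also adjusts for article counts, excludes recent self-citation windows differently, and adds a “teleportation” term to handle dead ends:

```python
def random_researcher_scores(citations, steps=200):
    """Approximate how often a 'random researcher' visits each journal.

    citations[i][j] = number of citations from journal i to journal j.
    Returns the stationary visit frequencies, found by power iteration.
    """
    n = len(citations)
    # Row-normalize: probability of following a citation from i to j.
    transition = []
    for row in citations:
        total = sum(row)
        transition.append([c / total for c in row])
    # Start with a uniform distribution and repeatedly take one "step".
    p = [1.0 / n] * n
    for _ in range(steps):
        p = [sum(p[i] * transition[i][j] for i in range(n)) for j in range(n)]
    return p

# Hypothetical citation counts among three journals (zero diagonal,
# since self-citations are excluded):
matrix = [[0, 4, 1],   # journal A cites B four times, C once
          [3, 0, 2],
          [1, 1, 0]]
scores = random_researcher_scores(matrix)
```

The journal with the largest score is the one our hypothetical researcher spends the most time reading, which is the intuition behind the Eigenfactor score.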
A slideshow presentation created by Dr. Bergstrom and presented at a conference hosted by Microsoft in 2009 can be viewed here.
Professor Alan Fersht wrote an article in 2009 published in PNAS Vol. 106(17):6883-4 (Apr 28 2009) entitled “The Most Influential Journals: Impact Factor and Eigenfactor” which is available free online on the PubMedCentral website.
Physics professor Jorge E. Hirsch wrote a paper published in 2005 in PNAS entitled “An Index to Quantify an Individual’s Scientific Research Output“, in which he outlined the algorithm known as the Hirsch Index (or h-Index).
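The h-index itself is simple to compute: it is the largest number h such that the author has at least h papers with at least h citations each. Here is a short sketch with hypothetical citation counts (the paper counts below are illustrative, not from any real author record):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    # Sort citation counts in descending order, then find the last rank
    # at which the count still meets or exceeds the rank.
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Seven hypothetical papers with these citation counts:
h_index([25, 8, 5, 3, 3, 1, 0])  # → 3 (three papers have 3+ citations... and a fourth would need 4)
```

So an author’s h-index grows only when both productivity (paper count) and impact (citations per paper) grow together, which is exactly what Hirsch proposed it should measure.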
And – LOL – according to Scopus, that PNAS paper by Dr. Hirsch has been cited 575 times!
JANE (or Journal Author/Name Estimator) is a software tool created in 2007 by members of the Biosemantics Group, a collaborative group at the Medical Informatics department of the Erasmus MC University Medical Center of Rotterdam and the Center for Human and Clinical Genetics of the Leiden University Medical Center. Following is how the creators of JANE describe the purpose of the tool:
“ Have you recently written a paper, but you’re not sure to which journal you should submit it? Or are you an editor, and do you need to find reviewers for a particular paper? JANE can help! Just enter the title and/or abstract of the paper in the box, and click on ‘Find journals’ or ‘Find authors’. JANE will then compare your document to millions of documents in Medline [over 10 years] to find the best matching journals or authors. ” —
M. J. Schuemie and J.A. Kors – two of the creators of JANE – published a paper about the software in the journal Bioinformatics – Vol 24:5 (Mar 1 2008).
* Dr. Eugene Garfield was a co-founder of the Institute for Scientific Information, the producer of Science Citation Index. A professor at the University of Pennsylvania and a prolific author, Dr. Garfield is now 85 years old. Here is a link to his website.
In 1955, he wrote a paper titled “Citation Indexes to Science: A New Dimension in Documentation through Association of Ideas“, published in the journal Science (Vol. 122:108-111). The online version is available to be read at this link.
From looking around on his Library website (url above), I think he has a sense of humor and the soul of an archivist. A great deal of his professional life has been taken up thinking about information management, and the ways in which scientists use their literature. I – and other librarians everywhere – should thank him for being an early adopter!
For example, in a commentary he wrote in 1963 published in the journal Science (Vol. 141:3579 – Aug 2 1963), titled “Citations in Popular and Interpretive Science Writing“, he admonishes mainstream periodical editors for not including basic volume and issue information. Here is a direct quote: ” Librarians and scientists spend hundreds of hours tracking down precise literature citations which are missing in articles published in otherwise reputable publications like Scientific American, the New York Times, or The Sciences-a task that could be eliminated if brief but complete citations were given. This is certainly false economy and annoying “. Garfield… You go!
The text of a presentation he gave at the International Congress on Peer Review And Biomedical Publication (2005) can be read online at “The Agony and the Ecstasy: The History and Meaning of the Journal Impact Factor“.
I performed an author search on PubMed for his publications and created a small group of citations; those search results can be viewed here.
In addition to thanking Dr. Garfield for creating this field of citation analysis, there are many fellow health science or academic librarians whose work has helped me understand this complex subject, or who have made public their own instruction for others to benefit from. These folks deserve recognition (and applause!).
Thanks to UCHC collection management librarian, Arta Dobbs, for her suggestions and explanations of sources and methods of bibliometric analysis.
Thanks to Janice Flahiff and Jolene Miller, librarians at Mulford Health Science, University of Toledo (Ohio) who have written a great fact sheet on the uses and misuses of interpreting journal impact factors.
Props to Kathi Sarli, health science librarian at Bernard Becker Medical Library of Washington University in St. Louis, who wrote a very useful library guide called “Tools for Authors“… check the section-tab for “Preparing for Publication“.
I enjoyed watching an excellent tutorial on Journal Impact Factors produced by librarians at the Ebling Library for the Health Sciences, University of Wisconsin-Madison.
Finally, remember this is all about Publish or Perish.
* Subscription via UCHC Library. If off-campus, use your library proxy number to connect.