

Criticisms of Impact Factor


1.  The system can be gamed.  When the impact factor was first developed, H. J. M. Hanley at the National Bureau of Standards somewhat jokingly suggested that future scientists should “cite yourself as often as possible; insist that your work be cited in all articles that you review; and automatically pass articles that already contain a sufficient number of citations to you.” 

2.  Citations appear in journals for a number of reasons.  A citation may appear because an article is being negatively reviewed, not endorsed.  Citations also appear in journal editorials, a factor known, for example, to have inflated the impact factors of trauma and orthopedic journals. 

3.  Does not reflect all academic output.  Datasets, models, study methodologies, best practices, pedagogy, and code are all valid outputs of academic work, but they are not counted, or are not weighed in the same fashion as articles. 

4.  Interdisciplinary work is harder to measure.  As authors work across disciplines, metrics like the impact factor become harder to apply, since they are usually designed for comparisons within a single discipline. 


Pay to Play - Predatory Publishing

The “tyranny of metrics” has driven the rise of “predatory” publishing. 

  • Journals that will publish an article as long as the author pays the article processing charges. 
  • Journals that fake connections to prominent editors and fabricate impact measurements. 
  • Journals that hijack the titles and ISSNs of retired journals. 

Unfortunately, there is no consensus among scholars on how to identify and classify predatory journals.  Past efforts to create blacklists have run into frequent questions about criteria, as have efforts to create whitelists. 


Efforts at Reform

DORA (Declaration on Research Assessment)

  • Drafted at the 2012 annual meeting of the American Society for Cell Biology
  • Recommendations include:
    • Don’t use journal-based metrics as a measure of the quality of individual articles
    • Account for variation in article types and subject areas
    • Cite primary literature rather than reviews

Leiden Manifesto - 2014

  • 10 principles for research evaluation
  • Proposes that any use of indicators meet three criteria
  • Calls for a governing body to support changing demands on indicators and protect researchers from predatory journals. 


  • Bottom-up open access effort among 15 universities from the UK, US, and Australia, together with Elsevier
  • Attempts to create a standard that can be used to calculate metrics across multiple data sources
  • Open access “recipes” that anyone can adopt
© UAB Libraries | University of Alabama at Birmingham