Ranking the research productivity of library and information science faculty and schools: An evaluation of data sources and research methods

  • Authors:
  • Lokman I. Meho
  • Kristina M. Spurgin

  • Affiliations:
  • School of Library and Information Science, Indiana University, 1320 E. 10th Street, LI 011, Bloomington, IN 47405
  • School of Information and Library Science, University of North Carolina, 201 Manning Hall, CB# 3360, Chapel Hill, NC 27599-3360

  • Venue:
  • Journal of the American Society for Information Science and Technology
  • Year:
  • 2005


Abstract

This study evaluates the data sources and research methods used in earlier studies to rank the research productivity of Library and Information Science (LIS) faculty and schools. In doing so, the study identifies both the tools and methods that generate more accurate publication count rankings and the databases that should be considered when conducting comprehensive literature searches for research and curricular needs. Using a list of 2,625 items published between 1982 and 2002 by 68 faculty members of 18 American Library Association (ALA)-accredited LIS schools, hundreds of databases were searched. Results show that only 10 databases provide significant coverage of the indexed LIS literature. Results also show that restricting the data sources to one, two, or even three databases leads to inaccurate rankings and erroneous conclusions. Because no single database provides comprehensive coverage of the LIS literature, researchers must rely on a wide range of disciplinary and multidisciplinary databases for ranking and other research purposes. The study answers questions such as the following:

  • Is the Association for Library and Information Science Education's (ALISE's) directory of members a reliable tool for identifying a complete list of faculty members at LIS schools?
  • How many, and which, databases are needed in a multifile search to arrive at accurate publication count rankings?
  • What coverage will be achieved using a certain number of databases?
  • Which research areas are well covered by which databases?
  • What alternative methods and tools are available to fill gaps in database coverage?
  • Did the coverage performance of databases change over time?
  • What counting method should be used when determining what and how many items each LIS faculty member and school has published?

The authors recommend advanced analysis of research productivity to provide a more detailed assessment of the research productivity of authors and programs. © 2005 Wiley Periodicals, Inc.
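The choice of counting method raised in the abstract can be made concrete with a small sketch. The abstract does not specify the methods compared, so the two shown here, full counting (each co-author receives one credit per paper) and fractional counting (credit split equally among co-authors), are standard bibliometric conventions used for illustration; the sample author names and paper list are invented.

```python
from collections import defaultdict

def count_publications(papers, fractional=False):
    """Return per-author publication credit.

    papers: list of papers, each a list of author names.
    fractional=False -> full counting (1 credit per author per paper).
    fractional=True  -> fractional counting (1/n credit for n co-authors).
    """
    credit = defaultdict(float)
    for authors in papers:
        share = 1.0 / len(authors) if fractional else 1.0
        for author in authors:
            credit[author] += share
    return dict(credit)

# Invented example data: one single-authored and one co-authored paper.
papers = [
    ["Author A"],
    ["Author A", "Author B"],
]

print(count_publications(papers))                   # full: A=2.0, B=1.0
print(count_publications(papers, fractional=True))  # fractional: A=1.5, B=0.5
```

The two methods can rank the same faculty differently: full counting rewards frequent co-authorship, while fractional counting normalizes for it, which is why the choice matters for productivity rankings.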