Scientometrics is the field of study concerned with measuring and analyzing scholarly literature. It is a sub-field of
informetrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[1] In practice there is a significant overlap between scientometrics and other scientific fields such as
information systems,
information science,
science of science policy,
sociology of science, and
metascience. Critics have argued that over-reliance on scientometrics has created a system of
perverse incentives, producing a
publish or perish environment that leads to low-quality research.
Historical development
Modern scientometrics is mostly based on the work of
Derek J. de Solla Price and
Eugene Garfield. The latter created the
Science Citation Index[1] and founded the
Institute for Scientific Information; the index is heavily used for scientometric analysis. A dedicated academic journal,
Scientometrics, was established in 1978. The industrialization of science increased the volume of publications and research outcomes, and the rise of computers allowed effective analysis of this data.[6] While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications.[1] Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.[7][8]
Later, around the turn of the century, the evaluation and ranking of scientists and institutions came into the spotlight. Based on bibliometric analysis of scientific publications and citations, the
Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by the
Shanghai Jiao Tong University.
Impact factors became an important tool for choosing between journals. Rankings such as the Academic Ranking of World Universities and the
Times Higher Education World University Rankings (THE-ranking) became indicators of the status of universities. The
h-index became an important indicator of the productivity and impact of the work of a scientist. However, alternative
author-level metrics have been proposed.[10][11]
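Concretely, the h-index is the largest value h such that a scientist has h papers that have each been cited at least h times. The following minimal Python sketch implements this definition (the function name and sample citation counts are illustrative only):

    def h_index(citations):
        """Return the largest h such that there are at least h papers
        with at least h citations each."""
        ranked = sorted(citations, reverse=True)  # highest-cited first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4.
    print(h_index([10, 8, 5, 4, 3]))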
Around the same time, government interest in evaluating research for the purpose of assessing the impact of science funding increased. As investments in scientific research were included in the U.S.
American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like
STAR METRICS were set up to assess whether the expected positive impact on the economy would actually occur.[12]
Methods and findings
Methods of research include qualitative, quantitative and computational approaches. The main focus of studies has been on institutional productivity comparisons, institutional research rankings, journal rankings,[7][8][13] establishing faculty productivity and tenure standards,[14] assessing the influence of top scholarly articles,[15] and developing profiles of top authors and institutions in terms of research performance.[16]
One significant finding in the field is a principle of cost escalation: achieving further findings at a given level of importance grows exponentially more costly in effort and resources. However, new algorithmic methods in search,
machine learning and
data mining are showing that this is not the case for many information retrieval and extraction-based problems.[citation needed]
More recent methods rely on
open source and
open data to ensure transparency and reproducibility in line with modern
open science requirements. For instance, the
Unpaywall index and attendant research on
open access trends are based on data retrieved from
OAI-PMH endpoints of thousands of
open archives provided by libraries and institutions worldwide.[17]
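As an illustration of this kind of open-data retrieval, the Python sketch below harvests one page of records from an OAI-PMH endpoint using the protocol's standard ListRecords verb and Dublin Core metadata; the endpoint URL is a placeholder, not a real archive.

    import requests
    import xml.etree.ElementTree as ET

    # Placeholder endpoint; real open archives expose similar OAI-PMH URLs.
    BASE_URL = "https://archive.example.org/oai"
    OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
    DC_NS = "{http://purl.org/dc/elements/1.1/}"

    def harvest_titles(base_url):
        """Fetch one page of records via the OAI-PMH ListRecords verb
        and yield the Dublin Core title of each record."""
        params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
        response = requests.get(base_url, params=params, timeout=30)
        response.raise_for_status()
        root = ET.fromstring(response.content)
        for record in root.iter(f"{OAI_NS}record"):
            title = record.find(f".//{DC_NS}title")
            if title is not None:
                yield title.text

    for t in harvest_titles(BASE_URL):
        print(t)

A full harvester would also follow the resumptionToken returned by the protocol to page through the complete archive.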
Recommendations to avoid common errors in scientometrics include: selecting topics with sufficient data; using data mining and web scraping; combining methods; and eliminating "false positives".[18][19] It is also necessary to understand the limits of search engines (e.g. Web of Science, Scopus and Google Scholar), which fail to index thousands of studies published in small journals or in underdeveloped countries.[20]
Impact factor
The impact factor (IF) or journal impact factor (JIF) of an
academic journal is a measure reflecting the yearly average number of
citations to recent articles published in that journal. It is frequently used as a
proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by
Eugene Garfield, the founder of the
Institute for Scientific Information (ISI).
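Although the text above does not spell out the formula, the standard two-year impact factor for a year y is computed as

    \mathrm{IF}_y = \frac{C_y}{P_{y-1} + P_{y-2}}

where C_y is the number of citations received in year y to items the journal published in the two preceding years, and P_{y-1} and P_{y-2} are the numbers of citable items published in those years. For example, a journal whose 2022 and 2023 articles were cited 200 times in 2024, out of 100 citable items, would have a 2024 impact factor of 2.0.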
Science Citation Index
The Science Citation Index (SCI) is a
citation index originally produced by the
Institute for Scientific Information (ISI) and created by
Eugene Garfield. It was officially launched in 1964. It is now owned by
Clarivate Analytics (previously the Intellectual Property and Science business of
Thomson Reuters).[21][22][23][24] The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant
journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of
science and
technology, because of a rigorous selection process.[25][26][27]
Acknowledgment index
An acknowledgment index (British English: acknowledgement index)[28] is a method for
indexing and analyzing acknowledgments in the
scientific literature and, thus, quantifies the impact of
acknowledgments. Typically, a scholarly article has a section in which the authors acknowledge entities such as funding sources, technical staff, and colleagues that have contributed materials or knowledge or have influenced or inspired their work. Like a
citation index, it measures influences on scientific work, but in a different sense; it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts.
Unlike the impact factor, it does not produce a single overall metric, but analyzes the components separately. However, the total number of acknowledgments to an acknowledged entity can be measured, and so can the number of citations to the papers in which the acknowledgment appears. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity.[29][30]
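Written as a formula (the notation is mine, not taken from the cited sources), the impact of an acknowledged entity e is

    \mathrm{impact}(e) = \frac{\sum_{p \in A(e)} c(p)}{|A(e)|}

where A(e) is the set of papers acknowledging e and c(p) is the number of citations received by paper p.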
Altmetrics
In scholarly and scientific publishing, altmetrics are non-traditional
bibliometrics[31] proposed as an alternative[32] or complement[33] to more traditional
citation impact metrics, such as
impact factor and
h-index.[34] The term altmetrics was proposed in 2010,[35] as a generalization of
article level metrics,[36] and has its roots in the #altmetrics
hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover
citation counts,[37] but instead calculate scholarly impact based on diverse online research outputs, such as social media, online news media, online reference managers and so on.[38][39] They demonstrate both the impact and the detailed composition of the impact.[35] Altmetrics could be applied as a research filter,[35] in promotion and tenure dossiers, in grant applications,[40][41] and for ranking newly published articles in
academic search engines.[42]
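As a concrete example of gathering such data through a public API, the Python sketch below queries Altmetric.com's public v1 DOI endpoint; the DOI and the response fields shown are illustrative, and other providers, such as Crossref Event Data, expose similar interfaces.

    import requests

    def altmetric_summary(doi):
        """Query Altmetric.com's public v1 API for a DOI and return the
        JSON attention record, or None if the DOI is not tracked."""
        url = f"https://api.altmetric.com/v1/doi/{doi}"
        response = requests.get(url, timeout=30)
        if response.status_code == 404:  # DOI not tracked by Altmetric
            return None
        response.raise_for_status()
        return response.json()

    # Example DOI (illustrative only).
    record = altmetric_summary("10.1038/nature12373")
    if record:
        print(record.get("title"), record.get("score"))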
Criticisms
Critics have argued that over-reliance on scientometrics has created a system of
perverse incentives, producing a
publish or perish environment that leads to low-quality research.[43]
References
^ a b c Leydesdorff, L.; Milojevic, S. (2013). "Scientometrics". arXiv:1208.4566. Forthcoming in: Lynch, M. (ed.), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015).
^ Nalimov, Vasily Vasilyevich; Mulchenko, B. M. (1969). "Scientometrics." Studies of Science as a Process of Information. Moscow, Russia: Science.
^ Lowry, Paul Benjamin; Humphreys, Sean; Malwitz, Jason; Nix, Joshua C. (2007). "A scientometric study of the perceived quality of business and technical communication journals". IEEE Transactions on Professional Communication. 50 (4): 352–378. doi:10.1109/TPC.2007.908733. S2CID 40366182. SSRN 1021608. Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.
^ Dean, Douglas L.; Lowry, Paul Benjamin; Humpherys, Sean (2011). "Profiling the research productivity of tenured information systems faculty at U.S. institutions". MIS Quarterly. 35 (1): 1–15. doi:10.2307/23043486. JSTOR 23043486. SSRN 1562263.
^ Piwowar, Heather; Priem, Jason; Orr, Richard (2019-10-09). "The Future of OA: A large-scale analysis projecting Open Access publication and readership". bioRxiv 10.1101/795310.
^ Han, Jiawei; Kamber, Micheline; Pei, Jian (2012). Data Mining: Concepts and Techniques. Waltham, MA, USA: Morgan Kaufmann.
^ Councill, Isaac G.; Giles, C. Lee; Han, Hui; Manavoglu, Eren (2005). "Automatic acknowledgement indexing: expanding the semantics of contribution in the CiteSeer digital library". Proceedings of the 3rd International Conference on Knowledge Capture. K-CAP '05. pp. 19–26. CiteSeerX 10.1.1.59.1661. doi:10.1145/1088622.1088627. ISBN 1-59593-163-5.
^ Haustein, Stefanie; Peters, Isabella; Sugimoto, Cassidy R.; Thelwall, Mike; Larivière, Vincent (2014-04-01). "Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature". Journal of the Association for Information Science and Technology. 65 (4): 656–669. arXiv:1308.1838. doi:10.1002/asi.23101. ISSN 2330-1643. S2CID 11113356.
^ Nariani, Rajiv (2017-03-24). "Supplementing Traditional Ways of Measuring Scholarly Impact: The Altmetrics Way". ACRL 2017 Conference Proceedings. hdl:10315/33652.
^ Mehrazar, Maryam; Kling, Christoph Carl; Lemke, Steffen; Mazarakis, Athanasios; Peters, Isabella (2018-04-08). "Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media". Proceedings of the 10th ACM Conference on Web Science. p. 215. arXiv:1804.02751. doi:10.1145/3201064.3201101. ISBN 978-1-4503-5563-6.