A critique of ‘quick and dirty’ bibliometrics: issues for assessing complexity in bibliometric analysis


Rafael Repiso (https://orcid.org/0000-0002-2803-7505)
Álvaro Cabezas-Clavijo (https://orcid.org/0000-0001-9641-8855)

Abstract

This paper denounces the indiscriminate rise of low-quality bibliometric studies, which have proliferated across scientific journals in a wide range of disciplines by exploiting the accessibility of databases such as Web of Science and Scopus and the availability of automated tools. This trend, driven by a lack of specialised knowledge among journal reviewers and by the desire to increase academic output rapidly, has led to the publication of methodologically weak studies characterised by limited conceptual development and low analytical value. In response, the authors advocate for bibliometrics as a highly specialised field that requires theoretical understanding, methodological expertise, and critical interpretation of data. The paper proposes a methodological framework for assessing the complexity of bibliometric studies along six dimensions: the size of the population under study, the origin and source of the data, the mode of data collection, the degree of data normalisation, the types of analysis employed, and the analytical and visualisation tools used. This framework aims to give editors and researchers a means of identifying substantive research and distinguishing it from studies produced with minimal effort, limited judgment, and weak theoretical grounding.
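The six dimensions read like an assessment rubric. As a purely illustrative sketch (the paper publishes no code, and every field name, the three-level scale, and the unweighted scoring below are hypothetical assumptions, not the authors' instrument), they could be encoded as follows:

```python
from dataclasses import dataclass, fields

# Hypothetical three-level scale per dimension; the paper's actual
# framework may grade or weight the dimensions differently.
LOW, MEDIUM, HIGH = 1, 2, 3

@dataclass
class BibliometricComplexity:
    """Sketch of the six assessment dimensions named in the abstract."""
    population_size: int       # size of the population under study
    data_origin: int           # origin and source of the data
    collection_mode: int       # mode of data collection
    normalisation_degree: int  # degree of data normalisation
    analysis_types: int        # types of analysis employed
    tooling: int               # analytical and visualisation tools used

    def total(self) -> int:
        """Unweighted sum across the six dimensions (illustrative only)."""
        return sum(getattr(self, f.name) for f in fields(self))

# Example: a 'quick and dirty' study scores LOW on most dimensions.
study = BibliometricComplexity(LOW, LOW, LOW, LOW, MEDIUM, LOW)
print(study.total())  # -> 7 of a possible 18
```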

Article Details

Section: Monográfico

How to Cite

Repiso, R., & Cabezas-Clavijo, Á. (2025). A critique of ‘quick and dirty’ bibliometrics: issues for assessing complexity in bibliometric analysis. Revista Panamericana de Comunicación, 7(1). https://doi.org/10.21555/rpc.v7i1.3419

