Foreign experience in evaluating the efficiency of scientific and pedagogical research
DOI: https://doi.org/10.31812/educdim.v56i4.4435
Keywords: efficiency of research, scientometric indicators, scientometric databases, altmetrics, reproducibility of research, digital library systems, repositories
Abstract
The article analyses foreign experience in determining criteria for assessing the effectiveness of scientific and pedagogical research. It considers the problems of qualified analysis of the source base, the reproducibility of research, the use of formal scientometric indicators, altmetric approaches, and digital libraries. The authors conclude that, to ensure the reliability of results and, accordingly, achieve higher evaluation indicators of research performance, peer-reviewed publications indexed in scientometric databases should be used as primary sources. At the same time, assigning a digital identifier (first of all, a DOI), providing open access to research data, and depositing copies in digital institutional repositories also contribute to higher performance indicators of scientific and pedagogical research. Despite the introduction of various new scientometric indicators, the most widely used remains citation counting, notably the h-index. Altmetric indicators, in turn, draw their data from social networks, where published results are disseminated faster. However, many scientists doubt whether there is a direct relationship between publication impact measured by classical and by altmetric approaches, chiefly because authors can artificially inflate some impact indicators.
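Since the h-index figures prominently in the abstract, a brief definition may help: an author's h-index is the largest number h such that at least h of their publications have each received at least h citations. The minimal Python sketch below (illustrative only, not code from the article) computes it from a list of citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # this paper still supports an h-index of `rank`
        else:
            break      # every remaining paper is cited even less
    return h

# Example: five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4,
# because four papers have at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```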
License
Copyright (c) 2021 Vasyl P. Oleksiuk, Svitlana M. Ivanova, Iryna S. Mintii
This work is licensed under a Creative Commons Attribution 4.0 International License.
Accepted 2021-06-20
Published 2021-06-22