
A Bibliometric Approach for Department-Level Disciplinary Analysis and Science Mapping of Research Output Using Multiple Classification Schemes

  • Published : 2019.07.22

Abstract

This study describes an approach for comparative bibliometric analysis of scientific publications related to (i) individual or several departments comprising a university, and (ii) broader integrated subject areas using multiple disciplinary schemes. It uses a custom dataset of scientific publications (ca. 15,000 articles and reviews published during 2009-2013 and recorded in the Web of Science Core Collection) with author affiliations to the science, technology, engineering, mathematics, and medicine (STEMM) research departments of a comprehensive university. The dataset was first subjected to department-level and discipline-level analyses using the newly available KAKEN-L3 classification (based on the MEXT/JSPS Grants-in-Aid system), hierarchical clustering and correspondence analysis to decipher the major departmental and disciplinary clusters, and visualization of department-discipline relationships using two-dimensional stacked bar diagrams. The next step involved creating subsets covering integrated subject areas and comparing departmental contributions to a specific area (medical, health and life science) using several disciplinary schemes: Essential Science Indicators (ESI) 22 research fields, SCOPUS 27 subject areas, OECD Frascati 38 subordinate research fields, and KAKEN-L3 66 subject categories. To illustrate the effective use of science mapping techniques, the same medical, health and life science subset was subjected to network analyses of keyword co-occurrences, bibliographic coupling of publication sources, and co-citation of sources in the reference lists. The science mapping approach demonstrates ways to extract information on prolific research themes, the journals most frequently used for publishing research findings, and the knowledge base underlying the research activities covered by the publications concerned.
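As an illustration of the department-discipline analysis outlined in the abstract, the following is a minimal Python sketch (not the authors' actual workflow) that applies hierarchical clustering and correspondence analysis to a department-by-discipline publication-count table. The department names, KAKEN-L3-style category labels, and counts below are invented placeholders; in practice the table would be built from Web of Science records mapped to the chosen classification scheme.

```python
# Minimal sketch: clustering and correspondence analysis of a hypothetical
# department x discipline contingency table (counts are placeholders).
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

counts = pd.DataFrame(
    [[120, 15, 3], [80, 40, 10], [5, 60, 95], [2, 30, 110]],
    index=["Medicine", "Dental Medicine", "Engineering", "Information Science"],
    columns=["Clinical medicine", "Biomaterials", "Informatics"],
)

# Hierarchical clustering of departments by their disciplinary profiles
profiles = counts.div(counts.sum(axis=1), axis=0)            # row profiles
Z = linkage(pdist(profiles, metric="cosine"), method="average")
clusters = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(counts.index, clusters)))

# Correspondence analysis via SVD of the standardized residual matrix
P = counts.to_numpy(dtype=float) / counts.to_numpy().sum()
r, c = P.sum(axis=1), P.sum(axis=0)                          # row/column masses
S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = np.diag(r ** -0.5) @ U * sv                     # department coordinates
col_coords = np.diag(c ** -0.5) @ Vt.T * sv                  # discipline coordinates
print(pd.DataFrame(row_coords[:, :2], index=counts.index, columns=["Dim1", "Dim2"]))
```

Plotting the first two dimensions of the row and column coordinates on one biplot would show which departments sit close to which disciplinary categories, analogous to the department-discipline relationships examined in the study.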


References

  1. Bartol, T., Budimir, G., Juznic, P., & Stopar, K. (2016). Mapping and classification of agriculture in Web of Science: other subject categories and research fields may benefit. Scientometrics, 109(2), 979-996. doi:10.1007/s11192-016-2071-6
  2. Borner, K., & Polley, D. E. (2014). Replicable Science of Science Studies. In: Ding Y., Rousseau R., Wolfram D. (eds) Measuring Scholarly Impact, Springer, Cham, 321-341. doi:10.1007/978-3-319-10377-8_14
  3. Clarivate Analytics (2015a). https://clarivate.com/products/ (access to Web of Science and InCites under licensed subscription only)
  4. Clarivate Analytics (2015b). Master Journal List. http://ip-science.thomsonreuters.com/mjl/
  5. Clausen, S-E. (1998). Applied Correspondence Analysis: An Introduction, SAGE Publications.
  6. De Groote, S. L., & Raszewski, R. (2012). Coverage of Google Scholar, Scopus, and Web of Science: A case study of the h-index in nursing. Nursing Outlook, 60(6), 391-400. doi:10.1016/j.outlook.2012.04.007
  7. de Winter, J. C. F., Zadpoor, A. A., & Dodou, D. (2014). The expansion of Google Scholar versus Web of Science: a longitudinal study. Scientometrics, 98(2), 1547-1565. doi:10.1007/s11192-013-1089-2
  8. Franceschet, M. (2010). A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics, 83(1), 243-258. doi:10.1007/s11192-009-0021-2
  9. Gavel, Y., & Iselid, L. (2008). Web of Science and Scopus: a journal title overlap study. Online Information Review, 32(1), 8-21. doi:10.1108/14684520810865958
  10. Gautam, P., Kodama, K., & Enomoto, K. (2014). Joint bibliometric analysis of patents and scholarly publications from cross-disciplinary projects: implications for development of evaluative metrics. Journal of Contemporary Eastern Asia, 13(1), 19-37. doi:10.17477/jcea.2014.13.1.019
  11. Gautam, P. (2015). Deciphering the Department-Discipline Relationships within a University through Bibliometric Analysis of Publications Aided with Multivariate Techniques. 2015 IIAI 4th International Congress on Advanced Applied Informatics, Okayama, Japan, 468-471. doi:10.1109/IIAI-AAI.2015.212
  12. Gautam, P. (2016). Comparative Analysis of Scientific Publications of Research Entities Using Multiple Disciplinary Classifications. 2016 IIAI 5th International Congress on Advanced Applied Informatics, Kumamoto, Japan, 523-528. doi:10.1109/IIAI-AAI.2016.117
  13. Gautam, P. (2017). Scientific Publications and World University Rankings: Focus on Bibliometric Indicators at Institution (Hokkaido University) and Department (Dental Medicine) Levels. Hokkaido Journal of Dental Science, 38, 2-15.
  14. Glanzel, W. (2003). A Course on Theory and Application of Bibliometric Indicators, Course Handouts. https://www.researchgate.net/publication/242406991 (accessed on 2016/8/18).
  15. Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787-804. doi:10.1007/s11192-015-1798-9
  16. Jalali, S. M. J., & Park, H. W. (2018). State of the art in business analytics: themes and collaborations. Quality & Quantity, 52(2), 627-633. doi:10.1007/s11135-017-0522-7
  17. JSPS (2015). Application Procedures for Grants-in-Aid for Scientific Research (KAKENHI). http://www.jsps.go.jp/j-grantsinaid/22_startup_support/data/27/h27_kensta_koubo_e.pdf.
  18. Lee, Y.-G. (2013). Multidisciplinary team research as an innovation engine in knowledge-based transition economies and implication for Asian countries. Journal of Contemporary Eastern Asia, 12(1), 49-63. doi:10.17477/jcea.2013.12.1.049
  19. Leydesdorff, L. (2013). An evaluation of impacts in "Nanoscience & nanotechnology": steps towards standards for citation analysis. Scientometrics, 94(1), 35-55. doi:10.1007/s11192-012-0750-5
  20. Lopez-Illescas, C., de Moya-Anegon, F., & Moed, H. F. (2008). Coverage and citation impact of oncological journals in the Web of Science and Scopus. Journal of Informetrics, 2(4), 304-316. doi:10.1016/j.joi.2008.08.001
  21. Meho, L. I., & Rogers, Y. (2008). Citation counting, citation ranking, and h-index of human-computer interaction researchers: A comparison of Scopus and Web of Science. Journal of the American Society for Information Science and Technology, 59(11), 1711-1726. doi:10.1002/asi.20874
  22. Minasny, B., Hartemink, A. E., McBratney, A., & Jang, H. J. (2013). Citations and the h index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar. Peerj, 1, 16. doi:10.7717/peerj.183
  23. Mingers, J., & Lipitakis, E. (2010). Counting the citations: a comparison of Web of Science and Google Scholar in the field of business and management. Scientometrics, 85(2), 613-625. doi:10.1007/s11192-010-0270-0
  24. Mirkin, B. (2011). Core Concepts in Data Analysis: Summarization, Correlation and Visualization, Springer.
  25. Moed, H. F. (2010). Citation Analysis in Research Evaluation, Dordrecht: Springer.
  26. Moed, H. F. (2017). A critical comparative analysis of five world university rankings. Scientometrics, 110(2), 967-990. doi:10.1007/s11192-016-2212-y
  27. Moed, H. F., Markusova, V., & Akoev, M. (2018). Trends in Russian research output indexed in Scopus and Web of Science. Scientometrics, 116(2), 1153-1180. doi:10.1007/s11192-018-2769-8
  28. Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: a comparative analysis. Scientometrics, 106(1), 213-228. doi:10.1007/s11192-015-1765-5
  29. Nederhof, A. J., Meijer, R. F., Moed, H. F., & van Raan, A. F. J. (1993). Research performance indicators for university departments: A study of an agricultural university. Scientometrics, 27(2), 157-178. doi:10.1007/bf02016548
  30. Prathap, G. (2013). Benchmarking research performance of the IITs using Web of Science and Scopus bibliometric databases. Current Science, 105(8), 1134-1138.
  31. Prins, A. A. M., Costas, R., van Leeuwen, T. N., & Wouters, P. F. (2016). Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data. Research Evaluation, 25(3), 264-270. doi:10.1093/reseval/rvv049
  32. Rafols, I. (2014). Knowledge Integration and Diffusion: Measures and Mapping of Diversity and Coherence. In: Ding Y., Rousseau R., Wolfram D. (eds) Measuring Scholarly Impact, Springer, Cham, 169-190. doi:10.1007/978-3-319-10377-8_8
  33. Santa, S., & Herrero-Solana, V. (2010). Scientific production in Latin America and the Caribbean: an approach using the data from Scopus, 1996-2007. Revista Interamericana de Bibliotecología, 33(2), 379-400.
  34. Thelwall, M. (2018). Dimensions: A competitor to Scopus and the Web of Science? Journal of Informetrics, 12(2), 430-435. doi:10.1016/j.joi.2018.03.006
  35. van Eck, N. J., & Waltman, L. (2010). Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics, 84(2), 523-538. doi:10.1007/s11192-009-0146-3
  36. van Eck, N. J., & Waltman, L. (2014). Visualizing Bibliometric Networks. In: Ding Y., Rousseau R., Wolfram D. (eds) Measuring Scholarly Impact, Springer, Cham, 285-320. doi:10.1007/978-3-319-10377-8_13
  37. Vieira, E. S., & Gomes, J. (2009). A comparison of Scopus and Web of Science for a typical university. Scientometrics, 81(2), 587-600. doi:10.1007/s11192-009-2178-0
  38. Wagner, C. S., Roessner, J. D., Bobb, K., Klein, J. T., Boyack, K. W., Keyton, J., . . . Borner, K. (2011). Approaches to understanding and measuring interdisciplinary scientific research (IDR): A review of the literature. Journal of Informetrics, 5(1), 14-26. doi:10.1016/j.joi.2010.06.004
  39. Williams, R., & Bornmann, L. (2014). The Substantive and Practical Significance of Citation Impact Differences Between Institutions: Guidelines for the Analysis of Percentiles Using Effect Sizes and Confidence Intervals. In: Ding Y., Rousseau R., Wolfram D. (eds) Measuring Scholarly Impact, Springer, Cham, 259-281. doi:10.1007/978-3-319-10377-8_12
  40. Zupic, I., & Cater, T. (2015). Bibliometric Methods in Management and Organization. Organizational Research Methods, 18(3), 429-472. doi:10.1177/1094428114562629

Cited by

  1. Investigating the applications of artificial intelligence in cyber security vol.121, pp.2, 2019, https://doi.org/10.1007/s11192-019-03222-9