
Publication Metrics and Subject Categories of Biomechanics Journals

  • Duane Victor Knudson (Department of Health and Human Performance, College of Education, Texas State University)
  • Received: 2023.03.29
  • Accepted: 2023.06.19
  • Published: 2023.12.30

ABSTRACT

Research in interdisciplinary fields like biomechanics is published in a variety of journals whose visibility depends on bibliometric indexing, which is often driven by citation analysis within bibliometric databases. This study documented variation in publication metrics and research subject categories assigned to 14 biomechanics journals. Author counts, citations, and citation rates (CR) were collected for the top 15 cited articles in each journal, retrieved from the Google Scholar service. Research subject categories assigned to the journals were also extracted from three databases (Dimensions, Journal Citation Reports, and Scopus). Despite their shared focus on biomechanics, these journals had widely varying CR and subject categories assigned to them. There were significant (p=0.001) and meaningful (77-108%) differences in median CR between the average, low, and high CR groups of these journals. Since CR are the primary data used to calculate most journal metrics and there is no single biomechanics subject category, field normalization of journal citation metrics in biomechanics is difficult. Care must be taken to accurately interpret most citation metrics of biomechanics journals as biased proxies of the general usage of research, given a specific database, time frame, and area of biomechanics research.

KEYWORDS

1. INTRODUCTION

Biomechanics is a specialized scientific field integrating biology and physics. These two parent disciplines have a long history, while biomechanics is a relatively recently recognized term and field, with the Journal of Biomechanics first appearing in 1968. The utility of biomechanics in understanding performance and injury in a diversity of living things and movements has resulted in the establishment of additional biomechanics journals and biomechanics research reports appearing in a wide variety of multi-disciplinary or other-field journals. In addition, no scientific field is completely isolated, and so large electronic databases have increased across-field visibility of biomechanics research and citations. Searching for knowledge, therefore, relies on searching numerous electronic bibliometric database services (e.g., CINAHL, Dimensions, Engineering Source, Google Scholar [GS], Medline, Scopus, Web of Science [WoS]) for peer-reviewed journal articles (Bar-Ilan, 2018; De Groote & Raszewski, 2012; Lascar & Barnett, 2009; Martín-Martín et al., 2021; Meho & Yang, 2007; Ramos-Remus et al., 1994).

The visibility of biomechanics research depends on the indexing of journal articles in databases and the accuracy of the associated search engines (Elkins et al., 2010; Gusenbauer & Haddaway, 2020; Pranckutė, 2021) and promotion of ranked records (Delgado-López-Cózar & Cabezas-Clavijo, 2013; Yamato et al., 2018). Electronic indexing of journals and the articles they publish has been increasingly driven by analysis of citations and citation networks (Garfield, 2006). Journal citation metrics, however, must be field normalized because of large variation in citation patterns across diverse scholarly disciplines (Declaration on Research Assessment [DORA], 2015; Hicks et al., 2015; Seglen, 1997; Waltman, 2016). While field normalization of citation measures is an established principle of bibliometrics/scientometrics/informetrics, operationalization of the most appropriate field reference values remains an unsettled issue (Haunschild et al., 2022; Haunschild & Bornmann, 2022; Leydesdorff & Bornmann, 2016). This present article will use the term bibliometrics to refer to all three of the science of knowledge fields noted that focus on these important issues. The limited bibliometric research on biomechanics journals warrants further examination to understand the visibility and interpretation of citation-based metrics of different areas of biomechanics research.

Field normalization is usually based on the assignment of research subject categories to journals indexed by databases, so that journal citation metrics in a field can be scaled to typical values of that field and timeframe. There is considerable variation, however, in how categories are assigned (Bornmann & Marx, 2015; Haunschild et al., 2022; Waltman & van Eck, 2019) and the number (12 to thousands) of subject categories created. There is also variation in results based on the time frame of interest (research front vs. long-term research taxonomies) and computational power (Klavans & Boyack, 2017), article types (Haunschild & Bornmann, 2022), and index effects (Leydesdorff & Bornmann, 2016). Unfortunately, “biomechanics” does not appear as a subject category in major electronic databases and biomechanics has only recently been added to the Classification of Instructional Program codes by the National Center for Education Statistics (U.S. Department of Education’s National Center for Education Statistics, 2020). There is little known about bibliometrics of biomechanics journals or subject categories assigned to them by bibliometric databases (Zadpoor & Nikooyan, 2011).

1.1. Bibliometrics of Biomechanics

Knudson and Chow (2008) reported that perceptions of the quality or impact of 62 journals publishing biomechanics research varied with the research interests of American Society of Biomechanics (ASB) members. A study of publication metrics and the Journal Citation Reports (JCR) impact factor (IF) of fourteen biomechanics journals over eight years indicated differences in IF between seven JCR subject categories, and a slower increase in IF for biomechanics journals compared to other biomedical journals (Zadpoor & Nikooyan, 2011). In that study the biomechanics journals had a mean IF (1.2) similar to that of sport sciences journals, but lower than typical biomedical engineering/biomaterials journals (2-3) and somewhat higher than mechanical engineering/robotics journals (0.6-1.2). Similar mean inflation (0-22% annually) and variation (coefficient of variation [CV]=12-21%) of the IF and four other citation metrics was recently confirmed for 14 biomechanics journals (Knudson & Quimby, 2023). Given the well-known differences in citation patterns between the sciences, supported by the perceptions of journals by ASB members (Knudson & Chow, 2008) and by JCR subject categories (Zadpoor & Nikooyan, 2011), additional study of likely differences in citation patterns within biomechanics journals was needed. Assignment of numerous subject categories and overlap across different disciplines are considered evidence of the interdisciplinarity of journals (Morillo et al., 2001).

Field normalization based on citation rates for an interdisciplinary field like biomechanics may be an even greater challenge, given that research may be published in biological, natural, medical, or many applied science journals. The ASB currently asks members to select from five primary discipline categories: biological sciences, engineering and applied science, exercise and sports science, ergonomics and human factors, and health sciences. This diversity creates challenges for journal visibility and field normalization because citation rates vary widely across the numerous relevant subject categories. For example, a study of the similar interdisciplinary field of kinesiology reported substantial differences in subject categories and citation patterns in JCR for 100 kinesiology-related journals (Knudson, 2022a).

1.2. Objectives

This current study documented variation in publication metrics of top-cited articles between different biomechanics journals. A secondary purpose was to document the variation of research subject categories assigned to these journals by databases and to determine if there were differences in citation metrics across categories.

1.3. Hypotheses

It was hypothesized that there would be differences in citation and publication metrics of top-cited articles across biomechanics journals associated with different subject categories assigned by bibliometric databases. These data will be important in potential field normalization of biomechanics research and in understanding how databases categorize research in the field.

2. METHODOLOGY

2.1. Journals and Subject Categories

Fourteen journals publishing primarily biomechanics research in the English language were selected for this study (Table 1). This systematic sample was based on previous studies (Knudson & Quimby, 2023; Zadpoor & Nikooyan, 2011), bibliometric indexing, and Internet visibility, and strove to include both long-standing (Journal of Biomechanics: 1968-) and more recently established biomechanics-focused journals (International Biomechanics: 2014-). Ergonomics/human factors journals related to biomechanics were excluded from the sample because of the large number of highly-cited articles in these journals focusing on psychology rather than biomechanics. Focusing on 14 biomechanics-specific journals, as in previous research (Knudson & Quimby, 2023; Zadpoor & Nikooyan, 2011), avoided the different citation patterns and subject categories seen in studies of large samples of multidisciplinary journals or journals that only occasionally publish biomechanics-related research (Knudson & Chow, 2008).

Table 1. Median citation rate (CR) and age of top cited articles between three groups of biomechanics journals


Age=2023-year of publication (yr) and CR=Citations/Age (C/yr).

The consistency of research subject category classification of these journals was examined by extracting the second-level subject categories assigned by three bibliometric databases: Dimensions, JCR, and Scopus (Table 2). Journals were searched using the “Source Title” function in the free version (https://www.dimensions.ai/products/free) of the Dimensions database, and the “Research Categories” assigned by the database and the number of articles indexed were recorded. A university subscription to JCR was searched for the subject “Category” associated with the journal titles, and Scopus second-level categories under “Subject Area and Category” were identified by searching journal titles in SCImago Journal & Country Rank (https://www.scimagojr.com), which uses Scopus data.

Table 2. Second level research subject categories assigned to 14 biomechanics journals by three databases


Numerals indicate the number of times each subject category was assigned to the 14 biomechanics journals in the study. Only 12 of the journals were indexed in Journal Citation Reports and were able to receive a subject category classification from that database.

2.2. Database and Measures

The GS service was selected for journal publication metrics in this study given its superior coverage of peer-reviewed journal articles in all fields of science compared to subscription databases (Delgado-López-Cózar & Cabezas-Clavijo, 2013; Halevi et al., 2017; Harzing & Alakangas, 2016; Martín-Martín et al., 2018; 2021; Meho & Yang, 2007). This avoids biases in the indexing/coverage of the two major subscription databases (WoS and Scopus) (Pranckutė, 2021); however, it requires extra time for manual review and cleaning of records returned from searches. The high (r=0.78-0.99) associations of citations and subsequent citation metrics for journal articles and authors between GS, WoS, and Scopus (De Groote & Raszewski, 2012; Franceschet, 2010; Knudson, 2015a; 2022b; 2023; Martín-Martín et al., 2018; Renjith & Pradeepkumar, 2021) also allow for the conceptual replication and extension of initial research on variation in citation metrics across subject or topical interest areas within biomechanics (Knudson & Chow, 2008; Knudson & Quimby, 2023; Zadpoor & Nikooyan, 2011).

The publication metrics examined in this study focused on areas where previous bibliometric studies have reported disciplinary differences: authorship, citations, and field speed/time. The names and number of authors of the top 15 cited articles were recorded, and total GS citations (C) and the year of publication of each article (Year) were collected. Two additional variables were calculated: article age (Age=2023-Year) and citation rate (CR=C/Age). Article citation rates determine most journal metrics (e.g., CiteScore, IF), and Age was available for all journals, unlike the JCR Cited Half-Life. CR also avoids the well-known biases of cites-per-citable-articles ratios like the journal IF, which are weakly correlated with the citations of the individual articles a journal publishes (Abramo et al., 2023; Seglen, 1997; Zhang et al., 2017). The publication history of each journal (History=2023-Year of first issue) was also recorded.
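The derived variables Age and CR can be computed directly from each article record. A minimal sketch, using hypothetical article data (the titles, years, and citation counts below are illustrative, not data from the study):

```python
# Compute Age (years since publication) and CR (citations per year)
# for article records; 2023 is the reference year used in the study.
REFERENCE_YEAR = 2023

def add_derived_metrics(articles):
    """Add Age=2023-Year and CR=C/Age to each article record."""
    for article in articles:
        article["age"] = REFERENCE_YEAR - article["year"]
        article["cr"] = article["citations"] / article["age"]
    return articles

# Hypothetical top-cited article records
articles = [
    {"title": "Article A", "year": 2005, "citations": 900},
    {"title": "Article B", "year": 2018, "citations": 250},
]
for a in add_derived_metrics(articles):
    print(f'{a["title"]}: Age={a["age"]} yr, CR={a["cr"]:.1f} C/yr')
```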

The “Return articles published in” tool of the GS advanced search option was used for each journal title and its variations. Numerous searches were conducted, and data were extracted from the top 15 cited articles of each journal. The page rank algorithm of GS returns up to 1,000 records for any search, with articles generally listed in descending order of total citations. The investigator reviewed the top 50 returned records to ensure the top 15 records by total citations were obtained. Searches were completed by February 26, 2023.

This study used the top 15 cited articles for each journal following the standard bibliometric practice of focusing on the top percentiles (e.g., 5 or 10%) of citation metrics (Bornmann & Marx, 2014; 2015). Citations have very strong positive skews, and a large percentage of articles are uncited in all scholarly fields (Seglen, 1992; 1997; Stern, 1990; Zhang et al., 2017), including biomechanics (Knudson, 2015a; 2015b; 2023), meaning citations to journals are dominated by a small percentage of highly-cited articles. Recent research also indicates that the importance of citation elites is growing (Reardon, 2021). Therefore, this study focused on the total citations and citation rates of the top 15 cited articles as a less biased and representative estimate of likely variation and differences in publication metrics across areas of research interest within biomechanics.

2.3. Data Analysis

Extracted data were entered into Excel and rechecked before importing into JMP Pro 14 (SAS Institute, Cary, NC) for statistical analysis. Descriptive data were calculated for publication metrics across (n=210) all journals and by journal. The research subject categories assigned to these journals were compiled (Table 2).

Given the lack of normality from the large positive skew (Table 3) of the publication metrics, subsequent comparisons across journals and subject categories were based on non-parametric Kruskal-Wallis tests. Significant (p<0.025) effects of journal on the primary dependent variables (CR and Age) were followed up with comparisons across journals split into three groups (High, Average, Low) based on median CR. The three groups were based on the rank order of the medians: the top four, middle six, and lowest four journals. Subject categories assigned to these groups were qualitatively compared to explore differences in citation patterns and interdisciplinarity in biomechanics journals. Potential interaction of journal history with field speed was examined by correlating Age with History. The nonparametric statistical analyses precluded effect size calculation, so the size of effects was judged qualitatively against studies of variation of citation metrics in biomechanics journals (Knudson & Quimby, 2023; Zadpoor & Nikooyan, 2011).
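The testing and grouping steps above can be sketched in a few lines. This is a self-contained illustration with hypothetical journal names and CR values (the study itself used JMP Pro); the Kruskal-Wallis H statistic is computed directly, without tie correction, since the illustrative values are distinct:

```python
# Kruskal-Wallis H statistic across journals, then ranking journals by
# median CR as the basis for splitting into High/Average/Low groups.
from statistics import median

def kruskal_wallis_h(groups):
    """H statistic on a list of samples (no tie correction; distinct values)."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # rank 1 = smallest
    n = len(pooled)
    rank_sums = [sum(rank[v] for v in g) for g in groups]
    return (12 / (n * (n + 1))
            * sum(r * r / len(g) for r, g in zip(rank_sums, groups))
            - 3 * (n + 1))

# Hypothetical per-journal CR samples (citations/year of top-cited articles)
cr_by_journal = {
    "Journal A": [60, 45, 80, 55, 70],  # high-CR journal
    "Journal B": [30, 25, 40, 35, 28],  # average
    "Journal C": [12, 9, 15, 11, 14],   # low
}

h = kruskal_wallis_h(list(cr_by_journal.values()))
# Rank by median CR; the study split its 14 journals into the top four
# (High), middle six (Average), and lowest four (Low) by this ranking.
ranked = sorted(cr_by_journal, key=lambda j: median(cr_by_journal[j]), reverse=True)
print(f"H = {h:.2f}; ranking by median CR: {ranked}")
```

A significant H (compared against a chi-square distribution with k-1 degrees of freedom) would justify the follow-up group comparisons described above.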

Table 3. Descriptive data of publication metrics of top 15 cited articles in Google Scholar from 14 biomechanics journals


Variables are: Authors=number of authors; Age=2023-year of publication; Citations=total Google Scholar citations; and Citation rate= Citations/Age.

Max, Maximum; 75%, 75th percentile; Me, median; 25%, 25th percentile; M, mean; SD, standard deviation; CV, coefficient of variation.

3. RESULTS

None of the publication metrics were normally distributed (W=0.68-0.96, p<0.001), showing positive skews (γ=0.89-3.0) and large variability (CV=43-130%) across all biomechanics journals (Table 3). Typical (median and mean) values for Authors (3 and 4) and Age (17 and 17.3) of highly-cited articles were less variable across journals than C or CR. Consistent with this variability, the biomechanics journals were classified into 41 different research subject categories by the three bibliometric databases (Table 2).
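The variability and skew statistics reported here can be illustrated with a small stdlib-only sketch on a hypothetical, positively skewed citation sample (the values below are invented for illustration):

```python
# Coefficient of variation (CV) and Fisher-Pearson skewness for a
# hypothetical positively skewed sample of citation counts.
from statistics import mean, pstdev

def cv_percent(xs):
    """Coefficient of variation as a percentage (population SD / mean)."""
    return 100 * pstdev(xs) / mean(xs)

def skewness(xs):
    """Fisher-Pearson coefficient of skewness g1 = m3 / m2**1.5."""
    m = mean(xs)
    m2 = mean([(x - m) ** 2 for x in xs])
    m3 = mean([(x - m) ** 3 for x in xs])
    return m3 / m2 ** 1.5

citations = [5, 8, 10, 12, 15, 20, 30, 45, 90, 250]  # hypothetical, skewed
print(f"CV = {cv_percent(citations):.0f}%, skew = {skewness(citations):.2f}")
```

A large positive skew like this is why the study relied on medians and non-parametric tests rather than means alone.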

Kruskal-Wallis tests indicated significant (p=0.001) differences in both CR and Age between the journals. Ranking the journals by median CR and allocating them into three groups (Table 1) shows the interaction of citation attraction (C) and Age in biomechanics. The high Age group of biomechanics journals includes both high CR (e.g., Journal of Biomechanics) and lower CR (e.g., Sports Engineering) journals. There was a significant (p=0.027), moderate association (r=0.586) between the median Age of highly-cited articles of biomechanics journals and their publication history in years.

Disaggregation of the journal subject categories in Table 2 by the three CR journal groups shows variation in subject assignments by the bibliometric databases. The number and variation of subject categories assigned by Dimensions and Scopus were generally larger for the low and average CR groups than for the high CR group. The high citation rate biomechanics journals generally had the most classifications as health, sports science, and sports medicine research. For example, all four high CR journals were classified as “Allied Health and Rehabilitation,” “Health Sciences,” and “Sports Science and Exercise” in Dimensions. Seventy-five percent or more of the high CR journals were classified as “Sport Sciences” by JCR. Similarly, Scopus classified seventy-five percent or more of the journals into “Biophysics,” “Orthopedics and Sports Medicine,” and “Sports Science.” “Biomedical Engineering” classifications were more common in the average (Dimensions and JCR) and low CR (Scopus) groups than in the high CR group.

4. DISCUSSION

4.1. Biomechanics Journal Citation Metrics

The hypothesized difference in publication metrics within biomechanics journals was supported. There were significant differences in CR and Age of top cited articles published in different biomechanics journals. CR is key input data for numerous journal metrics that are biased proxy estimators of research usage and must be carefully interpreted (Aksnes et al., 2019; Hicks et al., 2015; Knudson, 2019; Roldan-Valadez et al., 2019). While much of the inquiry and commentary on citation metrics does not clearly define the numerous terms used (e.g., impact, influence, quality), factor analyses of journal citation metrics clearly align with research usage/popularity or prestige (Bollen et al., 2009; Franceschet, 2010; Leydesdorff, 2009; Leydesdorff et al., 2016; Perera & Wijewickrema, 2018; Walters, 2017; Yan et al., 2011; Zhou et al., 2012). Research on biomechanics and kinesiology journals confirms this two-factor citation metric structure, reporting high associations between usage metrics (e.g., CiteScore, IF, SCImago Journal Rank, Source Normalized Impact per Paper) but lower associations of these metrics with prestige metrics (Eigenfactor, H Index) (Knudson, 2013; 2015a; Knudson et al., 2023).

Sorting the biomechanics journals into three groups by CR (Table 1) shows percentage differences from the Average group to the High and Low groups of 77 to 108% for CR and 38 to 44% for Age, respectively. These differences are meaningful because they are larger than the 30% variation reported for the variability and annual growth of five citation metrics for biomechanics journals (Knudson & Quimby, 2023; Zadpoor & Nikooyan, 2011) and exceed standards of meaningful difference in journal metrics from many fields (Amin & Mabe, 2003; Haghdoost et al., 2014; Ogden & Bartley, 2008). This is also confirmatory evidence of meaningful differences in citation rates across research interest areas within biomechanics journals. The current results are consistent with the Knudson and Chow (2008) study of ASB members reporting different perceptions of the prestige and quality of biomechanics journals across research areas of interest. It is likely that all journal metrics based on CR create bias in favor of or against some research areas within biomechanics because they influence indexing and interpretation based on subject categories with different citation patterns.

Biomechanics journal usage measured by CR also likely interacts with article Age and journal publication history. Inspection of Table 1 shows that a high prestige biomechanics journal like the Journal of Biomechanics (Knudson & Chow, 2008) can be influential with high usage (CR) over a long period of time, while other high CR journals (Gait & Posture, Journal of Electromyography and Kinesiology) may publish highly used research more related to faster and larger health/medical fields of study. Research published in a more recently founded and specialized journal will generally be less visible in bibliometric databases that use C and CR to index research, which in turn may lead to fewer citations. The additional C that biomechanics research may attract over time also interacts with well-known differences in the speed of decline of citations (e.g., Cited Half-Life) across fields (Haghdoost et al., 2014). There did not appear to be meaningful differences in the number of authors across biomechanics journals; most highly cited articles had three to four authors, consistent with previous research (Knudson, 2012).

4.2. Subject Category and Normalization

The biomechanics journals studied were classified into a wide variety (41) of research subject categories, which is consistent with a previous study of JCR categories for biomechanics journals (Zadpoor & Nikooyan, 2011). The classification of many categories for similar sets of journals has been reported in the biomedical sciences (Rafols & Leydesdorff, 2009; Wang & Waltman, 2016) and is also consistent with bibliometric interpretations of complex or interdisciplinary fields (Katz & Hicks, 1995; Morillo et al., 2001). In addition, the subject categories assigned by the databases across the three CR groups were generally consistent with previous research indicating that health and biomedical journals have higher citation rates than journals classified into engineering, education, or behavioral sciences (Jacso, 2005; Mongeon & Paul-Hus, 2016; Pozsgai et al., 2021; Wu et al., 2012). There is, however, well-known inconsistency in which specific subject categories are assigned across databases (Bornmann & Marx, 2015; Haunschild et al., 2022). For example, this study found relatively higher CR in GS for the similar “Sport(s) Science(s) and Exercise” subject, which also overlaps with several categories in all databases examined (Allied Health and Rehabilitation, Rehabilitation, and Orthopedics and Sports Medicine).

All these observations support the hypothesis that biomechanics journals may not be easily described with one citation rate on which to base field normalization of journal metrics. The overall median CR of 22 to 26 citations/year (Tables 1, 3) in GS was similar to that of top cited articles in GS in a study of the journal Sports Biomechanics (Knudson, 2020) and of seven biomechanics journals within a study of 65 kinesiology journals (Knudson, 2014). However, prestigious (e.g., Journal of Biomechanics) and biomedical journals (e.g., Gait & Posture; Clinical Biomechanics) tend to have CR twice as high, while newer journals (e.g., International Biomechanics) and specialized engineering journals (e.g., Sports Engineering) may tend to have 77% lower CR. It is important not to take these general percentages literally as likely differences in specific articles’ future citation potential or any other construct (impact, quality, or prestige).

Citation data are not true ratio-level measurements due to differences in indexing and errors. Thus, journal metrics calculated from CR, their reporting with false precision, and subsequent efforts to rank journals have received international rebuke (Declaration on Research Assessment [DORA], 2015; Hicks et al., 2015) and have been described as inaccurate, an insidious misuse, or a mania (Adler et al., 2008; Amin & Mabe, 2003; Casadevall & Fang, 2014; Knudson et al., 2023). Because of skewed citations and the large percentage of uncited articles in journals, most articles published in a high CR, and consequently high IF, journal do not themselves receive high citations and CR (Knudson, 2015b; 2023; Seglen, 1992; 1997; Stern, 1990; Zhang et al., 2017). A high CR or IF journal merely has some highly cited articles in a specific time window relative to other journals, and such comparisons are only appropriate between journals from a similar field.

It is also important to note that CR and journal metrics vary by the database used and the year of their calculation due to different indexing and errors (Franceschini et al., 2015; Meho & Yang, 2007; Moed et al., 2016; Rossner et al., 2008). In biomechanics, for example, Zadpoor and Nikooyan (2011) reported that the two-year IF varied from 1 to 3 across six JCR subject categories between 2002 and 2010. Biomechanics journals classified as “Sport Sciences” had a 2010 IF of about 2, while a more recent study reported that eight biomechanics journals related to kinesiology had a 2020 five-year IF of 2.7 and 27 Sport Science journals had a mean five-year IF of 4.0 (Knudson, 2022a). Those seeking to field-normalize citation metrics for biomechanics journals, research, or authors should specify reference data based on both the subject categories and the database used.

4.3. Limitations

Several limitations of this study should be considered in interpreting the results. Citation data were limited to the top 15 cited articles in GS. While CR forms the basis of numerous journal metrics, it is often unclear how C and CR change with subsequent citable-document-rate calculations. There are well-known differences between journal metrics that are all based on CR, and well-known differences in the size of the same journal metric calculated from different databases, all of which influence the current results. The high positive correlations between citations across different databases, and several results in the current study that are consistent with previous research, support the accuracy and potential utility of the current results. Future research is recommended to help clarify the research subject categories and the differences in CR and journal metrics for biomechanics research.

5. CONCLUSIONS

It was concluded that the interdisciplinary and application diversity of the field of biomechanics leads to substantial variation in the citation rates of research published in biomechanics journals. This is reflected in the variation in research subject categories associated with biomechanics journals by three major bibliometric databases. The lack of a unique biomechanics category, the numerous reference categories, and the meaningful differences (77 to 108%) between low, average, and high CR biomechanics journals make field normalization of subsequent journal metrics difficult. Care must be taken to accurately interpret most citation metrics of biomechanics journals as biased proxies of the general usage of research, given a specific database, time frame, and research topic.

CONFLICTS OF INTEREST

No potential conflict of interest relevant to this article was reported.

REFERENCES

  1. Abramo, G., D'Angelo, C. A., & Di Costa, F. (2023). Correlating article citedness and journal impact: An empirical investigation by field on a large-scale dataset. Scientometrics, 128(3), 1877-1894. https://doi.org/10.1007/s11192-022-04622-0 
  2. Adler, R., Ewing, J., & Taylor, P. (2008). Joint Committee on Quantitative Assessment of Research. Citation statistics: A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). https://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf 
  3. Aksnes, D. W., Langfeldt, L., & Wouters, P. (2019). Citations, citation indicators, and research quality: An overview of basic concepts and theories. SAGE Open, 9(1). https://doi.org/10.1177/2158244019829575 
  4. Amin, M., & Mabe, M. A. (2003). Impact factors: Use and abuse. Medicina, 63(4), 347-354. 
  5. Bar-Ilan, J. (2018). Tale of three databases: The implication of coverage demonstrated for a sample query. Frontiers in Research Metrics and Analytics, 3, 6. https://doi.org/10.3389/frma.2018.00006 
  6. Bollen, J., Van de Sompel, H., Hagberg, A., & Chute, R. (2009). A principal component analysis of 39 scientific impact measures. PLoS One, 4(6), e6022. https://doi.org/10.1371/journal.pone.0006022 
  7. Bornmann, L., & Marx, W. (2014). How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations. Scientometrics, 98(1), 487-509. https://doi.org/10.1007/s11192-013-1161-y 
  8. Bornmann, L., & Marx, W. (2015). Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts? Journal of Informetrics, 9(2), 408-418. https://doi.org/10.1016/j.joi.2015.01.006 
  9. Casadevall, A., & Fang, F. C. (2014). Causes for the persistence of impact factor mania. mBio, 5(2), e00064-e14. https://doi.org/10.1128/mBio.00064-14 
  10. Declaration on Research Assessment (DORA). (2015). DORA. http://www.ascb.org/dora/ 
  11. De Groote, S. L., & Raszewski, R. (2012). Coverage of Google Scholar, Scopus, and Web of Science: A case study of the h-index in nursing. Nursing Outlook, 60(6), 391-400. https://doi.org/10.1016/j.outlook.2012.04.007 
  12. Delgado-López-Cózar, E., & Cabezas-Clavijo, A. (2013). Ranking journals: Could Google Scholar Metrics be an alternative to Journal Citation Reports and Scimago Journal Rank? Learned Publishing, 26(2), 101-114. https://doi.org/10.1087/20130206 
  13. Elkins, M. R., Maher, C. G., Herbert, R. D., Moseley, A. M., & Sherrington, C. (2010). Correlation between the Journal Impact Factor and three other journal citation indices. Scientometrics, 85(1), 81-93. https://doi.org/10.1007/s11192-010-0262-0 
  14. Franceschet, M. (2010). A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics, 83(1), 243-258. https://doi.org/10.1007/s11192-009-0021-2 
  15. Franceschini, F., Maisano, D., & Mastrogiacomo, L. (2015). Research quality evaluation: Comparing citation counts considering bibliometric database errors. Quality & Quantity, 49(1), 155-165. https://doi.org/10.1007/s11135-013-9979-1 
  16. Garfield, E. (2006). The history and meaning of the journal impact factor. Journal of the American Medical Association, 295(1), 90-93. https://doi.org/10.1001/jama.295.1.90 
  17. Gusenbauer, M., & Haddaway, N. R. (2020). Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Research Synthesis Methods, 11(2), 181-217. https://doi.org/10.1002/jrsm.1378 
  18. Haghdoost, A., Zare, M., & Bazrafshan, A. (2014). How variable are the journal impact measures? Online Information Review, 38(6), 723-737. https://doi.org/10.1108/OIR-05-2014-0102 
  19. Halevi, G., Moed, H., & Bar-Ilan, J. (2017). Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation-Review of the literature. Journal of Informetrics, 11(3), 823-834. https://doi.org/10.1016/j.joi.2017.06.005 
  20. Harzing, A. W., & Alakangas, S. (2016). Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison. Scientometrics, 106(2), 787-804. https://doi.org/10.1007/s11192-015-1798-9 
  21. Haunschild, R., & Bornmann, L. (2022). Relevance of document types in the scores' calculation of a specific field-normalized indicator: Are the scores strongly dependent on or nearly independent of the document type handling? Scientometrics, 127(8), 4419-4438. https://doi.org/10.1007/s11192-022-04446-y 
  22. Haunschild, R., Daniels, A. D., & Bornmann, L. (2022). Scores of a specific field-normalized indicator calculated with different approaches of field-categorization: Are the scores different or similar? Journal of Informetrics, 16(1), 101241. https://doi.org/10.1016/j.joi.2021.101241 
  23. Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431. https://doi.org/10.1038/520429a 
  24. Jacso, P. (2005). As we may search - Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases. Current Science, 89(9), 1537-1547. https://www.jstor.org/stable/24110924 
  25. Katz, J. S., & Hicks, D. (1995, June 7-10). The classification of interdisciplinary journals: A new approach. In M. Koenig & A. Bookstein (Eds.), Proceedings of the 5th Biennial Conference of the International Society for Scientometrics and Informetrics (pp. 245-254). Learned Information. 
  26. Klavans, R., & Boyack, K. W. (2017). Which type of citation analysis generates the most accurate taxonomy of scientific and technical knowledge? Journal of the Association for Information Science and Technology, 68(4), 984-998. https://doi.org/10.1002/asi.23734 
  27. Knudson, D. (2012). Twenty-year trends of authorship and sampling in applied biomechanics research. Perceptual and Motor Skills, 114(1), 16-20. https://doi.org/10.2466/11.PMS.114.1.16-20 
  28. Knudson, D. (2013). Impact and prestige of kinesiology-related journals. Comprehensive Psychology, 2, 13. https://doi.org/10.2466/50.17.CP.2.13 
  29. Knudson, D. (2014). Citation rates for highly-cited papers from different sub-disciplinary areas within kinesiology. Chronicle of Kinesiology in Higher Education, 25(2), 9-17. https://www.nakhe.org/_Library/journal_archives/25_2.pdf 
  30. Knudson, D. (2015a). Citation rate of highly-cited papers in 100 kinesiology-related journals. Measurement in Physical Education and Exercise Science, 19(1), 44-50. https://doi.org/10.1080/1091367X.2014.988336 
  31. Knudson, D. (2015b). Evidence of citation bias in kinesiology-related journals. Chronicle of Kinesiology in Higher Education, 26(1), 5-12. https://www.nakhe.org/_Library/journal_archives/26_1.pdf 
  32. Knudson, D. (2019). Judicious use of bibliometrics to supplement peer evaluations of research in kinesiology. Kinesiology Review, 8(2), 100-109. https://doi.org/10.1123/kr.2017-0046 
  33. Knudson, D. (2020). Top cited research over fifteen years in Sports Biomechanics. Sports Biomechanics, 19(6), 808-816. https://doi.org/10.1080/14763141.2018.1518478 
  34. Knudson, D. (2022a). What kinesiology research is most visible to the academic world? Quest, 74(3), 285-298. https://doi.org/10.1080/00336297.2022.2092880 
  35. Knudson, D. (2022b). Citations to biomechanics articles from four databases. ISBS Proceedings Archive, 40(1), 82. https://commons.nmu.edu/isbs/vol40/iss1/82 
  36. Knudson, D. (2023). Association of ResearchGate research influence score with other metrics of top cited sports biomechanics scholars. Biomedical Human Kinetics, 15(1), 57-62. https://doi.org/10.2478/bhk-2023-0008 
  37. Knudson, D., & Chow, J. W. (2008). North American perception of the prestige of biomechanics serials. Gait & Posture, 27(4), 559-563. https://doi.org/10.1016/j.gaitpost.2007.07.005 
  38. Knudson, D., & Quimby, S. (2023). Variation in recent citation metrics for biomechanics journals. ISBS Proceedings Archive, 41(1), 64. https://commons.nmu.edu/isbs/vol41/iss1/64 
  39. Knudson, D., Cardinal, B. J., & McCullagh, P. (2023). Synthesis of publication metrics in kinesiology-related journals: Proxies for rigor, usage, and prestige. Quest. https://doi.org/10.1080/00336297.2023.2237150 
  40. Lascar, C., & Barnett, P. (2009). Journals not included in BIOSIS previews have a notable impact in biology. Issues in Science and Technology Librarianship, 58. https://doi.org/10.29173/istl2486 
  41. Leydesdorff, L. (2009). How are new citation-based journal indicators adding to the bibliometric toolbox? Journal of the Association for Information Science and Technology, 60(7), 1327-1336. https://doi.org/10.1002/asi.21024 
  42. Leydesdorff, L., & Bornmann, L. (2016). The operationalization of "fields" as WoS subject categories (WCs) in evaluative bibliometrics: The cases of "library and information science" and "science & technology studies". Journal of the Association for Information Science and Technology, 67(3), 707-714. https://doi.org/10.1002/asi.23408 
  43. Leydesdorff, L., Bornmann, L., Comins, J. A., & Milojevic, S. (2016). Citations: Indicators of quality? The impact fallacy. Frontiers in Research Metrics and Analytics, 1, 1. https://doi.org/10.3389/frma.2016.00001 
  44. Martín-Martín, A., Orduña-Malea, E., Thelwall, M., & Delgado López-Cózar, E. (2018). Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories. Journal of Informetrics, 12(4), 1160-1177. https://doi.org/10.1016/j.joi.2018.09.002 
  45. Martín-Martín, A., Thelwall, M., Orduña-Malea, E., & Delgado López-Cózar, E. (2021). Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations' COCI: A multidisciplinary comparison of coverage via citations. Scientometrics, 126(1), 871-906. https://doi.org/10.1007/s11192-020-03690-4 
  46. Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58(13), 2105-2125. https://doi.org/10.1002/asi.20677 
  47. Moed, H. F., Bar-Ilan, J., & Halevi, G. (2016). A new methodology for comparing Google Scholar and Scopus. Journal of Informetrics, 10(2), 533-551. https://doi.org/10.1016/j.joi.2016.04.017 
  48. Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics, 106(1), 213-228. https://doi.org/10.1007/s11192-015-1765-5 
  49. Morillo, F., Bordons, M., & Gomez, I. (2001). An approach to interdisciplinarity through bibliometric indicators. Scientometrics, 51(1), 203-222. https://doi.org/10.1023/A:1010529114941 
  50. Ogden, T. L., & Bartley, D. L. (2008). The ups and downs of journal impact factors. Annals of Occupational Hygiene, 52(2), 73-82. https://doi.org/10.1093/annhyg/men002 
  51. Perera, U., & Wijewickrema, M. (2018). Relationship between journal-ranking metrics for a multidisciplinary set of journals. Libraries and the Academy, 18(1), 35-58. https://doi.org/10.1353/pla.2018.0003 
  52. Pozsgai, G., Lövei, G. L., Vasseur, L., Gurr, G., Batáry, P., Korponai, J., Littlewood, N. A., Liu, J., Mora, A., Obrycki, J., Reynolds, O., Stockan, J. A., VanVolkenburg, H., Zhang, J., Zhou, W., & You, M. (2021). Irreproducibility in searches of scientific literature: A comparative analysis. Ecology and Evolution, 11(21), 14658-14668. https://doi.org/10.1002/ece3.8154 
  53. Pranckutė, R. (2021). Web of Science (WoS) and Scopus: The titans of bibliographic information in today's academic world. Publications, 9(1), 12. https://doi.org/10.3390/publications9010012 
  54. Rafols, I., & Leydesdorff, L. (2009). Content-based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects. Journal of the American Society for Information Science and Technology, 60(9), 1823-1835. https://doi.org/10.1002/asi.21086 
  55. Ramos-Remus, C., Suarez-Almazor, M., Dorgan, M., Gomez-Vargas, A., & Russell, A. S. (1994). Performance of online biomedical databases in rheumatology. Journal of Rheumatology, 21(10), 1912-1921. 
  56. Reardon, S. (2021). 'Elite' researchers dominate citation space. Nature, 591(7849), 333-334. https://doi.org/10.1038/d41586-021-00553-7 
  57. Renjith, V. R., & Pradeepkumar, A. P. (2021). Citations of the top 100 most-cited papers of the journal Scientometrics in Web of Science and its association and correlation with Scopus and Google Scholar citations. Library Philosophy and Practice, 4710. https://digitalcommons.unl.edu/libphilprac/4710/ 
  58. Roldán-Valadez, E., Salazar-Ruiz, S. Y., Ibarra-Contreras, R., & Ríos, C. (2019). Current concepts on bibliometrics: A brief review about impact factor, Eigenfactor score, CiteScore, SCImago Journal Rank, Source-Normalised Impact per Paper, H-index, and alternative metrics. Irish Journal of Medical Science, 188(3), 939-951. https://doi.org/10.1007/s11845-018-1936-5 
  59. Rossner, M., Van Epps, H., & Hill, E. (2008). Show me the data. Journal of General Physiology, 131(1), 3-4. https://doi.org/10.1085/jgp.200709940 
  60. Seglen, P. O. (1992). The skewness of science. Journal of the American Society for Information Science, 43(9), 628-638. https://doi.org/10.1002/(SICI)1097-4571(199210)43:9%3C628::AID-ASI5%3E3.0.CO;2-0 
  61. Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal (Clinical research ed.), 314(7079), 498-502. https://doi.org/10.1136/bmj.314.7079.497 
  62. Stern, R. E. (1990). Uncitedness in the biomedical literature. Journal of the American Society for Information Science, 41(3), 193-196. https://doi.org/10.1002/(SICI)1097-4571(199004)41:3%3C193::AID-ASI5%3E3.0.CO;2-B 
  63. U.S. Department of Education's National Center for Education Statistics. (2020). Classification of instructional programs. https://nces.ed.gov/ipeds/cipcode/Default.aspx?y=56 
  64. Walters, W. H. (2017). Composite journal rankings in library and information science: A factor analytic approach. Journal of Academic Librarianship, 43(5), 434-442. https://doi.org/10.1016/j.acalib.2017.06.005 
  65. Waltman, L. (2016). A review of the literature on citation impact indicators. Journal of Informetrics, 10(2), 365-391. https://doi.org/10.1016/j.joi.2016.02.007 
  66. Waltman, L., & van Eck, N. J. (2019). Field normalization of scientometric indicators. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer handbook of science and technology indicators (pp. 281-300). Springer. 
  67. Wang, Q., & Waltman, L. (2016). Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus. Journal of Informetrics, 10(2), 347-364. https://doi.org/10.1016/j.joi.2016.02.003 
  68. Wu, Y. P., Aylward, B. S., Roberts, M. C., & Evans, S. C. (2012). Searching the scientific literature: Implications for quantitative and qualitative reviews. Clinical Psychology Review, 32(6), 553-557. https://doi.org/10.1016/j.cpr.2012.06.007 
  69. Yamato, T. P., Arora, M., Stevens, M. L., Elkins, M. R., & Moseley, A. M. (2018). Quality, language, subdiscipline and promotion were associated with article accesses on Physiotherapy Evidence Database (PEDro). Physiotherapy, 104(1), 122-128. https://doi.org/10.1016/j.physio.2017.08.003 
  70. Yan, E., Ding, Y., & Sugimoto, C. R. (2011). P-Rank: An indicator measuring prestige in heterogeneous scholarly networks. Journal of the American Society for Information Science and Technology, 62(3), 467-477. https://doi.org/10.1002/asi.21461 
  71. Zadpoor, A. A., & Nikooyan, A. A. (2011). Publication and citation in biomechanics: A comparison with closely related fields (2003-2010). Journal of Mechanics in Medicine and Biology, 11(4), 705-711. https://doi.org/10.1142/S0219519411004836 
  72. Zhang, L., Rousseau, R., & Sivertsen, G. (2017). Science deserves to be judged by its contents, not by its wrapping: Revisiting Seglen's work on journal impact and research evaluation. PLoS One, 12(3), e0174205. https://doi.org/10.1371/journal.pone.0174205 
  73. Zhou, Y. B., Lü, L., & Li, M. (2012). Quantifying the influence of scientists and their publications: Distinguishing between prestige and popularity. New Journal of Physics, 14, 033033. https://doi.org/10.1088/1367-2630/14/3/033033