Return-on-Investment Measurement and Assessment of Research Fund: A Case Study in Malaysia

  • SANUSI, Nur Azura (Department of Economics, Faculty of Business, Economics and Social Development, University Malaysia Terengganu) ;
  • SHAFIEE, Noor Hayati Akma (Department of Economics, Faculty of Business, Economics and Social Development, University Malaysia Terengganu) ;
  • HUSSAIN, Nor Ermawati (Department of Economics, Faculty of Business, Economics and Social Development, University Malaysia Terengganu) ;
  • ABU HASAN, Zuha Rosufila (Department of Management and Marketing, Faculty of Business, Economics and Social Development, University Malaysia Terengganu) ;
  • ABDULLAH, Mohd Lazim (Department of Mathematics, Faculty of Ocean Engineering, Technology and Informatics, University Malaysia Terengganu) ;
  • SA'AT, Nor Hayati (Department of Social Ecology, Faculty of Business, Economics and Social Development, University Malaysia Terengganu)
  • Received : 2021.05.30
  • Accepted : 2021.08.15
  • Published : 2021.09.30

Abstract

This study estimates the financial value of the return on investment (ROI) of research funds. Four simulation estimations are employed to measure the financial value of ROI, covering the outputs, outcomes, impacts, and total ROI generated from the allocated inputs. Research outputs, outcomes, and impacts can be quantitatively measured based on improvements to existing systems. In terms of input, the Malaysian government allocated MYR301,350,000 for fundamental research in the 2021 budget, up 9.5 percent from 2019. This raises the question: To what extent does the input of research funds allocated by the government yield a good return in outputs, outcomes, and impacts for the academic community, society, and country? The total ROI result shows that a return of around MYR7 is generated by researchers for each Malaysian ringgit channeled by the funder. More specifically, for a research project, it is more difficult to produce impacts and outcomes than research outputs. The positive return is evidence that all the allocated funds are beneficial to the stakeholders. The government can apply this approach in calculating ROI for evaluation and fund allocation to universities. Furthermore, the positive financial value of research outputs, outcomes, and impacts automatically contributes to a positive innovation environment in Malaysia.

1. Introduction

Research is a systematic scientific process for solving an issue or problem. In general, research can be categorized as fundamental or applied. Fundamental research focuses on testing existing theories or developing new ones, whereas applied research focuses on practical applications and attempts to answer current micro and macro issues (Sanusi et al., 2019; Sekaran, 2002; Zikmund, 2003).

Good-quality research will generate new knowledge or improve existing knowledge, which is then translated into academic output in the form of publications and product revenue. Such output can be categorized as innovation; hence, knowledge, technology, and social innovation have a broader impact (Altbach, 2009). Moreover, Gross Domestic Product (GDP) depends on a country's level of innovation (Sanusi et al., 2020). Accordingly, research will produce outputs, outcomes, and impacts, including knowledge development, new perspectives, social engagement, and networking.

In this regard, research is an essential field for public and private educational institutions. The research thrust also helps to increase national output and promote the country's name globally. This contribution can be identified by evaluating the products and research outputs of innovation, which can be quantified based on improvements to existing systems. In addition, the knowledge generated by research positively affects the country's macroeconomy and micro-business income, and it is no less important for a community's socioeconomic development (Yeo, 2018; Guerrero et al., 2015; Hulten, 2001; Lin & Chen, 2007; Aldrich, 2012; Audretsch, 2014; Audretsch & Keilbach, 2004; Rhoten & Calhoun, 2011; Hsu et al., 2015; Kelly et al., 2014; Valero & Van Reenen, 2016).

Historically, university activities have provided education and research, but the mathematician and philosopher Whitehead (1929) described the university's primary purpose as maintaining a link between knowledge and the spirit of life by bringing together both young and old in imaginative learning processes. The university also communicates knowledge in a creative manner. At the very least, this is a function that society requires, and a university's existence would be meaningless if it failed in this function (Penfield et al., 2014). Hence, Kantor and Whalley (2014) stated that the university's contribution to the local economy increases with time.

Governments have depended on university-industry interactions as part of national innovation strategies to boost scientific innovation (Van Der Steen & Enders, 2008). Such innovations, in turn, increase output and contribute to economic production (Friedman & Silberman, 2003). Thus, public universities must operate more effectively and efficiently to move toward a more autonomous model instead of relying on government grants (Le & Bui, 2020).

Universities, in addition to contributing to innovation, production efficiency, productivity, revenue, and the economy, are able to transfer inventions outside academia through innovation licensing or the establishment of startup companies (Hsu et al., 2015). Toivanen and Väänänen (2016) found that the increase in the number of technical universities in Finland increased the number of patents resulting from licensed innovations. Belenzon and Schankerman (2013) found that firms tend to take patents from nearby universities. Hausman (2012) showed that university innovation increases job growth and business pay, particularly for enterprises near the university.

University research thus benefits workers, firms, industry, the economy, and the government, particularly through its influence on policymaking. Findings and observations from research projects have had a significant impact on policymaking and social issues. Overall, research is an intangible asset of faculties, universities, and countries (Reddy et al., 2016).

To what extent do research funds allocated by the government yield a good return in outputs, outcomes, and impacts for academicians, communities, firms, and the country? This study aims to measure and estimate the financial value of research funds. A measurement simulation of return on investment (ROI) is applied to the Ministry of Higher Education Malaysia's (MOHE) funding for University Malaysia Terengganu (UMT) researchers.

This study has three advantages over other studies. First, it considers the entire academic research process through a research framework that covers research inputs, process, outputs, outcomes, and impacts. Second, it measures the ROI of research outputs, outcomes, and impacts using a separate set of indicators for each. Third, it combines the outputs, outcomes, and impacts to measure the total ROI from the research funds.

Next, this paper discusses the research framework in the second section. This framework serves as the basis for the study's measurement of the financial value of research returns. The third and fourth sections present the research methodology and findings, respectively, and the fifth section concludes the study. More specifically, the third section measures the return based on the indicators identified in the previous literature.

2. Literature Review

This section begins with a review of the research process, which requires inputs and generates findings in terms of outputs, outcomes, and impacts. It then examines the relevant indicators based on previous studies of research findings and effectiveness. The discussion focuses on the findings and effectiveness of academic and non-academic research.

Hinrichs-Krapels and Grant (2016) introduced a framework to assess research effectiveness, efficiency, and equity. Based on Figure 1, research begins with inputs and generates outputs, outcomes, and impacts through the research process. Efficiency was measured through the research process, capturing the productivity of inputs in generating outputs; academic output was an element used to test the study's effectiveness; and outcomes and impacts were two vital aspects of the equity measurement.


Figure 1: Framework of Inputs, Process, Outputs, Outcomes, and Impacts of Research

Source: Hinrichs-Krapels and Grant (2016).

In detail, the inputs covered the amount of funding, knowledge, and other resources required for the research. The research process involved all research activities, including developing a theoretical framework and research framework, data collection, data analysis, and reporting. These activities contribute to the research findings, including the research objectives or outputs. Outputs include scientific writing in books or journal articles, human capital, and products or technology. Moreover, the outcomes and impacts were more comprehensive and far-reaching than the outputs.

Most researchers measured output as a micro-level research achievement. Johnson et al. (2004) measured output as a direct service product that combines inputs and processes. The outputs could be measured quantitatively and consist of services provided, human capital produced, books published, reference questions answered, and time spent processing raw materials. Meanwhile, Penfield et al. (2014) measured research outputs through indicators of knowledge generation and publications, new products and services, and whether they are positive or value-added.

In general, the first output indicator was research productivity. Productivity measurement included the total number of publications (El-Boghdadly et al., 2018; Fursov et al., 2016; Joshi, 2014; Ranasinghe et al., 2012; Wooding et al., 2005; Sweileh et al., 2019; Sweileh, 2018; Ahmed & Gupta, 2018; Sweileh et al., 2018; Hammond et al., 2017), the number of postgraduate attainments (Wooding et al., 2005), and international networking (Hammond et al., 2017). The second indicator was research quality, measured by the number of authors (Joshi, 2014; Anderson et al., 2013; Sweileh et al., 2019).

The outcome was measured as the positive or negative effects extending from planned or unplanned output achievements, in line with the output measurement. Outcomes included evaluation indicators for the short or medium term, leading to a change in individual or group situations, attitudes, or behaviors after engagement with the output (Johnson et al., 2004). According to Penfield et al. (2014), impacts and outcomes were measured jointly as extensions of research output as part of the institution's research evaluation.

Accordingly, the indicators measuring outcomes consisted of four main constructs. The first was knowledge production, including the number of citations (El-Boghdadly et al., 2018; Fursov et al., 2016; Ranasinghe et al., 2012; Wooding et al., 2005; Sweileh et al., 2019; Sweileh, 2018; Ahmed & Gupta, 2018; Sweileh et al., 2018) and awards and recognitions (Silver et al., 2018; Seppala & Smith, 2020; Ho et al., 2020).

The second measurement was the research system (research training and career development, capacity building, targeting further research and attracting further income), which included indicators of industrial affiliations (Mirani & Yusof, 2016; Yunus & Pang, 2015) and facilities including laboratories/facilities/research assets/nonmonetary contributions (in-kind) (Mahmoud et al., 2019).

Third, informing policy included the total research used in policymaking and clinical practice guidelines (Donovan et al., 2014; Wooding et al., 2005; Guthrie et al., 2015). Fourth, product development, broader health, and economic benefits consisted of inventions such as intellectual property, patents, and patent citations (Xing et al., 2019; Azoulay et al., 2019; Costello, 2020; Moaniba et al., 2020).

Most studies, however, measured outcomes and impacts using two independent measures. According to Penfield et al. (2014) and Morton (2015), interest in measuring research impact beyond the academic world grew among researchers in the United Kingdom from 2009 to 2011 (HEFCE, 2011), and assessment methods were also developed in Europe (LERU, 2013), the United States (Hicks, 2004), and Australia (Jones et al., 2004).

The Research Excellence Framework (REF) highlighted the topic of university research’s capacity to positively impact society, particularly quality of life (REF 2014), institutions (Penfield et al., 2014), economic growth, social well-being, environment, and culture (European Commission, 2010; REF, 2014; ARC, 2017), and also science and community development (Rau et al., 2018).

In particular, economic benefits included an increase in economic growth and wealth creation. Social benefits entailed improvements in people’s health and quality of life. Environmental benefits included environmental and lifestyle improvements. Meanwhile, cultural benefits stimulated creativity in society. Other direct impacts included legislation, practice, capacity, or other changes such as contributions to policies and policy discussions, and the development of new tools, resources, and technologies or personal and professional development.

Accordingly, Penfield et al. (2014) measured the impact specific to research funders. The REF (2014) defined impact as a consequence, change, or benefit to the economy, society, culture, policy or public services, health, the environment, or quality of life beyond academia. According to Duryea et al. (2007), impact also involves translating knowledge and research through various complex processes, individuals, and organizations; it likewise captures indirect contributions by specific individuals, advanced research funding, strategies, or organizations. In addition, according to the ARC (2017), impact was defined as the contribution of research to the economy, society, the environment, and culture beyond academic findings.

Direct and indirect impacts could be achieved in the short and long term. Heyeres et al. (2019) stated that these effects involved long-term transformative impact assessment on society. Short-term effects should also be measured to generate long-term effects. Research could also attract and retain donors and support social institutions (Kelly & McNicoll, 2011).

Therefore, the indicators suggested by Penfield et al. (2014) included changes in behavior and in economic and intellectual wealth arising from interaction between the university, business, and the community. In general, there were four primary constructs. The first was scholarly production impact, with the H-index considered as its indicator (Joshi, 2014; Donovan et al., 2014; Svider et al., 2014; Hammond et al., 2017; Carpenter et al., 2014; Ozanne et al., 2016).

The second construct was research advancement impact, consisting of debate stimulation in the research community (Bunn et al., 2015; Svider et al., 2014; Ozanne et al., 2016), methodological developments, other methods of press coverage, dissemination, number of mentions in media (Bunn et al., 2015), identification of knowledge gaps, dissemination of knowledge produced (Donovan et al., 2014), research training and career advancement (Donovan et al., 2014; Ozanne et al., 2016) and capacity building and critical mass to undertake effective research (Oliveira et al., 2014; Carpenter et al., 2014).

The third was policy implications, including the translation of research into clinical practice, which is evident in changes in health and service policy and decision-making (Donovan et al., 2014). Meanwhile, the fourth construct was health and economic impacts, including actual health gain (Donovan et al., 2014), and external funding of graduate medical education (Svider et al., 2014).

Most researchers presented academic and non-academic outcomes, particularly socioeconomic ones, which should be measured jointly. According to Penfield et al. (2014), UK research fund evaluation should consider impact beyond academic aspects, in line with the academic and socioeconomic evaluations of other countries, which consider the full assessment of the transformation brought about by research. Thus, these three assessments interact mutually.

3. Research Methods and Materials

3.1. Data Source and Indicator Measurement

The indicators identified in Section 2 will be used to measure the return on research investment. However, not all indicators can be measured, owing to the constraints of the projects' final reports. Only the indicators available in the final research reports will be measured quantitatively in terms of financial value. Table 1 presents the indicators used in this study to measure the return on research fund investment.

Table 1: Indicators to Measure Research Outputs, Outcomes, and Impact


The secondary data on these indicators were drawn from the final report of each project as submitted to the Centre of Research and Innovation Management (CRIM). The study respondents were project leaders who had received a research grant from the Research Fund of MOHE Malaysia from 2013 to 2018. There were 14 Fundamental Research Grant Scheme (FRGS) projects, 4 Exploratory Research Grant Scheme (ERGS) projects, 1 Research Acculturation Collaborative Effort (RACE) project, and 1 Research Culture Fund (RAGS) project. After the screening process, only 18 projects remained for analysis; the others were excluded because of incomplete information and unstated grant values.

3.2. Return on Investment of Research Grants

Data analysis consisted of descriptive analysis and simulation of return on research investment based on MOHE grants. The data was categorized based on the indicators in section 2 as outputs, outcomes, and impacts. The ROI for each Malaysian ringgit provided to UMT researchers by the Research Fund-Ministry of Higher Education (RF-MOHE) was then calculated for each category.

Next, these indicators (refer to Table 1) would be used to measure the ROI value. According to Ahmed et al. (2021), the importance of monetary value can be measured through ROI. ROI was described by White (2007) as the ratio of resources gained or lost in an investment to the amount of resources supplied. A positive ROI means that the benefits outweighed the costs of the investment, and vice versa. The basic ROI formula, based on Wahab et al. (2016), Botchkarev and Andru (2011), Sim et al. (2020), and Peik et al. (2019), is expressed as follows:

\(\text { Return on investment }=\frac{\text { Investment profit }-\text { Investment cost }}{\text { Investment cost }}\)
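As a quick illustration of this formula, a minimal Python sketch is shown below; the figures used are hypothetical and serve only to demonstrate the calculation, not to represent the study's data.

```python
def roi(investment_profit: float, investment_cost: float) -> float:
    """Basic ROI: net gain per unit of cost, as in the formula above."""
    return (investment_profit - investment_cost) / investment_cost

# Hypothetical figures for illustration only:
print(roi(investment_profit=150_000, investment_cost=100_000))  # 0.5, i.e. MYR0.50 returned per MYR invested
```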

ROI assessments have been important in many sectors for years. For example, their use in libraries has been rising recently because of decreases in funding and a concomitant rise in demand for accountability and improvements in efficacy. The growing field of library valuation allows researchers to assess the monetary value of library programs and services and demonstrate the efficient use of tax dollars in cost-benefit terms (Gellings, 2007; Luther, 2008; Tenopir et al., 2010; Elsayed & Saleh, 2015; Kingma & McClure, 2015). Strouse (2003) conducted the first such study and developed an ROI model for corporate libraries. These studies calculated the ROI value of each investment unit allocated to the library.

According to Tenopir (2012), Oakleaf (2010), and Tenopir and King (2007), various approaches have been adopted to measure the value of library resources and items. These techniques can be divided into three key categories. First, implicit value is calculated from the number of downloads or service-use logs. Second, explicit value assesses the importance of library facilities through qualitative interviews and surveys. Third, financial value, such as ROI, demonstrates relevance by relating the returns (benefits) to the expenses of users and libraries (investments) across diverse data.

In academic research, Preuss (2016) stated that the economic impact of research funding was emphasized in the United States (Kalutkiewicz & Ehman, 2014; Macilwain, 2010), Canada (Frank & Nason, 2009; Joosse, 2009), and the United Kingdom (Corbyn, 2009; Moriarty, 2010). Based on the findings of the Mansfield (1991) study, Corbyn (2009) estimated a $0.28 annual return for each dollar spent on research. Meanwhile, Macilwain (2010) reported that every dollar spent by the National Institutes of Health (NIH) typically generates an additional economic output of $2.21 within 12 months.

ROI has also been used by resource development officers to measure the development performance of community colleges; it can serve as a benchmark of results and identify best practices beneficial to others. Community college grants are typically funded by federal and state organizations aiming to increase attendance and improve completion rates at two-year and vocational schools. Morgan (2005) found that every dollar invested in community college grants yields an average return of $78.84, ranging from a minimum of $1.04 per dollar to a maximum of $554.30. The median ROI across the 20 colleges in the data was $34.47.

3.3. Data Analysis

This study applies simulation methods to measure the return of UMT research funding. This method is based on the Gellings (2007) method, modified by Luther (2008) and Tenopir et al. (2010), who evaluated ROI from the outputs, outcomes, and impacts of the research.

The simulation method, shown in Figure 2, involves the research input, namely the amount of funds received. It is followed by the research process, comprising all research activities, including data collection, data analysis, reporting, and others. These research activities then produce findings in terms of outputs, outcomes, and impacts. The effectiveness of the research is measured by whether the ROI value is positive or negative. A positive value indicates that each financial unit generates benefits with a multiplier effect for universities, agencies, firms, and society. A negative value indicates that a financial unit fails to generate added value or benefits; this is a loss for the MOHE, since it financed research projects that produced deficit values.


Figure 2: ROI Formula Evaluation

Table 2 shows the calculation method used to measure the return for each indicator. For example, the return for publications is calculated as the number of publications for each project divided by the total number of publications across all projects, multiplied by the total value of the research funds. Generally, the return for each indicator is calculated by the following formula:

Table 2: The Calculation of Return in Publication


\(f_{i}=\frac{a_{i}}{\sum a} \times \text{MYRFund}\)

where:

a_i = the number of items of the indicator for project (vote number) i (e.g. number of publications, citations, number of authors)

Σa = the total of the indicator across all projects (e.g. total publications)

MYRFund = total research funds (MYR) for all grants

i = cross-sectional unit (research project)
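To make the formula concrete, the short Python sketch below distributes a total fund across projects in proportion to one indicator, as in the publication example above; the project codes, counts, and fund figure are hypothetical, not the study's data.

```python
# Sketch of the per-indicator return formula f_i = a_i / sum(a) * MYRFund.
# Project codes, counts, and the fund amount below are hypothetical.
def indicator_returns(counts: dict[str, int], total_fund_myr: float) -> dict[str, float]:
    """Distribute the total fund across projects in proportion to an indicator count."""
    total = sum(counts.values())
    if total == 0:
        return {project: 0.0 for project in counts}
    return {project: count / total * total_fund_myr for project, count in counts.items()}

publications = {"PROJ-A": 11, "PROJ-B": 4, "PROJ-C": 0}
print(indicator_returns(publications, total_fund_myr=1_790_730))
```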

Based on the value of return for each indicator, the simulation of ROI can be measured in terms of the financial value of all research returns (outputs, outcomes, and impacts) based on the total research funds received by UMT during the study.

4. Results and Discussion

4.1. Descriptive Analysis

For the simulation, CRIM, UMT, provided the final reports of 18 research projects, consisting of 13 FRGS, 4 ERGS, and 1 RAGS project. These 18 projects were filtered by CRIM and cover both science and social science fields.

The highest and lowest approved research grants were around MYR149,580 and MYR40,000, with completion periods of two years and of three years and six months, respectively. The final report stated the final achievements of each research project, consisting of publications, awards and recognitions, facilities, external research funds, designs, publication companies, commercialized products, talent, industry attachments, policy papers, knowledge transfer programs, and consultations.

For publications, the FRGS59265 grant project recorded the highest number of indexed publications, at 11, while the ERGS55088 grant project had no indexed publications. The FRGS59391 grant project received 10 awards (3 international awards, 5 national awards, and 2 university awards).

In addition, facilities refer to laboratory/facility/research asset/nonfinancial support (in-kind) obtained from external sources of research grants, such as software licenses from partners. There were only two project grants with research facilities/assets, namely ERGS55102 at MYR2,000 and FRGS59433 at MYR29,000.

In terms of new research funds generated from the current fund, project ERGS55067 managed to obtain a follow-on FRGS fund of MYR113,200 for three years from MOHE.

Design refers to intellectual property, patents, patent citations, licenses, and royalties from the research results. There were only five projects with intellectual property, including FRGS59388 with the highest number at one patent and two trademarks.

Additionally, for the talent variable, two projects, ERGS55102 and ERGS55087, recorded the highest number of postgraduate students (three each). Meanwhile, five projects each had one postgraduate student: FRGS59388, ERGS55067, FRGS59220, FRGS59366, and RAGS57086.

Only three projects had academic attachments to local industry/agency/government or industrial attachments abroad. ERGS55088 carried out an industrial attachment for 14 days with an MYR3,000 allocation. FRGS59391 performed academic attachments for 21 days with an MYR5,000 allocation, and FRGS59366 had academic attachments abroad for 7 and 84 days with an MYR5,000 allocation.

4.2. Return on Investment

The simulation of ROI is completely based on the return values for each indicator as presented in Table 1. First, we calculated the returns of research output as reported in Table 3. Then we calculated the returns of research outcomes and impacts. The return values were all in financial form (MYR).

Table 3: Returns of Research Outputs, Outcomes and Impacts


The returns for research output were calculated based on the total indexed-journal publications, total postgraduate students, international collaborations, and the number of authors. Table 3 presents the returns for total publications in indexed journals, with a minimum financial value of MYR0 and a maximum value of MYR298,455. In terms of graduate student output, the minimum return value is MYR49,742.50, and the maximum return value is MYR198,970.00. The minimum and maximum values for networking are MYR0 and MYR1,220,952.27, respectively. The minimum and maximum returns for the number of authors are MYR0 and MYR239,651.60, respectively.

Moreover, for research outcomes, the first indicator is total citations, which contributed a maximum return value of MYR694,014.98. The second indicator is awards and recognitions, contributing a maximum return value of MYR511,637.14. For facility and industrial attachment, the maximum returns were MYR29,000 and MYR5,000, respectively. The return on invention is MYR537,219. For research impact, only two indicators were identified: the H-index, which contributed MYR298,455, and external funding, which contributed MYR113,200.

Second, based on the return value for each indicator, we calculated the financial value of ROI in MYR. The calculation of ROI for research funds is based on the traditional ROI calculation. The return value for each research project can be calculated as reported in Table 3. The maximum return is reported by project FRGS59265 with a return value of MYR2,597,065.57. Meanwhile, the minimum return value was contributed by project FRGS59276 at MYR173,720.51. Accordingly, there were research projects that provided small returns compared to the average total amount of input.

Third, we calculated the ROI value:

\(\begin{aligned} \text { ROI of Output } &=\frac{\text { MYR } 7,162,920-\text { MYR } 1,790,730}{\text { MYR } 1,790,730} \\ &=\frac{\text { MYR } 5,372,190}{\text { MYR } 1,790,730} \\ &=\text { MYR } 3.00 \end{aligned}\)

\(\begin{aligned} \text { ROI of Outcome } &=\frac{\text { MYR } 5,416,190-\text { MYR } 1,790,730}{\text { MYR } 1,790,730} \\ &=\frac{\text { MYR } 3,625,460}{\text { MYR } 1,790,730} \\ &=\text { MYR } 2.02 \end{aligned}\)

\(\begin{aligned} \text { ROI of Impact } &=\frac{\text { MYR } 1,903,930-\text { MYR } 1,790,730}{\text { MYR } 1,790,730} \\ &=\frac{\text { MYR } 113,200}{\text { MYR } 1,790,730} \\ &=\text { MYR } 0.06 \end{aligned}\)

\(\begin{aligned} \text { Total ROI } &=\frac{(\text{MYR } 7,162,920+\text{MYR } 5,416,190+\text{MYR } 1,903,930)-\text{MYR } 1,790,730}{\text{MYR } 1,790,730} \\ &=\frac{\text{MYR } 12,692,310}{\text{MYR } 1,790,730} \\ &=\text{MYR } 7.09 \end{aligned}\)

In calculating the ROI values, we divided the calculation into the ROI of output, ROI of the outcome, ROI of impact, and total ROI (the total ROI of outputs, outcomes, and impacts).

This study measured financial value using the MOHE's funds for researchers at UMT. There were 4 items and 12 indicators used to measure the return on research fund investments. Overall, the ROI was MYR7.09 for each MYR channeled by the MOHE to UMT. More specifically, the ROI values were MYR3 for outputs, MYR2.02 for outcomes, and MYR0.06 for impacts. Considering the output indicators, each MYR invested by the MOHE will generate MYR3; for outcomes and impacts, each MYR will generate MYR2.02 and MYR0.06, respectively. The results show that for a research project, it is more difficult to produce impacts and outcomes than research outputs.
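For readers who wish to check the arithmetic, the short Python sketch below recomputes the four ROI figures from the total returns and the total fund reported above.

```python
# Recomputing the reported ROI figures from the totals above (values in MYR).
FUND = 1_790_730
RETURNS = {"output": 7_162_920, "outcome": 5_416_190, "impact": 1_903_930}

def roi(total_return: float, fund: float) -> float:
    """Traditional ROI: (return - cost) / cost."""
    return (total_return - fund) / fund

for category, value in RETURNS.items():
    print(f"ROI of {category}: {roi(value, FUND):.2f}")
print(f"Total ROI: {roi(sum(RETURNS.values()), FUND):.2f}")
# Expected output: 3.00, 2.02, 0.06, and 7.09 per MYR channeled by the funder.
```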

Several other studies have measured ROI. Wahab et al. (2016) stated that every MYR1 invested in library subscriptions generates MYR1.28 through research grants. In addition, as noted above, Macilwain (2010) reported that every dollar spent by the NIH typically generates an additional economic output of $2.21 within 12 months.

Meanwhile, Morgan (2005) found that every dollar invested in the development of community college grants earns an average return of $78.84. The results showed a reasonable ROI at a minimum return of $1.04 per dollar to a maximum return of $554.30.

It is quite challenging to find a similar study that reports the ROI for each MYR channeled from the federal government to universities in Malaysia. Accordingly, this section has focused on findings related to general ROIs.

5. Conclusion

In conclusion, ROI is much higher when its measurement considers outputs, outcomes, and impacts together, and somewhat lower when it is based on outputs, outcomes, or impacts alone. A more interesting finding is that the ROI value for outputs is much higher than those for outcomes and impacts, showing that for a research project, it is more difficult to produce impacts and outcomes than research outputs. Our findings also suggest that the government should consider the financial value of ROI, alongside other financial performance measures such as return on assets and return on equity, when evaluating universities' rankings and yearly fund allocations.

A joint report by Malaysia’s Ministry of Education; Department of Higher Education; Elsevier, a global information analytics business specializing in science and health; and QS Quacquarelli Symonds shows that Malaysia’s gross expenditure on research and development (GERD) has increased by nearly $4 billion to reach over $12 billion in 2018, representing 1.4 percent of the country’s GDP that year. From 2014 to 2018, Malaysian researchers produced a cumulative research output of over 150,000 publications, which grew at a five-year compound annual growth rate (CAGR) of 4.9 percent. The volume of Malaysia’s top 10 percent most-cited publications grew at an even faster pace – with a five-year CAGR of 12.7 percent – accounting for a relatively high number of top 10 percent publications produced per million of GERD dollars. Taken together, the findings suggest that Malaysia is realizing a return on research investment dollars and is one of the most productive nations compared with five other Asian nations and territories analyzed in the report (Elsevier, 2019).

Acknowledgements

We are grateful to the Centre of Research and Innovation Management (CRIM), University Malaysia Terengganu for funding this project (Vot number: SRG 55194), as well as providing the data and also to Normi Azura Ghazali who indirectly contributed to the completion of this article.

References

  1. Ahmed, K. K. M., & Gupta, B. M. (2018). Nano-enabled drug delivery research: A scientometric assessment of Indian publications during 1995-2018. International Journal of Pharmaceutical Investigation, 8(4), 182-191. https://doi.org/10.4103/jphi.JPHI_11_19
  2. Ahmed, Z., Shakoor, Z., Khan, M. A., & Ullah, W. (2021). The role of financial risk management in predicting financial performance: A case study of commercial banks in Pakistan. Journal of Asian Finance, Economics, and Business, 8(5), 0639-0648. https://doi.org/10.13106/jafeb.2021.vol8.no5.0639
  3. Aldrich, H. E. (2012). The emergence of entrepreneurship as an academic field: A personal essay on institutional entrepreneurship. Research Policy, 41(7), 1240-1248. https://doi.org/10.1016/j.respol.2012.03.013
  4. Altbach, P. G. (2009). Peripheries and centers: Research universities in developing countries. Asia Pacific Education Review, 10, 15-27. https://doi.org/10.1007/s12564-009-9000-9
  5. Anderson, D. L., Smart, W., & Tressler, J. (2013). Evaluating research: Peer review team assessment and journal based bibliographic measures: New Zealand PBRF research output scores in 2006. New Zealand Economic Papers, 47(2), 140-157. https://doi.org/10.1080/00779954.2013.772879
  6. Audretsch, D. B. (2014). From the entrepreneurial university to the university for the entrepreneurial society. The Journal of Technology Transfer, 39(3), 313-321. https://doi.org/10.1007/s10961-012-9288-1
  7. Audretsch, D. B., & Keilbach, M. (2004). Entrepreneurship capital and economic performance. Regional Studies, 38(8), 949-959. https://doi.org/10.1080/0034340042000280956
  8. Australian Research Council. (2017). Engagement and impact assessment pilot 2017. Commonwealth of Australia. http://www.arc.gov.au/sites/default/files/filedepot/Public/EI/Engagement_and_Impact_Assessment_Pilot_2017_Report.pdf.
  9. Azoulay, P., Zivin, J. S. G., Li, D., & Sampat, B. N. (2019). Public R&D investments and private-sector patenting: Evidence from NIH funding rules. Review of Economic Studies, 86(1), 117-152. https://doi.org/10.1093/restud/rdy034
  10. Belenzon, S., & Schankerman, M. (2013). Spreading the word: Geography, policy, and knowledge spillovers. Review of Economics and Statistics, 95(3), 884-903. https://doi.org/10.1162/REST_a_00334
  11. Botchkarev, A., & Andru, P. (2011). A return on investment as a metric for evaluating information systems: Taxonomy and application. Interdisciplinary Journal of Information, Knowledge, and Management, 6(6), 245-269. https://doi.org/10.28945/1535
  12. Bunn, F., Trivedi, D., Alderson, P., Hamilton, L., Martin, A., Pinkney, E., & Iliffe, S. (2015). The impact of Cochrane Reviews: A mixed-methods evaluation of outputs from Cochrane Review Groups supported by the National Institute for Health Research. Health Technology Assessment, 19(28), 1-99. https://doi.org/10.3310/hta19280
  13. Carpenter, C. R., Cone, D. C., & Sarli, C. C. (2014). Using publication metrics to highlight academic productivity and research impact. Academic Emergency Medicine: Official Journal of the Society for Academic Emergency Medicine, 21(10), 1160-1172. https://doi.org/10.1111/acem.12482
  14. Corbyn, Z. (2009). Bang for your bucks. The Times Higher Education Supplement, 32, 43-56.
  15. Costello, L. (2020). A survey of Canadian academic librarians outlines the integration of traditional and emerging services. Evidence-Based Library and Information Practice, 15(3), 184-186. https://doi.org/10.18438/eblip29789
  16. Donovan, C., Butler, L., Butt, A. J., Jones, T. H., & Hanney, S. R. (2014). Evaluation of the impact of national breast cancer foundation-funded research. The Medical Journal of Australia, 200(4), 214-218. https://doi.org/10.5694/mja13.10798
  17. Duryea, M., Hochman, M., & Parfitt, A. (2007). Measuring the impact of research. http://www.atn.edu.au/docs/Research%20Global%20-%20.Measuring%20the%20impact%20of%20research.pdf.
  18. El-Boghdadly, K., Docherty, A. B., & Klein, A. A. (2018). Analysis of the distribution and scholarly output from National Institute of Academic Anaesthesia (NIAA) research grants. Anaesthesia, 73(6), 679-691. https://doi.org/10.1111/anae.14277
  19. Elsayed, A. M. & Saleh, E. I. (2015). Measuring the return on investment of academic libraries in Arab countries: A proposed model. Information Development, 31(3), 219-228. https://doi.org/10.1177/0266666913512146
  20. Elsevier. (2019). Malaysia's R&D investment paying off with higher research productivity and improved university ranking. https://www.elsevier.com/about/press-releases/corporate/malaysias-r-and-d-investment-paying-off-withhigher-research-productivity-and-improved-universityranking
  22. European Commission. (2010). Assessing Europe's university-based research. https://ec.europa.eu/research/sciencesociety/document_library/pdf_06/assessing-europeuniversity-based-research_en.pdf
  22. Frank, C., & Nason, E. (2009). Health research: Measuring the social, health, and economic impacts. Canadian Medical Association Journal, 180(5), 528-534. https://doi.org/10.1503/cmaj.090016
  23. Friedman, J., & Silberman, J. (2003). University technology transfer: Do incentives, management, and location matter? The Journal of Technology Transfer, 28(1), 17-30. https://doi.org/10.1023/A:1021674618658
  24. Fursov, K., Roschina, Y., & Balmush, O. (2016). Determinants of research productivity: An individual-level lens. Foresight and STI Governance, 10(2), 44-56. https://doi.org/10.17323/1995-459X.2016.2.44.56
  25. Gellings, C. (2007). Outsourcing relationships: The contract as IT governance tool. 40th Annual Hawaii International Conference on System Sciences (HICSS '07), 3-6 Jan 2007, Big Island, Hawaii, USA (pp.236c-236c). US, Europe: The IT Governance Institute. https://doi.org/10.1109/HICSS.2007.421
  26. Guerrero, M., Cunningham, J. A., & Urbano, D. (2015). The economic impact of entrepreneurial universities' activities: An exploratory study of the United Kingdom. Research Policy, 44(3), 748-764. https://doi.org/10.1016/j.respol.2014.10.008
  27. Guthrie, S., Bienkowska-Gibbs, T., Manville, C., Pollitt, A., Kirtley, A., & Wooding, S. (2015). The impact of the national institute for health research health technology assessment program, 2003-13: A multimethod evaluation. Health Technology Assessment, 19(67), 1-291. https://doi.org/10.3310/hta19670
  28. Hammond, G. W., Le, M., Novotny, T., Caligiuri, S. P. B., Pierce, G. N., & Wade, J. (2017). An output evaluation of a health research foundation's enhanced grant review process for new investigators. Health Research Policy and Systems, 15, 57. https://doi.org/10.1186/s12961-017-0220-x
  29. Hausman, N. (2012). University innovation, local economic growth, and entrepreneurship (Working Paper No. CESWP-12-10). U.S. Census Bureau Center for Economic Studies Paper. Maryland, US: U.S. Census Bureau. https://ssrn.com/abstract=2097842
  30. Heyeres, M., Tsey, K., Yang, Y., Yan, L., & Jiang, H. (2019). The characteristics and reporting quality of research impact case studies: A systematic review. Evaluation and Program Planning, 73, 10-23. https://doi.org/10.1016/j.evalprogplan.2018.11.002
  31. Hicks, D. M. (2004). Bibliometric evaluation of federally funded research in the United States. Research Evaluation, 13(2), 76-86. https://doi.org/10.3152/147154404781776446
  32. Higher Education Funding Council. (2011). REF Assessment framework and guidance on submissions. Bristol, UK: HEFCE.
  33. Hinrichs-Krapels, S., & Grant, J. (2016). Exploring the effectiveness, efficiency, and equity (3e's) of research and research impact assessment. Palgrave Communications, 2, 16090. https://doi.org/10.1057/palcomms.2016.90
  34. Ho, C. S. M., Lu, J. F., & Bryant, D. A. (2020). The impact of teacher entrepreneurial behavior: A timely investigation of an emerging phenomenon. Journal of Educational Administration, 58(6), 697-712. https://doi.org/10.1108/JEA-08-2019-0140
  35. Hsu, D. W., Shen, Y.-C., Yuan, B. J., & Chou, C. J. (2015). Toward successful commercialization of university technology: Performance drivers of university technology transfer in Taiwan. Technological Forecasting and Social Change, 92, 25-39. https://doi.org/10.1016/j.techfore.2014.11.002
  36. Hulten, C. (2001). Total factor productivity: A short biography. In C. R. Hulten, E. R. Dean, & M. J. Harper (Eds.), New developments in productivity analysis (pp. 1-54). Chicago, IL: University of Chicago Press.
  37. Johnson, I. M., Williams, D. A., Wavell, C., & Baxter, G. (2004). Impact evaluation, professional practice, and policymaking. New Library World, 105(1/2), 33-46. https://doi.org/10.1108/03074800410515255
  38. Jones, L., Schedler, K., & Mussari, R. (2004). Strategies for public management reform. Elsevier.
  39. Joosse, G. (2009). Revolutionary changes in Alberta postsecondary funding: Comparison of two college's responses [Unpublished doctoral dissertation]. ProQuest database.
  41. Joshi, M. A. (2014). Bibliometric indicators for evaluating the quality of scientific publications. The Journal of Contemporary Dental Practice, 15(2), 258-262. https://doi.org/10.5005/jpjournals-10024-1525
  41. Kalutkiewicz, M. J., & Ehman, R. L. (2014). Patents as proxies: NIH hubs of innovation. Nature Biotechnology, 32(6), 536-537. https://doi.org/10.1038/nbt.2917
  42. Kantor, S., & Whalley, A. (2014). Knowledge spillovers from research universities: Evidence from endowment value shocks. Review of Economics and Statistics, 96(1), 171-188. https://doi.org/10.1162/REST_a_00357
  43. Kelly, U., & McNicoll, I. (2011). Through a glass, darkly: Measuring the social value of universities. National Co-ordinating Centre for Public Engagement. https://www.publicengagement.ac.uk/sites/default/files/80096%20NCCPE%20Social%20Value%20Report.pdf.
  44. Kelly, U., McNicoll, I., & White, J. (2014). The impact of universities on the UK economy. London, UK: University UK.
  45. Kingma, B., & McClure, K. (2015). Lib-value: Values, outcomes and return on investment of academic libraries, phase III: ROI of the Syracuse University Library. College & Research Libraries, 76(1), 63-80. https://doi.org/10.5860/crl.76.1.63
  46. Le, O. T. T., & Bui, N. T. (2020). Responsibility accounting in public universities: A case in Vietnam. Journal of Asian Finance, Economics, and Business, 7(7), 169-178. https://doi.org/10.13106/jafeb.2020.vol7.no7.169
  47. League of European Research Universities. (2013). Research universities and research assessment. Leuven: LERU.
  48. Lin, Y. Y. C., & Chen, Y. C. M. (2007). Does innovation lead to performance? An empirical study of SMEs in Taiwan. Management Research News, 30(2), 115-132. https://doi.org/10.1108/01409170710722955
  50. Luther, J. (2008). University investment in the library: What's the return? A case study at the University of Illinois at Urbana-Champaign. Amsterdam, The Netherlands: Elsevier.
  50. Macilwain, C. (2010). Science economics: What science is really worth. Nature, 465, 682-684. https://doi.org/10.1038/465682a
  51. Mahmoud, A. S., Sanni-Anibire, M. O., Hassanain, M. A., & Ahmed, W. (2019). Key performance indicators for the evaluation of academic and research laboratory facilities. International Journal of Building Pathology and Adaptation, 37(2), 208-230. https://doi.org/10.1108/IJBPA-08-2018-0066
  52. Mansfield, E. (1991). Academic research and industrial innovation. Research Policy, 20(1), 1-12. https://doi.org/10.1016/0048-7333(91)90080-A
  53. Mirani, M. A., & Yusof, M. (2016). Entrepreneurial engagements of academics in engineering Universities of Pakistan. Procedia Economics and Finance, 35, 411-417. https://doi.org/10.1016/S2212-5671(16)00051-4
  54. Moaniba, I. M., Lee, P. C., & Su, H. N. (2020). How does external knowledge sourcing enhance product development? Evidence from drug commercialization. Technology in Society, 63, 101414. https://doi.org/10.1016/j.techsoc.2020.101414
  55. Morgan, N. (2005). Characteristics associated with the effectiveness of resource development programs at Florida community colleges. Electronic Theses and Dissertations, 362, 55-129. https://stars.library.ucf.edu/etd/362
  57. Moriarty, P. (2010). False economies. The Times Higher Education Supplement. Retrieved January 20, 2021, from: https://www.timeshighereducation.com/news/falseeconomies/417033.article
  57. Morton, S. (2015). Progressing research impact assessment: A 'contributions' approach. Research Evaluation, 24(4), 405-419. https://doi.org/10.1093/reseval/rvv016
  58. Oakleaf, M. (2010). Value of academic libraries: A comprehensive research review and report. Chicago, IL: Association of College and Research Libraries.
  59. Oliveira, M. C. L. A., Martelli, D. R., Quirino, I. G., Colosimo, E. A., Silva, A. C. S., Junior, H. M., & De Oliveira, E. A. (2014). Profile and scientific production of the Brazilian Council for scientific and technological development (CNPq) researchers in the field of hematology/oncology. Revista da Associacao Medica Brasileira, 60(6), 542-547. https://doi.org/10.1590/1806-9282.60.06.012
  60. Ozanne, J. L., Davis, B., Murray, J. B., Grier, S., Benmecheddal, A., Downey, H., Ekpo, A. E., Garnier, M., Hietanen, J., Le Gall-Ely, M., Seregina, A., Thomas, K. D., & Veer, E. (2016). Assessing the societal impact of research: The relational engagement approach. Journal of Public Policy and Marketing, 36(1), 1-14. https://doi.org/10.1509/jppm.14.121
  61. Peik, S., Schimmel, E., & Hejazi, S. (2019). Projected return on investment of a corporate global health program. BMC Public Health, 19, 1476. https://doi.org/10.1186/s12889-019-7857-z
  62. Penfield, T., Baker, M. J., Scoble, R., & Wykes, M. C. (2014). Assessment, evaluations, and definitions of research impact: A review. Research Evaluation, 23(1), 21-32. https://doi.org/10.1093/reseval/rvt021
  63. Preuss, M. (2016). Return on investment and grants: A review of present understandings and recommendations for change. Research Management Review, 21(1), 1-26. https://files.eric.ed.gov/fulltext/EJ1134098.pdf
  64. Ranasinghe, P., Jayawardena, R., & Katulanda, P. (2012). Sri Lanka in global medical research: A scientific analysis of the Sri Lankan research output during 2000-2009. BMC Research Notes, 5, 121. http://www.biomedcentral.com/1756-0500/5/121 https://doi.org/10.1186/1756-0500-5-121
  65. Rau, H., Goggins, G., & Fahy, F. (2018). From invisibility to impact: Recognising the scientific and societal relevance of interdisciplinary sustainability research. Research Policy, 47(1), 266-276. https://doi.org/10.1016/j.respol.2017.11.005
  66. Reddy, K. S., Xie, E., & Tang, Q. (2016). Higher education, highimpact research, and world university rankings: A Case of India and comparison with China. Pacific Science Review B: Humanities and Social Sciences, 2(1), 1-21. https://doi.org/10.1016/j.psrb.2016.09.004
  67. Research Excellence Framework. (2014). Results. http://results.ref.ac.uk/
  68. Rhoten, D., & Calhoun, C. (2011). Knowledge matters: The public mission of the research university. New York: Columbia University Press.
  69. Sanusi, N. A., Johari, M. S., & Padli, J. (2019). Science social scientific research. Kuala Nerus: UMT Publisher.
  70. Sanusi, N. A., Moosin, A. F., & Kusairi, S. (2020). Neural network analysis in forecasting the Malaysian GDP. Journal of Asian Finance, Economics, and Business, 7(12), 109-114. https://doi.org/10.13106/jafeb.2020.vol7.no12.109
  71. Sekaran, U. (2002). Research methods for business (4th ed.). Hoboken, NJ: Wiley.
  72. Seppala, N., & Smith, C. (2020). Teaching awards in higher education: A qualitative study of motivation and outcomes. Studies in Higher Education, 45(7), 1398-1412. https://doi.org/10.1080/03075079.2019.1593349
  73. Silver, J. K., Bank, A. M., Slocum, C. S., Blauwet, C. A., Bhatnagar, S., Poorman, J. A., Goldstein, R., Reilly, J. M., & Zafonte, R. D. (2018). Women physicians were underrepresented in the American Academy of Neurology recognition awards. American Academy of Neurology, 91(7), e603-e614. https://doi.org/10.1212/WNL.0000000000006004
  74. Sim, S. Y., Watts, E., Constenla, D., Brenzel, L., & Patenaude, B. N. (2020). Return on investment from immunization against 10 pathogens in 94 low and middle-income countries, 2011-30. Health Affairs, 39(8), 1343-1353. https://doi.org/10.1377/hlthaff.2020.00103
  75. Strouse, R. (2003). Demonstrating value and return on investment: The ongoing imperative. Information Outlook, 7(3), 45-59. https://www.semanticscholar.org/paper/Demonstratingvalue-and-return-on-investment%3A-The-Strouse
  77. Svider, P. F., Husain, Q., Folbe, A. J., Couldwell, W. T., Liu, J. K., & Eloy, J. A. (2014). Assessing national institutes of health funding and scholarly impact in neurological surgery: Clinical article. Journal of Neurosurgery, 120, 1-295. https://doi.org/10.3171/2013.8.JNS13938
  77. Sweileh, W. M. (2018). Global research output on HIV/AIDS: Related medication adherence from 1980 to 2017. BMC Health Services Research, 18, 765. https://doi.org/10.1186/s12913-018-3568-x
  78. Sweileh, W. M., Al-Jabi, S. W., Zyoud, S. H., & Sawalha, A. F. (2018). Outdoor air pollution and respiratory health: A bibliometric analysis of publications in peer-reviewed journals (1900-2017). Multidisciplinary Respiratory Medicine, 13, 15. https://doi.org/10.1186/s40248-018-0128-5
  79. Sweileh, W. M., Huijer, H. A., Al-Jabi, S. W., Zyoud, S. H., & Sawalha, A. F. (2019). Nursing and midwifery: Research activity in Arab countries from 1950 to 2017. BMC Health Services Research, 19, 340. https://doi.org/10.1186/s12913-019-4178-y
  80. Tenopir, C. (2012). Beyond usage: Measuring library outcomes and value. Library Management, 233(1/2), 5-13. https://doi.org/10.1108/01435121211203275
  81. Tenopir, C., & King, D. W. (2007). Perceptions of value and value beyond perceptions: Measuring the quality and value of journal article readings. Serials, 20(3), 199-207. https://doi.org/10.1629/20199
  82. Tenopir, C., Love, A., Park, J., Wu, L., Baer, A., & Mays, R. (2010). University investment in the library, phase II: An international study of the library's value to the grants process. Library Connect White Paper, Elsevier Library Connect.
  83. Toivanen, O., & Vaananen, L. (2016). Education and invention. Review of Economics and Statistics, 98(2), 382-396. https://doi.org/10.1162/REST_a_00520
  85. Valero, A., & Van Reenen, J. (2016). The economic impact of universities: Evidence from across the globe. Economics of Education Review, 68, 53-67. https://doi.org/10.1016/j.econedurev.2018.09.001
  85. Van Der Steen, M., & Enders, J. (2008). Universities in evolutionary systems of innovation. Creativity and Innovation Management, 17(4), 281-292. https://doi.org/10.1111/j.1467-8691.2008.00496.x
  86. Wahab, E., Shamsuddin, A., Abdullah, N. H., & Abdul Hamid, N. (2016). Users' satisfaction and return on investment (ROI) for online database library databases: A Malaysian technical university perspective. Procedia-Social and Behavioral Sciences, 219, 777-783. https://doi.org/10.1016/j.sbspro.2016.05.075
  87. White, L. N. (2007). An old tool with potential new uses: Return on investment. The Bottom Line, 20(1), 5-9. https://doi.org/10.1108/08880450710747407
  88. Whitehead, A. N. (1929). The aims of education and other essays: Universities and their function. New York: Macmillan Company.
  89. Wooding, S., Hanney, S., Buxton, M., & Grant, J. (2005). Payback arising from research funding: Evaluation of the arthritis research campaign. Rheumatology, 44(9), 1145-1156. https://doi.org/10.1093/rheumatology/keh708
  90. Xing, Z., Yu, F., Du, J., Walker, J. S., Paulson, C. B., Mani, N. S., & Song, L. (2019). Conversational interfaces for health: Bibliometric analysis of grants, publications, and patents. Journal of Medical Internet Research, 21(11), e14672. https://doi.org/10.2196/14672
  91. Yeo, B. (2018). The societal impact of university innovation. Management Research Review, 41(11), 1309-1335. https://doi.org/10.1108/MRR-12-2017-0430
  92. Yunus, M. A. S., & Pang, V. (2015). Academic promotion in Malaysia: Meeting academics' expectations and institutional needs. RIHE International Seminar Reports, 23, 61-81. https://files.eric.ed.gov/fulltext/ED574241.pdf
  93. Zikmund, W. G. (2003). Business research methods (7th ed.). Nashville, United States: South-Western.