
Development of Evaluation Perspective and Criteria for the DataON Platform  

Kim, Suntae (Department of Library and Information Science, Jeonbuk National University)
Publication Information
Journal of Information Science Theory and Practice / v.8, no.2, 2020, pp. 68-78
This is a preliminary study toward developing an evaluation framework for the DataON platform. Its first objective is to examine expert perceptions of the platform's level of construction. The second is to evaluate the importance, stability, and usability of DataON platform features relative to the features of OpenAIRE. The third is to derive weights for the evaluation perspectives to be used in future DataON platform evaluation. The fourth is to examine expert preferences within each evaluation perspective and to derive unbiased evaluation criteria. The study surveyed potential stakeholders of the DataON platform; respondents were 12 professionals with at least 10 years of experience in the field. For the 57 overall functions and services, importance was measured at 3.1 out of 5, stability at -0.07 points, and usability at -0.05 points. For the 42 functions and services provided as of 2018, importance scored 3.04 points, stability -0.58 points, and usability -0.51 points. In particular, the stability and usability scores of the 42 functions and services provided as of 2018 were higher than those of the overall functions, which is attributed to stable and user-friendly improvements made after development. Among the weights of the evaluation perspectives, collection quality has the highest weight at 27%, followed by interface usability at 22% and service quality at 19%; system performance efficiency and user feedback solicitation are equally weighted at 16% each.
Keywords: evaluation perspective; evaluation criteria; research data platform; DataON; digital library