Applying and Evaluating Visualization Design Guidelines for a MOOC Dashboard to Facilitate Self-Regulated Learning Based on Learning Analytics

  • Cha, Hyun-Jin (School of General Education, Dankook University) ;
  • Park, Taejung (College of Liberal Arts and Interdisciplinary Studies, Kyonggi University)
  • Received : 2018.09.03
  • Accepted : 2018.12.22
  • Published : 2019.06.30

Abstract

With the help of learning analytics, MOOCs have greater potential to support successful learning by promoting self-regulated learning (SRL). The current study aims to apply and validate visualization design guidelines for a MOOC dashboard that enhances such SRL capabilities based on learning analytics. To achieve this objective, a MOOC dashboard prototype, LM-Dashboard, was designed and developed to reflect the visualization design guidelines for promoting SRL. Both expert and learner participants then evaluated LM-Dashboard through iterations to validate the visualization design guidelines and assess perceived SRL effectiveness. The results of the expert and learner evaluations indicated that most of the visualization design guidelines applied to LM-Dashboard were valid, and that some perceived SRL aspects, such as monitoring one's learning progress and assessing one's achievements with time management, were beneficial. However, some features of LM-Dashboard should be improved to better support SRL aspects related to achieving learning goals with persistence. The findings suggest that it is necessary to offer appropriate feedback or tips, as well as to visualize learner behaviors and activities in an intuitive and efficient way, to support the successful cycle of SRL. Consequently, this study contributes to establishing a basis for the visual design of MOOC dashboards that optimize each learner's SRL.

1. Introduction

 Massive Open Online Courses (MOOCs) have brought a paradigm shift in higher education by enabling diverse learners from all over the world to access open, free, high-quality content [1]. One of the main characteristics of MOOCs is that they give diverse learners an opportunity to personalize their learning in terms of topic, time, place, and method. Learners in MOOCs have different learning objectives and plans for utilizing online courses according to their personal needs. Such characteristics lead learners to choose their own personalized paths through self-regulated learning (SRL). As a result, low completion and high drop-out rates have been key issues in MOOCs, even as their popularity has grown [2] [3] [4].

 In fact, obtaining a certificate could be the objective for some students, who therefore need to study the entire content of a MOOC. Other students, however, might plan to study only the topics that interest them or skip content they have already studied. In other words, students' purposes for enrolling in a MOOC differ. Thus, MOOCs require students to be autonomous and, moreover, self-regulated during the learning process, since learners start studying alone and keep learning without direct contact with teachers and peers [5].

 As the field of learning analytics and big data for education grows rapidly, MOOC platforms should provide personalized learning environments with differentiated feedback for each individual student who sets his or her own learning goals. In a MOOC learning context, all activities of learners and teachers can be digitized and collected as meaningful educational big data suitable for data mining [6]. Through a dashboard, MOOCs can collect, analyze, and utilize students' abundant log data about their conditions and learning styles. Despite this potential, most students lack suitable SRL competencies or easily become bored and then drop out [7] [8]. Therefore, MOOC dashboards should support students' continuous, persistent learning and their successful performance by promoting the SRL cycle suggested by Zimmerman [9]. Although a large body of prior research has noted the importance of helping students gain SRL abilities, there has been little research on improving SRL through the dashboard. Recently, Qu and Chen [10] indicated that visual analytics could play a crucial role in the learning process of MOOCs and contribute to the improvement of MOOC platforms. However, it is still challenging to extract meaningful information from data on students' activities and to visualize those data intuitively and effectively, especially in terms of SRL. Accordingly, this study aims to develop and validate design strategies for visualizing a MOOC dashboard that facilitates SRL based on learning analytics. To achieve this goal, we first systematically reviewed relevant previous studies, analyzed the learning analytics functions of typical extant MOOC dashboards such as edX, K-MOOC, Coursera, Khan Academy, and FutureLearn, and then derived design guidelines that can be applied to the design and development of MOOC dashboards.
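
 As a simple illustration of the kind of aggregation such learning analytics involves, the following Python sketch turns hypothetical clickstream events into per-learner weekly activity metrics that a dashboard widget could plot against a learner's plan. The event format, field names, and metric names are illustrative assumptions, not those of any actual MOOC platform.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical clickstream events: (learner_id, event_type, timestamp, duration_sec).
# Field names and event types are illustrative, not from any real MOOC platform.
events = [
    ("s01", "video_play", "2019-03-04T10:15:00", 420),
    ("s01", "quiz_submit", "2019-03-05T09:00:00", 0),
    ("s01", "forum_post", "2019-03-11T14:30:00", 0),
    ("s02", "video_play", "2019-03-04T20:05:00", 600),
]

def weekly_metrics(events):
    """Aggregate raw events into per-learner, per-ISO-week activity counts."""
    metrics = defaultdict(lambda: defaultdict(int))
    for learner, etype, ts, duration in events:
        week = datetime.fromisoformat(ts).isocalendar()[1]  # ISO week number
        if etype == "video_play":
            metrics[(learner, week)]["video_seconds"] += duration
        else:
            metrics[(learner, week)][etype] += 1
    return metrics

for (learner, week), counts in weekly_metrics(events).items():
    print(learner, "week", week, dict(counts))
```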

 This study focuses on the critical role of MOOC dashboard visualization as visual feedback that facilitates students' SRL. It therefore aims to provide guidelines for visually designing and developing a MOOC dashboard that enhances students' SRL. We conducted the study to address the following research questions:

 1) What are the visualization guidelines for a MOOC dashboard to promote SRL based on learning analytics?

 2) How do educational technology experts validate a MOOC dashboard prototype that applies such visualization guidelines to promote SRL?

 3) How do users/learners evaluate a MOOC dashboard prototype that applies such visualization guidelines to promote SRL?

2. Theoretical Background

 MOOCs are more interactive than traditional online courses in the sense that MOOC dashboards play an essential role in analyzing students' learning histories and offering appropriate instructional interventions, in addition to delivering learning resources such as video lectures, reading materials, textbooks, and exams. Another difference from other forms of online learning and e-learning, which focus on proving participants' knowledge achievement, is that MOOC learners are motivated by their individual interests and learning experiences [11]. The majority of MOOC learners describe their intentions in taking courses as having fun, enjoying new experiences, and finding social relationships within their academic interests. Therefore, an indicator of MOOC completion might be process-focused rather than outcome-focused [12] [13], since the aims and motivations of MOOC participants vary widely [14]. This is also demonstrated by a Harvard University study, which found that 39 percent of its learners had current or former teaching backgrounds [15]. It suggests that a main objective of MOOCs is to promote active learning experiences and SRL.

 In fact, SRL strategies have been studied for decades in traditional face-to-face (F2F) classroom settings. A widely accepted definition of SRL is the ability to plan, control, and manage learning activities and processes in ways that enhance goal achievement [16] [17]. One of the most established models of SRL in the literature is Zimmerman and Martinez-Pons's [18] model, which comprises ten SRL strategies. Furthermore, Zimmerman and Campillo [19] suggested a cyclic phase model of SRL comprising a forethought phase, a performance phase, and a reflection phase. SRL strategies can help a learner develop and sustain learning for better engagement and learning outcomes. The present research builds on Zimmerman and Martinez-Pons's [18] model of SRL and Zimmerman and Campillo's [19] cyclic phase model. The main reason is that one characteristic of MOOCs is a course directed by the learners themselves, meaning that they plan their own learning, perform self-directed learning, and evaluate (or, if they do not aim for a certificate, reflect on) themselves.

 Students who appropriately apply SRL skills are more successful in higher-order thinking such as critical thinking, problem solving, and reasoning [20] [21]. Moreover, they tend to show higher academic performance, motivation, and interest in learning [17]. In online learning environments as well as in face-to-face (F2F) settings, there is growing empirical evidence for the role of SRL strategies in student engagement and learning outcomes [1] [22] [23] [24] [25] [26]. Taken together, a large body of prior research in both online and traditional learning environments suggests that prompting learners with SRL strategies can improve their academic performance and success in learning.

 Prior studies have investigated applications of SRL strategies in online learning environments. Such online SRL strategies include goal setting (e.g. [1]), planning (e.g. [7] [8]), time management (e.g. [7] [8] [24]), metacognition (e.g. [24]), motivation (e.g. [1]), self-monitoring (e.g. [7] [8]), and effort regulation (e.g. [24]).

Table 1. SRL strategies from successful MOOC learners [27]

 In recent research by Kizilcec and colleagues [27], the SRL strategies of self-monitoring, self-evaluation, goal setting/planning, time management, effort regulation, and help seeking were shown to help learners succeed in MOOCs. Thus, success in learning through MOOCs depends on SRL, which involves setting learning goals, exploring efficient, effective, and appealing ways to learn, and monitoring learning progress toward those goals. The new field of learning analytics is related to similar developments in big data, e-science, web analytics, and educational data mining [28] [29] [30] [31]. In MOOC environments, learning analytics is realized mainly through the dashboard. A wide variety of heterogeneous students easily experience difficulty in using the analytical tools provided to explore the data of MOOC platforms. To tackle this problem, visual dashboard systems for analyzing learning behaviors should be developed for MOOC data based on learning analytics [10]. In line with the research conducted by Schaffer, Huynh, O'Donovan, Höllerer, Xia, and Lin [32], student behaviors relating to SRL in the context of MOOCs can be categorized according to their features.

 In recent years, the success of visual dashboard systems and related models has depended on achieving the intended behaviors of students. To visualize learning patterns in MOOCs, Chen, Davis, Lin, Hauff, and Houben [33] developed statistical models to predict drop-outs based on learner activity logs. Duval [34] proposed goal-oriented visualizations of all activities, which helped learners perceive their learning progress and manage their future activities. Visualizing MOOC dashboards based on learning analytics can help instructors, operators, and institutions (i.e., university or company administrators) guide students' behaviors interactively and support their persistent learning. In this respect, from the perspective of students, well-visualized MOOC dashboards might support their SRL by providing appropriate information on their behaviors. Recently, Park, Cha, and Lee [5] suggested design strategies for visualizing information on MOOC dashboards on the basis of learning analytics.

Table 2. Design Guidelines of visualizing MOOC learning dashboards to promote SRL based on learning analytics [5]

3. Research Method

3.1 LM-Dashboard Design and Visualization Strategies

 The LM-Dashboard in this study was designed as a dashboard prototype for MOOC environments based on the cyclic phase model of SRL suggested by Zimmerman and Campillo [19]. In particular, the LM-Dashboard focuses on the reflection phase among the three cyclic phases, in which a dashboard provides and visualizes learning data by analyzing students' behaviors and activities. It thereby helps students become aware of, reflect on, and make sense of their progress and achievements toward their learning objectives through learning analytics [42]. This study aims to design and develop LM-Dashboard prototypes whose visualization strategies based on learning analytics can promote SRL in MOOC environments, on the premise that goal setting and planning have been done by students in the forethought phase and that learning on the MOOC platform occurs in the performance phase. The achievement figures and numbers on the LM-Dashboard therefore represent individualized and customized quantitative and qualitative data compared against each learner's plans. The theoretical background of such quantitative and qualitative data corresponds to the sixth visualization strategy (6.1) for promoting SRL on the MOOC dashboard, shown in Table 3.

 In the first prototype, the LM-Dashboard consists of three main menus, Course Progress, Learning Activities, and Learning Evaluation, as shown in Fig. 1. The Course Progress page contains (a) weekly learning achievement progress (compared to the learner's plans); (b) weekly learning records; (c) weekly activity records; and (d) a pop-up menu for weekly activity achievement records. The Learning Activities page includes (a) activity records for quizzes; (b) activity records within forums; (c) activity records for assignments; (d) activity records for reflection journals; and (e) a pop-up menu for content analysis. On the Learning Evaluation page, students can see (a) their current status; (b) course results and comparisons with peers; and (c) their achievement badges.
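
 For illustration, the three-page menu structure described above could be expressed as simple configuration data that a dashboard front end renders panel by panel. The dictionary layout below is a minimal sketch and an assumption about representation, not the prototype's actual implementation.

```python
# Illustrative configuration of the first LM-Dashboard prototype's menus,
# transcribed from the description above. The structure is an assumption.
LM_DASHBOARD_MENUS = {
    "Course Progress": [
        "Weekly learning achievement progress (vs. learner's plan)",
        "Weekly learning records",
        "Weekly activity records",
        "Pop-up: weekly activity achievement records",
    ],
    "Learning Activities": [
        "Activity records for quizzes",
        "Activity records within forums",
        "Activity records for assignments",
        "Activity records for reflection journals",
        "Pop-up: content analysis",
    ],
    "Learning Evaluation": [
        "My current status",
        "Course results and comparisons with peers",
        "My achievement badges",
    ],
}

for page, panels in LM_DASHBOARD_MENUS.items():
    print(page, "->", len(panels), "panels")
```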

Table 3. The theoretical backgrounds for the menus of the LM-Dashboard

Fig. 1. Course progress page on the first LM-Dashboard prototype

3.2 Research Procedure and Methods

 To achieve this research objective, the study was conducted according to the following procedure. First, the LM-Dashboard prototype was developed to embody the visualization strategies of MOOC dashboards for promoting individual students' SRL, based on the design guidelines for visualizing MOOC dashboards from the learning analytics perspective in Table 2. Then, to empirically validate the visualization strategies and evaluate the perceived effectiveness of the LM-Dashboard prototype, a Phase I usability study was conducted with eight educational technology professionals to validate the visualization strategies and improve the usability of the LM-Dashboard. The Phase I study focused on the validity of each visualization item and gathered additional comments for improving the visualization strategies.

Fig. 2. Research procedure and methods

 After gathering comments from the Phase I usability study, the LM-Dashboard prototype was revised and redesigned based on its results. The Phase II usability study was then conducted with the revised version of the LM-Dashboard to improve the student user experience and evaluate the perceived SRL effectiveness in an iterative manner. Accordingly, students participated in the Phase II usability study as users of the LM-Dashboard. From a user-centered design perspective, iterative usability evaluations with both experts and users contribute to making final products more user-friendly [43]. Therefore, in this study, user-centered design methods were adopted to validate and improve the user experience of the LM-Dashboard from both expert and user perspectives.

3.3 Participant Selection & Instruments

 In the Phase I study, eight educational technology professionals were recruited. All eight participants are experts in their fields, holding a PhD in educational technology or computer science education, and have been working in educational technology sectors for more than five years. Most of them also have experience with MOOC design and development research.

The instrument for evaluating the LM-Dashboard prototype in the Phase I study was a questionnaire focused on validating the visualization strategies integrated into the MOOC dashboard for assisting learners with SRL. The questionnaire used a 5-point Likert scale for validity, with open-ended fields for additional comments on whether each visualization item on the dashboard pages can help promote SRL. The pages of the prototype were captured, and the visualization items on each page were explained based on previous studies.

Table 4. The theoretical backgrounds for the menus of the LM-Dashboard

 In the Phase II study, students participated in a usability evaluation of the LM-Dashboard prototype revised according to the expert reviews from the Phase I study. Nielsen [44] found that five users can uncover more than 85% of the usability problems in a design, so he recommends that multiple iterative usability evaluations with five users each are more effective than a single usability evaluation with many participants. In this study, to achieve the usability objectives of the LM-Dashboard prototype, two evaluations, with experts and with users, were conducted in an iterative manner, and the number of participants in the Phase II study was set at six users, at which point about 90% of usability problems can be expected to be found according to Nielsen's [44] curve. A consent form was collected with the questionnaire, and a small incentive was offered to the participants.
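
 The figures above follow Nielsen's problem-discovery model, in which the proportion of usability problems found by n evaluators is estimated as 1 − (1 − λ)^n, with λ ≈ 0.31 being the average probability that a single user uncovers any given problem. The short check below reproduces the roughly 85% (five users) and 90% (six users) values cited above.

```python
# Nielsen's problem-discovery model: proportion of problems found by n users.
lam = 0.31  # average per-user discovery probability (Nielsen & Landauer)
for n in (5, 6):
    found = 1 - (1 - lam) ** n
    print(f"{n} users -> {found:.1%} of problems found")
# Prints ~84.4% for 5 users and ~89.2% for 6 users, i.e. the ~85% and ~90%
# figures commonly cited from Nielsen's curve.
```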

Table 5. Learners’ profile

3.4 Data Analysis

 In the Phase I study, the quantitative validity-scale data were analyzed with descriptive statistics (means and standard deviations). Content analysis was adopted to analyze the educational technology professionals' qualitative opinions. Since the purpose of this study is to validate the design features of the LM-Dashboard visualization for promoting SRL, the coding scheme focused on the relationship between the visualization strategies and the SRL effects of each feature. Accordingly, the qualitative comments were categorized by the visualization items of the LM-Dashboard shown in Table 3.
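
 As a worked illustration of this descriptive analysis, the following sketch computes the mean and standard deviation of one visualization item's validity ratings. The eight ratings are hypothetical values for illustration, not the study's data.

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert ratings from the eight experts for a single
# visualization item; the values are illustrative, not the study's data.
ratings = [5, 4, 4, 5, 3, 4, 5, 4]
print(f"M = {mean(ratings):.2f}, SD = {stdev(ratings):.2f}")  # M = 4.25, SD = 0.71
```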

 In the Phase II study, the questionnaire administered with the working high-fidelity prototype was analyzed from both quantitative and qualitative perspectives. The Phase II study focused on learners' understandability, usefulness, and usability [43] and on perceived SRL effectiveness [11] [19]. Therefore, in addition to the descriptive statistics, a qualitative analysis was conducted to classify participants' comments into perceived strengths and weaknesses of the LM-Dashboard in terms of SRL and visualization.

3.5 Reliability and Validity

 Peer review through debriefing with one another as collaborative researchers was adopted as a qualitative verification technique [45]. Throughout the research, the collaborating researchers held regular in-depth discussions on the features and issues of the research process. In addition, this study validated the visualization strategies integrated into the MOOC dashboard for promoting learners' SRL by triangulating data through both quantitative and qualitative study with diverse methodological approaches, involving both experts in the field and MOOC users (students) [46].

3.6 Limitations

 This study's limitations include its confined context and the small groups of educational professionals and learner participants. The context and participants are not representative of MOOC users worldwide, so these small samples limit generalizability to all MOOC users. Accordingly, this study did not aim to generalize its results, but to demonstrate how a MOOC dashboard might promote SRL for users and to validate the visualization strategies (principles) and the dashboard (practice) in MOOC environments from the perspectives of educational professionals and users. Future research directions for developing the visualization guidelines (principles) and dashboard design strategies (practice) for learners' SRL are discussed at the end of this paper.

4. Results

4.1 Phase I Study: Validity evaluation on the visualization strategies from experts

 The educational technology experts' responses on the five-point Likert scale validating the MOOC dashboard designs based on the visualization strategies to promote SRL, together with their qualitative comments, are presented in Table 6. The descriptive statistics, including means (M) and standard deviations (SD), on the validation scale showed that the experts rated most of the visualization items on the LM-Dashboard prototype as highly valid for promoting SRL, with means above 4.0, except for four items (1.2, 2.4, 2.5, 3.3).

 The reasons for the four low-rated items can be derived from the qualitative comments. First, five experts argued that social comparisons might not be very effective for the course progress feature compared with social comparison of achievements. Furthermore, four experts commented that individual comparisons across the social network might not be very positive (items 1.2 and 3.3), arguing that they could also invade students' private information. The experts' concerns about social comparison may have arisen because no accurate information was provided about the criteria for how the social comparison would be calculated or who would be compared in the learning analytics. In fact, the experts' inquiries concerned the accuracy of the criteria on the x-axis and y-axis of the graphs rather than the visualization features themselves. On the learning activities page, the difference between the percentage of participation and the percentage of effort on the y-axis was ambiguous, so five experts said that the y-axis values should be clearer.

Table 6. Expert evaluation on the LM-Dashboard prototype to promote SRL

 Second, most of the experts noted that reflective activities might not be conducted often enough to be analyzed and visualized in a word cloud (items 2.4, 2.5). This is because, in current MOOC environments, reflection is not a main learning activity that students are encouraged to complete. From this perspective, the analysis again revealed that experts evaluated the information derived from the learning analytics rather than the visualization features themselves.

 Most experts concurred on the following points. First, they thought that the traffic light and digital badges might be very effective for encouraging students' SRL, but some experts were uncertain about which type of graph would be more intuitive; for instance, one expert argued that a radar graph might not be very informative. Since the experts were not visualization professionals, it seemed difficult for them to judge the visualization features in a professional way. Second, they recommended that a simple information architecture and design on the dashboard might be more effective and intuitive than too much information. Finally, it is noteworthy that one expert insisted that quantitative learning records in MOOC environments are not as valuable as those in offline and other online learning, because massiveness and openness can lead to less rigorous evaluation by instructors.

 The revisions made to the LM-Dashboard prototype based on the Phase I results are as follows:

  • On the course progress page, detailed weekly information such as weekly learning records and activity records was moved into a pop-up that appears when a specific week is clicked. Other repetitive features and data were deleted.
  • On the learning activity page, the criteria for the y-axis values of each activity are shown in an intuitive way, for example, time spent watching video clips and the number of posts in discussions and Q&A, so that students can compare their individual progress against their plans.
  • On the learning evaluation page, private information in individual comparisons was encrypted.

4.2 Phase II Study: Learner’s User experience and perceived SRL effectiveness evaluation

 The questionnaire items evaluating learners' experiences with the second LM-Dashboard asked whether it was useful, understandable, and usable for their SRL. Table 7 presents learners' ratings on the survey questions about user experience for each visualization feature of the LM-Dashboard. The average scores range from 4.00 to 4.33, which is relatively high. These results indicate that the visualization features and information on the LM-Dashboard offered a positive learning experience.

Table 7. Learners’ ratings of user experience evaluation questions on the LM-Dashboard

  Table 8 presents the learners' ratings on survey questions about their perceived SRL effectiveness when using the LM-Dashboard. These questions consisted of seven summative evaluation items on the LM-Dashboard prototype in terms of perceived SRL effectiveness, developed based on previous studies [18] [19]. The average scores range from 2.67 to 4.33. As shown in Table 8, three questions (Q1, Q2, Q7) had means higher than 4.0, indicating that some aspects of SRL, such as monitoring learning progress, evaluating learning outcomes, and managing time effectively, would be improved through the LM-Dashboard experience. However, four items (Q3, Q4, Q5, Q6) had means lower than 4.0, which can be interpreted to mean that those SRL aspects were not sufficiently addressed in the LM-Dashboard. The reasons are analyzed in alignment with the findings from the qualitative comments at the end of this section.

Table 8. Learners’ ratings of perceived SRL effectiveness evaluation on the LM-Dashboard

 The results from the analysis of the qualitative comments are presented in Appendix 1. The learners' feedback on the open-ended questions evaluating user experience and perceived SRL effectiveness was analyzed by classifying it into perceived strengths and weaknesses of the LM-Dashboard in terms of SRL and visualization. The learners' qualitative comments on the strengths and weaknesses of the prototype support the validity of some of the design guidelines for visualizing a MOOC learning dashboard to promote SRL based on learning analytics. Regarding the strengths, most learners noted the positive impact of the LM-Dashboard's design features on aspects of their SRL, such as the ability to monitor and evaluate their learning progress against their plans at a glance, the usefulness of seeing which parts they have to study more, and the sense of accomplishment and motivation gained through digital badges and achievement reports.

 However, from the weaknesses of the prototype noted by learners, the following suggestions can be made for improving the visualization design guidelines for MOOC dashboards. First, Learners A, C, and F pointed out that social comparison of personal scores can demotivate students; they suggested that the MOOC dashboard should show only the average score of peers or the achievement level of the learner's own step-by-step goals for SRL. In fact, the social comparison feature was one of the topics on which opinions were divided. Second, on the course progress page, Learner B mentioned that it would be more useful if peer reviews or comments on their postings were added. Third, on the learning activities page, Learner F reported that appropriate instructor tips could help students move from the 'attention' level to the 'excellent' level of performance. Fourth, Learners B and C requested concrete and accurate criteria for acquiring badges or gaining scores. Additionally, Learner E noted that a clear description of the figures in the graphs, understandable at a glance, was needed. Regarding the overall user interface, learners suggested eliminating complex information, increasing readability, and adjusting the amount of information on one page.

 In summary, most learners reported positive learning experiences with the second LM-Dashboard in terms of understandability, usability, and usefulness, but they expressed mixed opinions about its perceived SRL effectiveness. Learners gave high ratings on its help in managing their own learning progress, evaluating learning outcomes, and managing time, but low ratings on promoting confidence to achieve their learning goals and keep their learning on track. Regarding motivation, the learners' average rating was 3.67, below 4.0 yet still a promising mean score. These results show that the second LM-Dashboard prototype provided learners with a positive experience in monitoring their learning progress, evaluating themselves, and managing their learning time, which supports parts of the SRL competencies [18] [19]. However, learners did not give confident answers about its effectiveness in keeping them on track with their plans and achieving their goals without giving up. They thought that the analyzed and visualized information about their learning data on the LM-Dashboard prototype would not be enough to promote SRL. The reasons can also be found in the qualitative analysis of the open-ended comments: most learners requested prescriptive tips and recommendations based on their learning progress and status, in addition to the visualized analytics. In this respect, the visualized information might be more helpful for promoting SRL when accompanied by prescriptions and recommendations.

 The features revised or suggested for the LM-Dashboard prototype visualization, according to the learner evaluation results in Phase II, are as follows:

  • On the course progress page, prescriptive tips on progress and a recommended next stage are provided to keep learners on track toward their learning objectives. In addition, detailed information on the status and date of assignment submissions is offered.
  • On the learning activity page, instructor advice or peer tips are added to motivate students whose level of learning activity is significantly lower than others'.
  • On the learning evaluation page, self-set goals or step-by-step recommended goals are presented to help students who struggle with confidence. Additionally, fair and clear achievement criteria and related information are offered.
  • On the overall interface design, optional buttons for the social comparison and badge features, as well as for the individual data analyzed, allow learners to display only the information they prefer. Duplicate, repetitive, or redundant information is minimized.

 Providing prescriptive comments on a dashboard in MOOC environments is a substantial research topic in its own right, so this feature is proposed here as a suggestion; it could not be fully implemented and revised in our LM-Dashboard prototype.

5. Discussion and Implications

 As mentioned in the earlier limitations section, the purpose of this study is not to generalize the findings, but to provide MOOC designers and learning analytics professionals with design guidelines and visualization strategies for MOOC dashboards that promote SRL.

 Overall, as discussed in the results section, both expert and learner participants responded positively to the LM-Dashboard's visualization features for promoting SRL. Expert participants validated the visualization features of the first LM-Dashboard prototype with ratings above 4.0, except for four items (1.2, 2.4, 2.5, 3.3). The four low-rated items were revised in the second LM-Dashboard prototype. Student participants then evaluated the learning experience with the second prototype's visualization features as highly positive, with mean scores above 4.0 on all three pages in terms of usefulness, understandability, and usability for their SRL.

 However, the student participants' perceived SRL effectiveness results revealed that some SRL aspects were not addressed by the LM-Dashboard. Learners judged that the visualization features positively influenced monitoring their learning progress and assessing their achievement with efficient time management, but might not help them complete their learning goals with persistence by resolving difficulties in learning. They additionally requested prescriptive tips and recommendations to keep on track and achieve their learning objectives.

 The implications of these results can be discussed as follows. First, deciding which type of graph is more understandable and intuitive to learners was a crucial and difficult task. Experts' opinions differed on which type of graph would be more effective for specific visualization features. For example, they agreed on adopting a line chart for comparisons among two or three numbers related to a student's activity plan and progress, but a slider chart was recommended instead of the traffic lights, and one expert thought that the radar chart was not informative. To decide on revisions for the second prototype, previous research on dashboard visualization was reviewed. A traffic light can emphasize an early warning signaling the learner's status better than a progress bar [47], so we decided to retain the traffic light feature, since our purpose in using it is to give learners appropriate signals about their progress. However, we varied the traffic light colors from three steps to five steps to incorporate the advantage of the slider chart. Regarding the radar graph, Lupu, Corbu, and Edelhauser [48] argued that this type of graph shows multi-dimensional data more intuitively, representing multiple metrics in a small space and enabling easy comparison through layered colors. We judged that these advantages might appeal to learners during the first revision of the prototype, and learners in Phase II indeed evaluated the radar graph as useful for viewing information at a glance. These examples reveal that each visualization type and method for individual learning analytics information has its own characteristics and merits, so designers should choose the type of visualization based on its purpose; as Santos, Govaerts, Verbert, and Duval [49] emphasized, the visualization should be linked to its intended purpose.
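
 As an illustration of the five-step traffic light adopted in the revision, the sketch below maps a learner's progress ratio (actual versus planned activity) to one of five signal levels. The thresholds and labels are assumptions for illustration, not the prototype's actual cut-offs.

```python
# Illustrative mapping of a learner's progress ratio (actual / planned) to
# the five-step traffic-light signal used in the second prototype. The
# thresholds and labels are assumptions, not the prototype's actual cut-offs.
def traffic_light(progress_ratio):
    levels = [
        (0.90, "green (on track)"),
        (0.70, "light green"),
        (0.50, "yellow (attention)"),
        (0.30, "orange"),
        (0.00, "red (early warning)"),
    ]
    for threshold, label in levels:
        if progress_ratio >= threshold:
            return label
    return "red (early warning)"

print(traffic_light(0.95))  # green (on track)
print(traffic_light(0.40))  # orange
```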

 Second, opposing opinions on social comparison were expressed by both experts and students. Some experts argued that because the purpose of the dashboard in this study is to facilitate individual students' SRL, self-defined goals and progress comparison are more important than social comparison; other experts acknowledged the merits of social comparison. Student participants were likewise divided on the peer comparison feature. Since this study did not conduct experimental or quantitative research, selecting or generalizing about a specific feature was not the focus. Davis, Jivet, and Kizilcec [50] recently found that social comparison in MOOC environments contributed to increasing students' completion rates, and Zimmerman and Campillo [19] also suggested social comparison, although their studies were done in traditional learning environments. Nevertheless, individual comparison with a specific peer group should be carefully designed, as one learner participant suggested that the average score of peers might be more informative than the individual scores of peers. In the study by Davis and colleagues [50], successful learners and role models among peer groups were used to facilitate social comparison and provide social cues in the visualization. In addition, it might be better to give accurate criteria and guidelines on how students should interpret the social comparison in terms of SRL; Davis and colleagues [50] developed their social comparison visualization based on an analysis of role models and successful learners in a personalized feedback system. Therefore, our conclusion regarding the social comparison feature is to give students an option to toggle it on or off, drawing on the visualization criteria from Davis and colleagues [50]. Future in-depth research is suggested on the criteria for selecting peers and on more effective visualization of social comparison in MOOC environments to promote SRL.
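
 A minimal sketch of this opt-in design follows: when the learner enables social comparison, the dashboard exposes only the peer-group average rather than individual peers' scores, as one learner suggested. The function and field names are hypothetical, introduced only for illustration.

```python
from statistics import mean

# Opt-in social comparison: show only an aggregate (peer average), never
# individual peers' scores. Names and structure are hypothetical.
def comparison_view(my_score, peer_scores, show_social_comparison):
    view = {"my_score": my_score}
    if show_social_comparison and peer_scores:
        view["peer_average"] = round(mean(peer_scores), 1)  # aggregate only
    return view

print(comparison_view(82.0, [75.0, 90.0, 68.0], show_social_comparison=True))
# {'my_score': 82.0, 'peer_average': 77.7}
```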

 Third, the effectiveness of the digital badge feature was also discussed by some of the expert and student participants. In previous studies, Anderson and Staub [51] emphasized the merit of badges as an authentic assessment tool for performance-based activity, and Fain [52] argued that digital badges play a role in representing students' learning experiences. In particular, Mah [53] found that digital badges positively influenced students' motivation, planning, and retention. Accordingly, even though two experts and one learner questioned the merit of the digital badge, the final prototype retained the feature, weighing the divided opinions among experts and learners against the findings of previous studies. Whether or not digital badges are effective, it is more important to visualize accurate criteria for awarding them, as two student participants requested in the usability evaluation. We suggest establishing criteria for awarding digital badges and validating their effectiveness in MOOC environments. Furthermore, personalized menu options should be developed so that students can choose to see only their preferred information.

 Fourth, learner participants requested prescriptive comments alongside the visualization of their learning progress on the dashboard. They argued that intuitive visualization of their learning patterns and analysis data would be more helpful when combined with recommendations about their learning. There are many previous studies on learning analytics and recommendations [34], but their research contexts differ in that they were not conducted in MOOC environments. MOOCs, in fact, make it possible to collect big data from diverse learners all over the world. In this respect, in-depth studies are suggested on customized prescriptions and recommendations tailored to individual learning paths or progress based on learning analytics to promote SRL.

 Finally, both experts and students asked for simple visualization of the important components from learning analytics, without repetitive content. They argued that vividly colored badges and many graphs analyzed in varied ways interfere with intuitive understanding. Therefore, it can be concluded that presenting the crucial features with simple and intuitive visualization on the dashboard is more efficient and effective.

6. Conclusion and Future Directions

 Most dashboards fail to communicate efficiently and effectively not because of inadequate technology (at least not primarily) but because of poorly designed implementations [54]. The LM-Dashboard prototype has shown how important it is to offer instructional interventions such as feedback, advice, and tips, as well as to collect and analyze the learning experiences of MOOCs and visualize each learner's progress and activities in an intuitive and efficient way, for the successful cycle of SRL. This study provides useful insights into the visual design of MOOC dashboards and into how expert validation and learner evaluation can inform continuous improvement through iterations. The findings on the LM-Dashboard prototype can help instructional designers, system developers, and UI/UX visual designers establish a fundamental understanding of designing a visualization dashboard in MOOC environments that offers meaningful quantitative and qualitative information from analyzed data on students' learning progress and activities, intuitively and effectively. In particular, this study showed that some of learners' SRL competencies improved through the visualized information and features of the MOOC dashboard based on learning analytics. The results of this study therefore offer instructional designers, system developers, and UI/UX visual designers an exemplary design and development process for MOOC dashboards.

 However, challenges remain in accommodating the individual preferences and characteristics of massive numbers of learners and in raising learners' confidence to achieve their goals without giving up when they face learning difficulties. In fact, SRL includes the (meta-)cognitive, behavioral, and affective aspects of learning [55]. While self-regulated cognitions and behaviors are critically important, self-regulated affective components seem just as significant in determining learners' attitudes and abilities for SRL [56] [57]. In this study, the learners' reactions to the LM-Dashboard prototype revealed somewhat low confidence and motivation to persist in their learning. Therefore, in addition to the visualized information, comments that promote students' affective SRL competencies are needed to enhance their self-regulated emotions and motivation in accordance with their learning progress and status.

 Future studies can collect real data from the system and evaluate user log data together with completion rates by implementing a dashboard that promotes SRL based on the design implications derived from this study. MOOCs and learning analytics are mutually reinforcing: learning behaviors can provide instructors, instructional designers, IT and HCI professionals, and institutional operators with a broad sense of the opportunities for personalization and prediction in educational big data [58]. In the near future, a customized dashboard based on state-of-the-art AI technology could address individual preferences by offering an individualized interface and information for the features that received mixed opinions in this study, such as social comparisons and digital badges. It is thus all the more important that the dashboard be designed to promote students' SRL based on quantitative and qualitative data from learning analytics in an intuitive and effective way. Research on visualizing MOOC dashboards to facilitate SRL is not limited to visual/graphic design but spans a wide range of areas, including educational psychology, instructional design, computer science, HCI, and engineering. Collaboration and interaction among these research areas are needed for its development.

References

  1. A. Littlejohn, and C. Milligan, "Designing MOOCs for professional learners: Tools and patterns to encourage self-regulated learning," eLearning, vol.42, no.4, pp.1-10, 2015.
  2. S. Halawa, D. Greene, and J. Mitchell, "Dropout prediction in MOOCs using learner activity features," Experiences and best practices in and around MOOCs, vol.7, pp.3-12, 2014.
  3. D. F. Onah, J. E. Sinclair, and R. Boyatt, "Exploring the use of MOOC discussion forums," in Proc. of London International Conference on Education, pp.1-4, November, 2014.
  4. R. Rivard, "Measuring the MOOC dropout rate," Inside Higher Ed, vol.8, 2013.
  5. T. Park, H. Cha, and G. Lee, "A study on design guidelines of learning analytics to facilitate self-regulated learning in MOOCs," Educational Technology International, vol.17, no.1, pp.1-34.
  6. K. Jordan, "MOOC completion rates: the data," 2013.
  7. R. F. Kizilcec, and E. Schneider, "Motivation as a lens to understand online learners: Toward data-driven design with the OLEI scale," ACM Transactions on Computer-Human Interaction (TOCHI), vol.22, no.2, p.6, 2015.
  8. S. Zheng, M. B. Rosson, P. C. Shih, and J. M. Carroll, "Understanding student motivation, behaviors and perceptions in MOOCs," in Proc. of the 18th ACM conference on computer supported cooperative work & social computing, pp.1882-1895, February, 2015.
  9. B. J. Zimmerman, "Becoming a self-regulated learner: An overview," Theory into practice, vol.41, no.2, pp.64-70, 2002. https://doi.org/10.1207/s15430421tip4102_2
  10. H. Qu, and Q. Chen, "Visual analytics for MOOC data," IEEE computer graphics and applications, vol.35, no.6, pp.69-75, 2015. https://doi.org/10.1109/MCG.2015.137
  11. I. Frolov, and S. Johansson, "An adaptable usability checklist for MOOCs: A usability evaluation instrument for Massive Open Online Course," Master Thesis, Department of Informatics, HCI, UMEA, 2013.
  12. I. de Waard, M. S. Gallagher, R. Zelezny-Green, L. Czerniewicz, S. Downes, A. Kukulska-Hulme, and J. Willems, "Challenges for conceptualising EU MOOC for vulnerable learner groups," in Proc. of the European MOOC Stakeholder Summit 2014, pp.33-42, 2014.
  13. S. Downes, "The quality of Massive Open Online Courses," International Handbook of E-learning, vol.1, pp.65-77, 2013.
  14. A. Creelman, U. Ehlers, and E. Ossiannilsson, "Perspectives on MOOC quality-An account of the EFQUEL MOOC Quality Project," INNOQUAL-International Journal for Innovation and Quality in Learning, vol.2, no.3, pp.78-87, 2014.
  15. A. Ho, I. Chuang, J. Reich, C. A. Coleman, J. Whitehill, C. G. Northcutt, J. J. Williams, J. D. Hansen, G. Lopez, and R. Petersen, "HarvardX and MITx: Two years of Open Online Courses Fall 2012-Summer 2014," March, 2015.
  16. M. Boekaerts, "Self-regulated learning: where we are today," International Journal of Educational Research, vol.31, pp.445-457, 1999. https://doi.org/10.1016/S0883-0355(99)00014-2
  17. P. R. Pintrich, "The role of goal orientation in self-regulated learning," Handbook of self-regulation, pp.451-502, 2000.
  18. B. J. Zimmerman, and M. M. Pons, "Development of a structured interview for assessing student use of self-regulated learning strategies," American educational research journal, vol.23, no.4, pp. 614-628, 1986. https://doi.org/10.3102/00028312023004614
  19. B. J. Zimmerman, and M. Campillo, "Motivating self-regulated problem solvers," The psychology of problem solving, pp.233-262, 2003.
  20. M. C. English, and A. Kitsantas, "Supporting student self-regulated learning in problem-and project-based learning," Interdisciplinary journal of problem-based learning, vol.7, no.2, p.6, 2013.
  21. R. A. Kuiper, and D. J. Pesut, "Promoting cognitive and metacognitive reflective reasoning skills in nursing practice: self-regulated learning theory," Journal of Advanced Nursing, vol.45, no.4, pp.381-391, 2004. https://doi.org/10.1046/j.1365-2648.2003.02921.x
  22. R. Azevedo, D. C. Moos, J. A. Greene, F. I. Winters, and J. G. Cromley, "Why is externally-facilitated regulated learning more effective than self-regulated learning with hypermedia?," Educational Technology Research and Development, vol.56, no.1, pp.45-72, 2008. https://doi.org/10.1007/s11423-007-9067-0
  23. M. Bannert, and C. Mengelkamp, "Scaffolding hypermedia learning through metacognitive prompts," International handbook of metacognition and learning technologies, pp. 171-186, Springer, New York, NY, 2013.
  24. J. Broadbent, and W. L. Poon, "Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review," The Internet and Higher Education, vol.27, pp.1-13, 2015. https://doi.org/10.1016/j.iheduc.2015.04.007
  25. X. Lin, and J. D. Lehman, "Supporting learning of variable control in a computer-based biology environment: Effects of prompting college students to reflect on their own thinking," Journal of research in science teaching, vol.36, no.7, pp.837-858, 1999. https://doi.org/10.1002/(SICI)1098-2736(199909)36:7<837::AID-TEA6>3.0.CO;2-U
  26. M. Taub, R. Azevedo, F. Bouchet, and B. Khosravifar, "Can the use of cognitive and metacognitive self-regulated learning strategies be predicted by learners' levels of prior knowledge in hypermedia-learning environments?," Computers in Human Behavior, vol.39, pp.356-367, 2014. https://doi.org/10.1016/j.chb.2014.07.018
  27. R. F. Kizilcec, M. Perez-Sanagustin, and J. J. Maldonado, "Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses," Computers & Education, vol.104, pp. 18-33, 2017. https://doi.org/10.1016/j.compedu.2016.10.001
  28. T. Anderson (Ed.), "The theory and practice of online learning," Athabasca University Press, 2008.
  29. A. Croll, and S. Power, "Complete web monitoring: watching your visitors, performance, communities, and competitors," O'Reilly Media, Inc., 2009.
  30. T. Hey, and A. E. Trefethen, "Cyberinfrastructure for e-Science," Science, vol.308, no. 5723, pp.817-821, 2005. https://doi.org/10.1126/science.1110410
  31. C. Romero, and S. Ventura, "Educational data mining: A survey from 1995 to 2005," Expert systems with applications, vol.33, no.1, pp.135-146, 2007. https://doi.org/10.1016/j.eswa.2006.04.005
  32. J. Schaffer, B. Huynh, J. O'Donovan, T. Höllerer, Y. Xia, and S. Lin, "An analysis of student behavior in two massive open online courses," in Proc. of 2016 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, pp. 380-385, August, 2016.
  33. G. Chen, D. Davis, J. Lin, C. Hauff, and G. J. Houben, "Beyond the MOOC platform: gaining insights about learners from the social web," in Proc. of the 8th ACM Conference on Web Science, pp. 15-24, ACM, May, 2016.
  34. E. Duval, "Attention please!: learning analytics for visualization and recommendation," in Proc. of the 1st international conference on learning analytics and knowledge, pp. 9-17, February, 2011.
  35. L. Ali, M. Hatala, D. Gasevic, and J. Jovanovic, "A qualitative evaluation of evolution of a learning analytics tool," Computers & Education, vol.58, no.1, pp.470-489, 2012. https://doi.org/10.1016/j.compedu.2011.08.030
  36. SAM DashBoard S/W. https://www.samlearning.com
  37. FutureLearn dashboard. https://www.futurelearn.com
  38. F. Grunewald, C. Meinel, M. Totschnig, and C. Willems, "Designing MOOCs for the support of multiple learning styles," in Proc. of European Conference on Technology Enhanced Learning, pp. 371-382, Springer, Berlin, Heidelberg, September, 2013.
  39. B. Rienties, A. Boroowa, S. Cross, C. Kubiak, K. Mayles, and S. Murphy, "Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK," Journal of Interactive Media in Education, vol.2016, no.1, pp.2, 2016. https://doi.org/10.5334/jime.394
  40. S. B. Shum, and R. Ferguson, "Social learning analytics : five approaches," in Proc. of the 2nd International Conference on Learning Analytics and Knowledge, pp. 23-33, 2012.
  41. S. Knight, and K. Littleton, "Discourse-centric learning analytics: mapping the terrain," Journal of Learning Analytics, vol.2, no.1, pp.185-209, 2015. https://doi.org/10.18608/jla.2015.21.9
  42. K. Verbert, S. Govaerts, E. Duval, J. L. Santos, F. Van Assche, G. Parra, and J. Klerkx, "Learning dashboards: an overview and future research opportunities," Personal and Ubiquitous Computing, vol.18, no.6, pp.1499-1514, 2014. https://doi.org/10.1007/s00779-013-0751-2
  43. Y. Ishikawa, and M. Hasegawa, "T-scroll: Visualizing trends in a time-series of documents for interactive user exploration," in Proc. of International Conference on Theory and Practice of Digital Libraries, pp. 235-246, Springer, Berlin, Heidelberg, September, 2007.
  44. J. Nielsen, "Why you only need to test with 5 users," Nielsen Norman Group: World leaders in research-based user experience, 2000.
  45. D. Henriksen, C. Richardson, and R. Mehta, "Design thinking: A creative approach to educational problems of practice," Thinking Skills and Creativity, vol.26, pp.140-153, 2017. https://doi.org/10.1016/j.tsc.2017.10.001
  46. S. Patton, "Admissions professionals ask: Are graduate schools ready for MOOCs?," The Chronicle of Higher Education, April 26, 2013.
  47. D. Gasevic, S. Dawson, and G. Siemens, "Let's not forget: Learning analytics are about learning," TechTrends, vol.59, no.1, pp.64-71, 2015.
  48. L. Lupu, E. C. Corbu, and E. Edelhauser, "Dashboards and Radar Charts, Performance Analytics Instruments in Higher Education," in Proc. of International Conference on Current Economic Trends in Emerging and Developing Countries (TIMTED-2017), Timisoara, May, 2017.
  49. J. L. Santos, S. Govaerts, K. Verbert, and E. Duval, "Goal-oriented visualizations of activity tracking: a case study with engineering students," in Proc. of the 2nd International Conference on Learning Analytics and Knowledge, pp.143-152, Vancouver, Canada, April 29 - May 02, 2012.
  50. D. Davis, I. Jivet, R. F. Kizilcec, G. Chen, C. Hauff, and G. J. Houben, "Follow the successful crowd: raising MOOC completion rates through social comparison at scale," in Proc. of the Seventh International Learning Analytics & Knowledge Conference, pp.454-463, ACM, March, 2017.
  51. D. M. Anderson, and S. Staub, "Postgraduate digital badges in higher education: Transforming advanced programs using authentic online instruction and assessment to meet the demands of a global marketplace," Procedia-Social and Behavioral Sciences, vol.195, pp.18-23, 2015. https://doi.org/10.1016/j.sbspro.2015.06.165
  52. P. Fain, "Badging From Within," Changing Student Pathways Washington DC: Inside Higher Ed. 2014.
  53. D. K. Mah, "Learning analytics and digital badges: potential impact on student retention in higher education," Technology, Knowledge and Learning, vol.21, no.3, pp. 285-305, 2016. https://doi.org/10.1007/s10758-016-9286-8
  54. S. Few, "Information dashboard design: The effective visual communication of data. Sebastopol," CA: O'Reilly Media, Inc., 2006.
  55. E. Panadero, "A review of self-regulated learning: six models and four directions for research," Frontiers in psychology, vol.8, p. 422, 2017. https://doi.org/10.3389/fpsyg.2017.00422
  56. A. R. Artino, and K. D. Jones, "Exploring the complex relations between achievement emotions and self-regulated behaviors in online learning," The Internet and Higher Education, vol.15, no.3, pp. 170-175, 2012. https://doi.org/10.1016/j.iheduc.2012.01.006
  57. A. Ben-Eliyahu, and L. Linnenbrink-Garcia, "Extending self-regulated learning to include self-regulated emotion strategies," Motivation and Emotion, vol.37, no.3, pp.558-573, 2013. https://doi.org/10.1007/s11031-012-9332-3
  58. J. Knox, "From MOOCs to Learning Analytics: Scratching the surface of the 'visual'," eLearn, November, 2014.