The Development of Digital Age Literacy: A Case Study in Indonesia

  • MUJTAHID, Iqbal Miftakhul (Public Administration Study Program, Universitas Terbuka) ;
  • BERLIAN, Mery (Agribusiness Study Program, Universitas Terbuka) ;
  • VEBRIANTO, Rian (Department of Master of Elementary School Teacher Education, Universitas Islam Negeri Sultan Syarif Kasim Riau) ;
  • THAHIR, Musa (Department of Mathematics Education, Universitas Islam Negeri Sultan Syarif Kasim Riau) ;
  • IRAWAN, Dedi (Physics Education Department, Universitas Riau)
  • Received : 2020.11.05
  • Accepted : 2021.01.15
  • Published : 2021.02.28

Abstract

This research aims to develop a feasible and reliable digital-age literacy instrument to be used in the learning process. This is necessary because the respondents involved in this research come from different regions, ethnic groups, and genders. The digital-age literacy developed in this research consists of eight constructs: 1) basic literacy, 2) scientific skill, 3) economic skill, 4) information skill, 5) technology skill, 6) visual skill, 7) various cultural skills, and 8) global awareness. A total of 650 respondents were selected through stratified random sampling in this survey. The respondents were students at Universitas Terbuka, stratified by gender and ethnicity. To assess internal consistency, the data were analyzed using SPSS version 23.0 for Windows. All questionnaire constructs were found to be valid and reliable, as shown by a high mean Cronbach's alpha reliability value (0.816 > 0.6), with each item also showing a high value (0.778-0.841). These results prove that this research has produced a quality instrument that can be used to evaluate students' mastery of digital-age literacy in the learning process at Universitas Terbuka and, more broadly, in Asia, especially in Indonesia.


1. Introduction

In the 21st century, technological development has accelerated with the arrival of the Industrial Revolution 4.0, and the flow of globalization can no longer be stopped. The 21st century is a century in which significant global development occurs in the technology and information sector (Rahayu et al., 2018). This century is marked by globalization and tight competition, so excellent knowledge and skills are required. The skills needed in the 21st century include information and communication technology and information literacy (digital-age literacy) as working tools (Griffin & Care, 2015). Digital-age literacy is defined as an individual's ability to use digital devices to find and select information, think critically, be creative, collaborate with other people, communicate effectively, attend to electronic safety, and understand socio-cultural context (Hague & Payton, 2010). The use of information technology also has an impact on economic effectiveness and company conditions (Dwirandra & Astika, 2020).

Hague and Payton conducted educational research finding that good digital-age literacy plays a role in developing an individual's knowledge of particular lesson material by encouraging curiosity and creativity (Akbar & Anggaraeni, 2017). A survey of 424 tenth- and twelfth-grade students in rural Canada regarding digital-age literacy activity found that the frequency of activities using digital technology potentially increased both inside and outside the classroom (Briere, 2015). Furthermore, another survey of 60 university students showed that most of the students admitted to the university were able to use social media, email or Skype, and the internet as members of online communities; however, their competence in applying technology in the learning process was still lacking (Shopova, 2014). This is supported by a survey carried out by the Ministry of Communication and Information in cooperation with UNICEF, which found that 79.5% of children and teenagers aged 10-19 throughout Indonesia, representing both rural and urban areas, were internet and digital media users (Febliza & Oktariani, 2020). This has a significant effect on the transition process of children, as children and teenagers search for information on the internet. Thus, the digital era requires teachers to follow developments in science and technology so that the learning process keeps pace with current developments and is in sync with students' needs. However, the phenomenon in the educational environment indicates that teachers still lag behind the rate of modernization; many still use products of the 1980s and apply educational systems of the twentieth century, causing a significant gap between teachers and students (Susilo & Sarkowi, 2018). Digital-age literacy involves life skills that go beyond the ability to use technology, information, and communication devices; it also involves the ability to socialize, to learn and behave, and to think critically, creatively, and in an inspirational way to meet the requirements of the digital age.

Research conducted by Kharisma concluded that senior high school teachers in Surabaya City had high digital-age literacy ability (Kharisma, 2017). Other research found that junior and senior high school teenagers had high ability in internet searching, hypertextual navigation, and knowledge assembly, while their digital-age literacy in content evaluation was still categorized as low (A'yuni, 2015). Other studies examined the significant relationship between digital literacy and both self-directed learning and e-resource use (Akbar & Anggaraeni, 2017; Nurjanah et al., 2017), indicating that when a student's literacy level is high, his or her learning independence and mastery of electronic sources also increase. Based on the research above, it can be understood that through digital-age literacy, a person can access, manage, integrate, analyze, and evaluate information, build new knowledge, and create and communicate with other people, and thus participate actively in the community. According to Potter's concept (Widyastuti et al., 2016), the effort to build a digitally literate community is not merely about introducing digital media but also about synergizing it with daily activities to improve productivity. Furthermore, according to Sholihah, digital-age literacy is the effort to find, use, and distribute information effectively (Sholihah, 2016).

Digital literacy has become an essential factor in distance learning (PJJ) during the Covid-19 pandemic. Digital competence and literacy in using computers and navigating cyberspace are the necessary skills for implementing PJJ (Latip, 2020). Furthermore, Shopova stated that competency and level of ICT literacy affect the effectiveness and efficiency of the learning process (Shopova, 2014). Meanwhile, ICT literacy is more specific to the use of digital media. Clark said that ICT literacy is influenced by the generation and age of technology users: the younger generation finds it easier to manage technology than the older generation (Kirschner, 2006). In the context of the ongoing implementation of PJJ, differences in generation and age between teachers and learners can be a barrier to its smooth implementation. Therefore, the improvement and standardization of teachers' and learners' mastery of information and communication technology should be pursued by all parties involved in PJJ.

In connection with standardization, the International Technology Education Association (ITEA) released a technology literacy standard that covers various competencies and abilities of learners from 2 to 12 years of age. This standard relates to the ICT competencies that learners must demonstrate in using information and communication technology to support the learning process. Similarly, the International Society for Technology in Education (ISTE) released seven aspects related to the technology mastery standards that learners must meet in facing the digital world. The competency standards and ICT capabilities of ITEA and ISTE are closely related to the use of technology to support the learning process. Linked to the context of implementing distance learning during the Covid-19 pandemic, the ICT literacy standards developed by ITEA and ISTE can serve as a reference for teachers and learners in utilizing technology for the smooth implementation of PJJ.

In addition, Katz stated that digital-age literacy assessment needs to be embedded in the educational framework to support institutional digital literacy initiatives and individual learning guidance, as well as to define the skills and knowledge involved (Covello, 2010). Another advantage of assessment is that it provides evidence of an individual's understanding and learning (Corrigan & Gunstone, 2019). Several studies focus on assessing students' skills, such as assessing successful intelligence (Vianney Mitana et al., 2019) and critical thinking skills (Dwi Saputra et al., 2018). Sariwulan et al. (2020) found that digital literacy has the greatest influence on the performance of SME entrepreneurs, both directly and indirectly. However, few studies have developed instruments to measure digital literacy skills. Based on the explanation above, an assessment instrument for digital-age literacy is needed; thus, the present research aimed to measure the digital literacy level of university students. A university cannot follow developments in science and technology without adequate potential and prepared human resources. Therefore, a digital-age literacy instrument needs to be developed to identify the preparedness of a university and its students for the digital era of Industrial Revolution 4.0.

2. Literature Review

The abundance of digital information resources today is a result of advances in information technology and the internet. On the other hand, information technology development is like the two sides of a coin, affecting the community both positively and negatively. Digital-age literacy learning therefore cannot be ignored. This demand produces the view that digital-age literacy is also essential in education. Moreover, education in the 21st century demands that educational institutions respond to the changes and developments of the era by mastering information technology, also called digital-age literacy. According to NCREL, the Metiri Group, and Burkhardt et al., literacy in the digital era is not limited to reading ability; it is also associated with listening, writing, and speaking abilities (Hasibuan & Ariyanti, 2019).

Stokes divided the terminology of literacy into four definitions based on theory. First, reading and writing ability is a requirement for someone involved in social interaction. Second, literacy comprises reading, writing, and counting ability. Third, it refers to the quality of the intellectual, educated individual who can participate fully in social activities, whether in the social, economic, political, or cultural sector. Fourth, literacy is a characteristic of a certain social or cultural group (Yanti, 2016). The definition of literacy has developed through the times and is broadly defined according to the context in which it is used. It is therefore no wonder that media literacy (Sibii, 2009), visual literacy (Jones, 2009), critical literacy (Roberts, 2000), and information literacy (Bawden, 2001) have appeared. In its fullest form, literacy has several types, including (a) minimal literacy; (b) conventional literacy; (c) basic literacy; (d) functional literacy; (e) restricted literacy; (f) vernacular literacy; (g) elite literacy; and (h) multiliteracy (Wiley, 2008). However, digital-age literacy is different from other literacies because it is related to interconnected local and global ecologies, and it is neither monocultural nor static (Hawisher et al., 2006). Despite these differences, basic literacy capabilities greatly affect one's connectivity to the digital environment (Warschauer, 2007).

Digital-age literacy is the ability to read, write, and work with various digital texts/objects present in a digital environment. When information is presented in digital form, it can be manipulated (Ensmenger, 2012). Digital-age literacy is not a new topic, as many research projects have elaborated on it. Research conducted at the University of Qatar shows that searching for information in a database is considered by students to be one of the essential skills they must have (Azmi, 2006). However, online skills also include the ability to look for information on the internet, which may be the reason for differences among students (Santos & Pedro, 2013).

Digital-age literacy skills need to be improved so that universities have lecturers and students who are ready with competencies in the IT field, as described previously. The literacy movement is defined as more than just reading and writing; it includes the ability to think critically in processing information that has been read or will be written (Asari et al., 2019). Furthermore, digital-age literacy is not limited to the technical competence of applying technology to find information; it also encompasses the skills of reading, understanding, criticizing, and presenting knowledge/information obtained through technological devices and the internet networks that support them (Kurnianingsih et al., 2017). Digital-age literacy is not as simple as understanding technology and information as supporting devices; rather, it requires a sufficient connection between teachers'/students' experiences of using information technology inside and outside the classroom (Buckingham, 2015). Thus, it can be concluded that digital-age literacy is the process and ability of teachers/students in processing and presenting information from and through digital technology, connected to their experience inside and outside the classroom.

Digital-age literacy is one of the critical aspects of supporting education in the 21st century. Education in the 21st century aims to encourage students to master essential and useful 21st-century skills so that they are responsive to the changes and developments of the era (Afandi et al., 2016). The 21st-century knowledge era is characterized by comprehensive interconnection between fields of science, and digital-age literacy is expected to have consequences for education (Sudarisman, 2015). Through digital-age literacy, reading, listening, and writing can be done with digital media such as computers, the internet (blogs, social media, and websites), and smartphones. Students can be invited to distinguish between false and true news spread on the internet. In addition, students need to be shown the addresses of useful learning sites and how to use them; this greatly facilitates them in finding information related to courses they consider very dull, and after learning to use digital media, they become increasingly enthusiastic about learning.

Therefore, the researchers of this paper determined that digital-age literacy for primary education students of Universitas Terbuka consists of eight constructs adapted from NCREL (2003): 1) basic literacy, 2) scientific skill, 3) economic skill, 4) information skill, 5) technology skill, 6) visual skill, 7) various cultural skills, and 8) global awareness. This research is significant, considering that many children and young people currently follow technological advances closely, as seen from the smartphones and other devices they use. They are used to using and exploring these devices. However, does this closeness guarantee that they will have the right level of digital-age literacy to face the challenges of the 21st century?

One study examined how critical aspects of digital literacy (i.e., SRL and EC) relate to the learning benefits students gain when using the internet to investigate everyday public health and science topics (Greene et al., 2014). Furthermore, some studies focus on measuring the difference before and after digital literacy education through pre- and post-tests and surveys (Lee, 2014). Test development has also attempted to measure students' ability to handle digital information, communicate, and collaborate during problem-solving (Siddiq et al., 2017). Another study assessed the psychometric properties of a newly tested self-report assessment tool for media literacy, based on the twelve new media literacy (NML) skills developed by Jenkins et al. (2015).

Ainley et al. (2016) reviewed the definitions of digital and ICT literacy adopted in cross-national studies, investigated the approaches to digital literacy and ICT assessment used in these studies, and articulated the criteria that should guide the development of global measures of digital and ICT literacy skills. Another study investigated digital literacy among junior high school students to compare participants' perceptions of their digital literacy competence with their actual performance on relevant digital assignments (Porat et al., 2018). Furthermore, to the best of our knowledge, no research has addressed digital literacy skills using open-ended questions.

3. Methodology

3.1. Design

This survey research was conducted to develop and produce a digital-age literacy instrument for students. Research data were collected using a questionnaire (Creswell, 2012). The questionnaire instrument was adapted from NCREL (2003), using a seven-point scale and consisting of eight constructs whose validity and reliability were assessed so that the resulting instrument is of high quality and measures what it is supposed to measure. The eight constructs are: 1) basic literacy, 2) scientific skill, 3) economic skill, 4) information skill, 5) technology skill, 6) visual skill, 7) various cultural skills, and 8) global awareness. This was done to evaluate and identify the digital-age literacy skills of the students, because this research was carried out in Asia, especially in Indonesia, which has diverse gender and ethnic characteristics.

3.2. Sample

This research involved 650 primary education students of Universitas Terbuka, both in Indonesia and abroad. With such a large number of respondents, the quality of the instrument fostered and developed by the researchers is expected to improve. Samples were selected using stratified random sampling to ensure that each member of the population had the same probability of being chosen, and the sample size is representative of the subgroups for gender and ethnicity. The fact that the respondents came from various regions, including foreign countries, big cities, and rural villages, speaks to the importance of this research for the policies that a university or institution needs to adopt.
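As an illustration of this sampling design, the sketch below shows one way a proportional stratified random sample by gender and ethnicity could be drawn with pandas. The sampling frame, column names, and 10% sampling fraction are hypothetical assumptions for demonstration, not taken from the study.

```python
# A hedged sketch of proportional stratified random sampling by gender and ethnicity.
# The DataFrame `population`, its column names, and the 10% fraction are hypothetical.
import pandas as pd

def stratified_sample(population: pd.DataFrame, strata: list[str],
                      frac: float, seed: int = 42) -> pd.DataFrame:
    """Draw the same fraction from every stratum at random."""
    return (
        population
        .groupby(strata, group_keys=False)
        .sample(frac=frac, random_state=seed)  # random selection within each stratum
    )

# Example: a hypothetical sampling frame of enrolled students.
population = pd.DataFrame({
    "student_id": range(6500),
    "gender": ["male", "female"] * 3250,
    "ethnicity": ["Malay", "Minang", "Javanese", "Batak", "Other"] * 1300,
})
sample = stratified_sample(population, ["gender", "ethnicity"], frac=0.10)
print(sample.groupby(["gender", "ethnicity"]).size())  # roughly proportional counts
```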

3.3. Data Analysis Procedure

The data were collected using the digital-age literacy questionnaire from primary education students of Universitas Terbuka in the Riau Region and processed using the Statistical Package for the Social Sciences (SPSS) version 23.0 for Windows to examine the quality of the developed instrument. The validity of the research instrument was assessed using the corrected item-total correlation, i.e., the correlation of each item with the total score of its dimension or construct excluding that item, while the reliability index was obtained using Cronbach's alpha. Items were considered valid if the corrected item-total correlation was at least 0.3 (Nunnally, 1978), and the instrument was considered reliable if Cronbach's alpha was in the range of 0.6 to 1 (Hair et al., 2006); under these criteria, this research could produce a quality instrument.
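To make these two criteria concrete, the sketch below shows how the corrected item-total correlation and Cronbach's alpha can be computed with pandas/numpy. It is an illustrative reimplementation, not the authors' SPSS output; the simulated seven-point responses and item names are assumptions.

```python
# Illustrative computation of the two statistics used in this study:
# corrected item-total correlation and Cronbach's alpha.
import numpy as np
import pandas as pd

def corrected_item_total(df: pd.DataFrame) -> pd.Series:
    """Correlate each item with the sum of the remaining items in its construct."""
    total = df.sum(axis=1)
    return pd.Series({col: df[col].corr(total - df[col]) for col in df.columns})

def cronbach_alpha(df: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = df.shape[1]
    item_vars = df.var(axis=0, ddof=1)
    total_var = df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical seven-point Likert responses for one construct from 650 respondents.
rng = np.random.default_rng(0)
construct = pd.DataFrame(rng.integers(1, 8, size=(650, 4)),
                         columns=["item1", "item2", "item3", "item4"])
print(corrected_item_total(construct))  # keep items with values of at least 0.3 (Nunnally, 1978)
print(cronbach_alpha(construct))        # accept constructs with alpha of at least 0.6 (Hair et al., 2006)
```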

4. Results

4.1. Instrument Development

The development of the digital-age literacy instrument for primary education students used a three-stage approach: Stage 1, scale identification; Stage 2, writing the individual items of the scale; and Stage 3, field testing the items, followed by item analysis and validation procedures. The stages carried out by the researchers are explained as follows.

Table 1: Digital-Age Literacy Instrument Constructs and Aspects Assessed


Stage 1 – Scale Identification and Development. Stage 1 contained three steps leading to scale identification and development. The first step was to review the literature related to digital-age literacy instruments for students. This step is essential because it identifies the main components considered by researchers, educators, and practitioners to be the digital-age literacy needed in this challenging era. The second step was to conduct focus group discussions with experienced lecturers to obtain advice concerning digital-age literacy; the researchers also asked for approval and confirmation of the constructs and items fostered in the instrument. The third step was to classify and rearrange the newly developed scale related to digital-age literacy, as suggested by the experts. Digital-age literacy in the present study includes eight primary components: 1) basic literacy, 2) scientific skill, 3) economic skill, 4) information skill, 5) technology skill, 6) visual skill, 7) various cultural skills, and 8) global awareness.

Stage 2 – Writing Individual Items. A set of questionnaire items was compiled based on the digital literacy instrument, where each component consists of constructs that serve as guidelines for developing the questionnaire items introducing digital-age literacy to the students. The full set of digital-age literacy items was presented to an expert panel to ensure the construct and content validity of the instrument.

Stage 3 – Analysis of Instrument Validity and Reliability. One of the essential stages in this research is designing a measurement instrument complemented with validity and reliability tests. Construct validity was used to see the extent to which the instrument describes the theoretical construct to be measured by carrying out trials (Setyawati et al., 2017). Emory noted that construct validity can be calculated by considering the correlations between the research data and existing measurement methods, in the form of the convergent-discriminant method, factor analysis, and multi-method analysis (Fahruna & Fahmi, 2017). In addition, the validity coefficient ranges from −1.00 to +1.00; a coefficient of +1.00 indicates that the items on the test instrument and the construct test yield essentially the same results.

In contrast, a validity coefficient of 0 indicates that there is no relationship between the instrument and its construct, while a high validity coefficient means the instrument is considered acceptable (Yusup, 2018). Instrument validity in this study was assessed using the corrected item-total correlation, i.e., the correlation of each item with the total score of its dimension or construct excluding that item.

The reliability of the instrument was also tested by examining the composite reliability of the indicator blocks measuring each construct and the Cronbach's alpha coefficient. Mallery and George (2016) revealed that a questionnaire reliability above 0.7 indicates an acceptable level of consistency (Aksoy & Gökçe, 2016). Cronbach's alpha values range from 0 to 1, with higher values indicating stronger reliability (Straub & Gefen, 2004). Meanwhile, Morris stated that researchers generally accept Cronbach's alpha values of 0.60 or higher (Zettel, 2001).
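For reference, the Cronbach's alpha statistic referred to throughout this section is conventionally computed with the standard textbook formula below (a general expression, not reproduced from the paper):

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
\]

where \(k\) is the number of items in a construct, \(\sigma^{2}_{Y_i}\) is the variance of item \(i\), and \(\sigma^{2}_{X}\) is the variance of the total construct score; values closer to 1 indicate stronger internal consistency.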

4.2. Analysis of Instrument Validity

A total of 650 primary education students of Universitas Terbuka were involved in this research in May 2020. The respondents can be described in terms of ethnicity and gender as follows.

Based on the data in Figure 1, the number of respondents by gender shows that male and female students were equal in number, each totaling 325 people or 50%. Thus, the respondents by gender met the margin of error requirements.


Figure 1: Graph of Respondents’ Gender Comparison (%)

Figure 2 shows the number of respondents by ethnicity: Malay, Minang, Javanese, Batak, and others. Malay respondents numbered 160 (25%); Minang, 80 (12%); Javanese, 220 (34%); Batak, 100 (15%); and others, 90 (14%). From this diagram, we can see that most respondents came from the Javanese and Malay ethnic groups.


Figure 2: Graph of Respondents’ Ethnicity Comparison (%)

In addition to the respondent data based on gender and ethnicity, the researchers also analyzed the instrument's validity using corrected item-total correlation values. The results of the instrument validity test are presented in Table 2 below.

Table 2: Instrument Validity Using Corrected Item-Total Correlation Values for Each Study Construct


Table 2 shows that the r-table value is 0.080, obtained from the critical value table with 648 degrees of freedom (df) for the 29 questionnaire items distributed in the trial. From the overall calculation, all items are declared valid because r-count > r-table, so all question items can be used to measure the digital-age literacy of primary education students of Universitas Terbuka.
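As a side note on where such a critical value comes from, the sketch below derives a two-tailed critical r from the t distribution for a given df. It is an illustration under the assumption of a 5% significance level; because printed r-tables list df in coarse steps, the tabulated 0.080 used in the paper differs slightly from the exact value for df = 648 (about 0.077).

```python
# A sketch of converting a critical t value into a critical Pearson r for df = n - 2.
from math import sqrt
from scipy.stats import t

def critical_r(df: int, alpha: float = 0.05) -> float:
    """Two-tailed critical Pearson correlation for df = n - 2 degrees of freedom."""
    t_crit = t.ppf(1 - alpha / 2, df)       # critical t value
    return t_crit / sqrt(t_crit ** 2 + df)  # convert t to the equivalent r

# For n = 650 respondents, df = 648; an item is valid when its r-count exceeds the critical r.
print(round(critical_r(648), 3))
```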

4.3. Analysis of Instrument Reliability

In developing the digital-age literacy instrument for primary education students of Universitas Terbuka, each item's internal consistency was assessed. This is a measure of the extent to which the items on a scale measure the same construct as the other items on that scale. The following presents the reliability analysis results using Cronbach's alpha coefficient for the digital-age literacy questionnaire of primary education students of Universitas Terbuka.

Table 3: Cronbach Alpha Reliability Index for Each Study Construct


Table 3 presents the Cronbach's alpha reliability index for each study construct, with values of 0.819, 0.816, 0.876, 0.877, 0.871, 0.874, 0.869, and 0.875, respectively, together with the overall alpha value. The reliability value (α) was greater than 0.60 for each construct studied. Thus, the reliability analysis indicates that all constructs are reliable, meaning that this instrument measures what it is intended to measure and can be used to measure and evaluate the digital-age literacy of primary education students of Universitas Terbuka.

5. Discussion

The online questionnaires showed that the most dominant digital-age literacy skill among the primary education students was the multicultural construct (the skills of various cultures) at 78.97%, while the lowest was the basic literacy construct at 62.63%. Digital-age literacy is critical in finding information and putting that information to use. Therefore, the students involved in this research must be able to access and evaluate information: access information efficiently and effectively, evaluate information critically and competently, use and manage information, use information accurately and creatively to solve the problems encountered, manage information flows from various sources, and apply a basic understanding of the ethical/legal issues surrounding access to and use of information. Learning in the 21st-century digital era, such as independent learning and collaborative learning, places students and society at the center of the learning process while accommodating differences between students, with learning levels adjusted to the individual and his or her abilities, preferences, and needs (Eyal, 2012). In an educational context, this development provides opportunities for teachers and learners to bring new nuances to studying and to learning about social interaction and professional work. Thus, digital-age literacy becomes the foundation and core knowledge for lecturers and students to develop other skills further. It is similar to the old saying that reading is a window to knowledge. With this skill, students and lecturers can distinguish between real news and hoaxes. In an era where information is easily obtained digitally, the hoax problem means that much information must be clarified and evaluated to determine whether it is correct or a hoax. Thus, it is clear that digital-age literacy skills protect individuals, their families, and the country, because people with good digital-age literacy skills will avoid hoaxes in the form of news or information.

Digital-age literacy has significance for teaching and learning. From the teacher side, mastering digital-age literacy provides convenience and effectiveness in planning, implementing, and evaluating learning programs. The use of digital literacy also has an impact on economic growth, which leads to financial stability, especially in savings, investment, insurance, and entrepreneurship (Monsura, 2020). From the learner side, living in the current millennial era is advantageous because capable digital technology allows college assignments to be typed on a computer and makes learning resources from all over the world available (Harjono, 2019). Thus, digital-age literacy for students is considered very important in modern times, since IT is widely used not only in learning but also in personal daily activities and for the common good. Because of the importance of digital-age literacy, students must be able to: 1) use technology as a tool for researching, organizing, evaluating, and communicating information; 2) use digital technology, communication tools, and social networks appropriately to access, manage, integrate, evaluate, and create information; and 3) possess a fundamental understanding of the ethical/legal issues surrounding access to and use of information technology (Trilling & Fadel, 2009). Students need to possess digital-age literacy skills in order to obtain and utilize media in learning on campus. Therefore, the use of digital media in learning not only helps make abstract concepts more concrete but also allows students to explore their various skills so that they can analyze and create media for learning purposes.

According to the analysis conducted, the digital-age literacy questionnaire developed for primary education students has valid and reliable constructs. Thus, the digital-age literacy instrument can be used in research to identify digital-age literacy among students in the learning process. We can acknowledge that this research instrument has been tested and meets the eligibility standards to be used and trusted to measure the digital-age literacy of primary education students. This is confirmed by research stating that instruments that are already valid and reliable can be used as measurement tools (Suratno, 2016). Furthermore, assessment instruments used by teachers must meet validity criteria and be appropriate for use (Pinilih et al., 2013). The development of the digital-age literacy instrument is one proof of the application of education standardization policy through Government Regulation (PP) No. 19/2005, Articles 63-72, and the Regulation of the Minister of National Education No. 20 of 2007 concerning Educational Assessment Standards, which stipulate that educational assessment at the tertiary level is regulated by each tertiary institution in accordance with applicable laws and regulations (Astuti et al., 2015). Implementing this policy requires each tutor to produce many assessment instruments so that the teaching and learning process can meet the established competency standards.

6. Conclusion

The results and discussion of this paper answer our question: the researchers have developed a useful and theoretically feasible instrument to measure the digital-age literacy skills of university students facing the challenges of the 21st century. In addition, this assessment instrument met the empirical eligibility criteria in the validity and reliability tests. This is proven by the fact that all questionnaire items were valid and reliable, based on the calculated Cronbach's alpha (α) value, which is greater than the threshold value (0.816 > 0.60). Based on the instrument analysis carried out at the field trial stage, the 29 questionnaire items tested met the criteria for proper use and quality. This has implications for stakeholders and policy-makers at the university in assessing and identifying the level of students' digital-age literacy skills. This is important for directing and implementing appropriate methods in the teaching and learning process in the future, to improve various student skills and uphold the quality of the university, because the quality of the students carries the name and quality of the university where they develop their knowledge and skills.

References

  1. A'yuni, Q. Q. (2015). Youth Digital Literacy in Surabaya City. Jurnal Fakultas Ilmu Sosial Dan Ilmu Politik Universitas Airlangga Surabaya, 4(2), 1-15. http://journal.unair.ac.id/literasi-digital-remaja-di-kota-surabaya-article-9195-media136-category-8.html
  2. Afandi, Junanto, T., & Afriani, R. (2016). Implementasi Digital-Age Literacy Dalam Pendidikan Abad 21 di Indonesia. Prosiding Seminar Nasional Pendidikan Sains, 113-120. https://media.neliti.com/media/publications/173402-ID-none.pdf
  3. Ainley, J., Schulz, W., & Fraillon, J. (2016). A global measure of digital and ICT literacy skills. Background paper: Creating Sustainable Futures for All.
  4. Akbar, M. F., & Anggaraeni, F. D. (2017). Technology in Education: Digital Literacy and Self-Directed Learning in Thesis Students. Indigenous: Jurnal Ilmiah Psikologi, 2(1), 28-38. https://doi.org/10.23917/indigenous.v1i1.4458
  5. Aksoy, Z., & Gokce, P. (2016). Measuring Digital Literacy Skill: Development, Reliability, and Validity of Open-ended Test. International Journal of Educational Research Review, 1(1). https://doi.org/10.24331/ijere.309958
  6. Asari, A., Kurniawan, T., Ansor, S., Bagus, A., & Rahma, N. (2019). Digital Literacy Competencies for Teachers and Students in Malang Regency Schools. BIBLIOTIKA : Jurnal Kajian Perpustakaan Dan Informasi, 3(2), 98-104.
  7. Astuti, W. P., Wibawanto, H., & Khumaedi, M. (2015). Development of Competency-Based Facial Skin Care Practices Performance Assessment Instruments at Semarang State University. Journal of Educational Research and Evaluation, 4(2), 8-14.
  8. Azmi, H. (2006). Teaching Information Literacy Skills: A case study of the QU-core program in Qatar University. Innovation in Teaching and Learning in Information and Computer Sciences, 5(4), 145-164. https://doi.org/10.11120/ital.2006.05040145
  9. Bawden, D. (2001). Information and Digital Literacies: A Review of Concepts. Journal of Documentation, 57(2), 218-259. https://doi.org/10.1108/EUM0000000007083
  10. Briere, J. (2015). Rural High School Students' Digital Literacy. Journal of Literacy and Technology, 16(2).
  11. Buckingham, D. (2015). Defining Digital Literacy: What do Young People Need to Know About Digital Media? Nordic Journal of Digital Literacy, 2015(4), 21-34. https://doi.org/10.18261/ISSN1891-943X-2015-Jubileumsnummer-03
  12. Corrigan, D., & Gunstone, R. (2019). Valuing Assessment in Science Education: Pedagogy, Curriculum, Policy. In Springer (Vol. 53, Issue 9). Springer. https://doi.org/10.1017/CBO9781107415324.004
  13. Covello, S. (2010). A Review of Digital Literacy Assessment Instruments. In Syracuse University. http://www.apescience.com/id/wp-content/uploads/DigitalLiteracyAssessmentInstruments_Final.pdf
  14. Creswell, J. W. (2012). Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research (4th ed.). Pearson Education, Inc.
  15. Dwi Saputra, M., Joyoatmojo, S., & Kusuma Wardani, D. (2018). The Assessment of Critical-Thinking-Skill Tests for Accounting Students of Vocational High Schools. International Journal of Educational Research Review, 3(4), 85-96. https://doi.org/10.24331/ijere.453860
  16. Dwirandra, A. A. N. B., & Astika, I. B. P. (2020). Impact of Environmental Uncertainty, Trust and Information Technology on User Behavior of Accounting Information Systems. The Journal of Asian Finance, Economics and Business, 7(12), 1215-1224. https://doi.org/10.13106/jafeb.2020.vol7.no12.1215
  17. Ensmenger, N. (2012). The Digital Construction of Technology: Rethinking the History of Computers in Society. Technology and Culture, 53(4), 753-776. https://doi.org/10.1353/tech.2012.0126
  18. Eyal, L. (2012). Digital assessment literacy - the core role of the teacher in a digital environment. Educational Technology and Society, 15(2), 37-49.
  19. Fahruna, Y., & Fahmi, M. (2017). Validity and Reliability of UserBased Ideal Library Measurement Construct with LIBQUAL Approach. Jurnal Ekonomi Bisnis Dan Kewirausahaan, 6(2), 161. https://doi.org/10.26418/jebik.v6i2.22989
  20. Febliza, A., & Oktariani. (2020). Development of Digital Literacy Instruments for Students and Teachers. Jurnal Pendidikan Kimia Universitas Riau, 5(1), 1-9. https://doi.org/10.33578/jpk-unri.v5i1.7776
  21. Greene, J. A., Yu, S. B., & Copeland, D. Z. (2014). Measuring critical components of digital literacy and their relationships with learning. Computers and Education, 76, 55-69. https://doi.org/10.1016/j.compedu.2014.03.008
  22. Griffin, P., & Care, E. (2015). Assessment and Teaching of 21st Century Skills. In Springer Dordrecht Heidelberg. Springer. https://doi.org/10.1007/978-94-017-9395-7_15
  23. Hague, C., & Payton, S. (2010). Digital Literacy Across the Curriculum. Curriculum & Leadership Journal, 9(10).
  24. Harjono, H. S. (2019). Digital Literacy: Prospects and Implications in Language Learning. Pena : Jurnal Pendidikan Bahasa Dan Sastra, 8(1), 1-7. https://doi.org/10.22437/pena.v8i1.6706
  25. Hasibuan, N. S., & Ariyanti, T. D. (2019). Strengthening Literacy Through ICT for Students of the University of Muhammadiyah Tapanuli Selatan (UMTS). Prosiding Seminar Nasional PBSI II Tahun 2019, 111-117.
  26. Hawisher, G. E., Selfe, C. L., Guo, Y.-H., & Liu, L. (2006). Globalization and Agency: Designing and Redesigning the Literacies of Cyberspace. College English, 68(6), 619-636. https://doi.org/10.2307/25472179
  27. Jenkins, H., Chauhan, S., & Panda, N. K. (2015). Online Anonymity. Hacking Web Intelligence, 147-168. https://doi.org/10.1016/b978-0-12-801867-5.00008-2
  28. Jones, R. B. (2009). Visual Literacy. In Encyclopedia of the Social and Cultural Foundations of Education (pp. 851-853). SAGE Publication Ltd. https://doi.org/10.4135/9781412963992
  29. Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2006). Multivariate Data Analysis. Pearson Education International.
  30. Kharisma, H. V. (2017). Digital Literacy among High School Teachers in the City of Surabaya. Journal Universitas Airlangga, 6(4).
  31. Kurnianingsih, I., Rosini, & Ismayati, N. (2017). Efforts To Improve Digital Literacy Capabilities For School Libraries And Teachers In The Central Jakarta Area Through Information Literacy Training. Jurnal Pengabdian Kepada Masyarakat (Indonesian Journal of Community Engagement), 3(1), 61-76. https://doi.org/10.22146/jpkm.25370
  32. Latip, A. (2020). The Role of Information and Communication Technology in Distance Learning during the Covid-19 Pandemic. EduTeach: Jurnal Edukasi Dan Teknologi Pembelajaran, 1(2), 107-115. https://www.researchgate.net/profile/Abdul_Latip/publication/341868608_PERAN_LITERASI_TEKNOLOGI_INFORMASI_DAN_KOMUNIKASI_PADA_PEMBELAJARAN_JARAK_JAUH_DI_MASA_PANDEMI_COVID-19/links/5ed773c245851529452a71e9/PERAN-LITERASI-TEKNOLOGI-INFORMASI-DANKOMUNIKASI
  33. Lee, S.-H. (2014). Digital Literacy Education for the development of digital literacy. International Journal of Digital Literacy and Digital Competence, 5(3). https://doi.org/10.4018/ijdldc.2014070103
  34. Mallery, P., & George, D. (2016). IBM SPSS Statistics 23 Step by Step: A Simple Guide and Reference. Routledge.
  35. Monsura, M. P. (2020). The Importance of Financial Literacy: Household's Income Mobility Measurement and Decomposition Approach. The Journal of Asian Finance, Economics and Business, 7(12), 647-655. https://doi.org/10.13106/jafeb.2020.vol7.no12.647
  36. NCREL & Metiri Group. (2003). enGauge 21st Century Skills: Literacy in the Digital Age. NCREL & Metiri Group.
  37. Nunnally, J. (1978). The Study of Change in Evaluation Research: Principal Concerning Measurement, Experimental Design and Analysis. Sage Publication.
  38. Nurjanah, E., Rusmana, A., & Yant, A. (2017). The Relationship between Digital Literacy and the Quality of E-Resources Use. Lentera Pustaka: Jurnal Kajian Ilmu Perpustakaan, Informasi Dan Kearsipan, 3(2), 117. https://doi.org/10.14710/lenpust.v3i2.16737
  39. Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educational Psychologist, 41(2), 75-86. https://doi.org/10.1207/s15326985ep4102_1
  40. Pinilih, Wahyu, F., Budiharti, R., & Ekawati, E. Y. (2013). Development of Product Assessment Instruments in Science Learning for Junior High School Students. Jurnal Pendidikan Fisika, 1(2). http://www.jurnal.fkip.uns.ac.id/index.php/%0Apfisika/article/viewFile/2798/1914
  41. Porat, E., Blau, I., & Barak, A. (2018). Measuring digital literacies: Junior high-school students' perceived competencies versus actual performance. Computers and Education, 126, 23-36. https://doi.org/10.1016/j.compedu.2018.06.030
  42. Rahayu, T., Mayasari, T., & Huriawati, F. (2018). Development of Digital Literacy Ability Instruments in the Application of Hybrid Learning Media Based on Physics Websites. Seminar Nasional Pendidikan Fisika IV 2018, 1, 141-148.
  43. Roberts, P. (2000). Knowledge, Informationa and Literacy. Foresight, 46(5), 433-453.
  44. Santos, R., & Pedro, L. (2013). Digital Divide in Higher Education Students' Digital Literacy. In S. S. S. Kurbanoglu, E. Grassian, D. Mizrachi, R. Catts, S. Akca (Ed.), European Conference on Inform ation Literacy (ECIL) (Issue November, pp. 12-13). Hacettepe University.
  45. Sariwulan, T., Suparno, S., Disman, D., Ahman, E., & Suwatno, S. (2020). Entrepreneurial Performance: The Role of Literacy and Skills. The Journal of Asian Finance, Economics and Business, 7(11), 269-280. https://doi.org/10.13106/jafeb.2020.vol7.no11.269
  46. Setyawati, R. D., Happy, N., & Murtianto, Y. H. (2017). Instrumen Angket Self-Esteem Mahasiswa ditinjau Dari Validitas dan Reliabitas. Jurnal Phenomenon, 7(2), 174-186. https://doi.org/10.21580/phen.2017.7.2.1932
  47. Sholihah, K. (2016). Digital Literacy Analysis: A Study on the Use of Electronic Journals by Management Masters Students at the SWCU Salatiga Library. UIN Sunan Kalijaga.
  48. Shopova, T. (2014). Digital Literacy of Students and Its Improvement at The University. Journal on Efficiency and Responsibility in Education and Science, 7(2), 26-32. https://doi.org/10.7160/eriesj.2014.070201
  49. Sibii, R. (2009). Media Literacy. In Encyclopedia of Journalism (pp. 884-887). SAGE Publication Ltd. https://doi.org/10.4135/9781412972048
  50. Siddiq, F., Gochyyev, P., & Wilson, M. (2017). Learning in Digital Networks - ICT literacy: A novel assessment of students' 21st century skills. Computers and Education, 109, 11-37. https://doi.org/10.1016/j.compedu.2017.01.014
  51. Straub, D., & Gefen, D. (2004). Validation Guidelines for IS Positivist Research. Communications of the Association for Information Systems, 13(March). https://doi.org/10.17705/1cais.01324
  52. Sudarisman, S. (2015). Understanding the Nature and Characteristics of Biology Learning to Respond to the Challenges of the 21st Century and Optimizing the Implementation of 2013. Florea : Jurnal Biologi dan Pembelajarannya, 2(1), 29-35. https://doi.org/10.25273/florea.v2i1.403
  53. Suratno, A. (2016). Developing Assessment Instruments in Competence Practice Engine Student in SMK Automotive Engineering Program. ANOS Journal Of Mechanical Engineering Education, 11(1), 2528-2700.
  54. Susilo, A., & Sarkowi, S. (2018). The Role of 21st Century History Teachers in Facing the Challenges of Globalization. Historia: Jurnal Pendidik Dan Peneliti Sejarah, 2(1), 43. https://doi.org/10.17509/historia.v2i1.11206
  55. Trilling, B., & Fadel, C. (2009). 21st Century Skills: Learning for Life in Our Times. John Wiley & Sons.
  56. Vianney Mitana, J. M., Muwagga, A. M., & Ssempala, C. (2019). Assessment for Successful Intelligence: A Paradigm Shift in Classroom Practice. International Journal of Educational Research Review, 4(1), 106-115. https://doi.org/10.24331/ijere.490162
  57. Warschauer, M. (2007). The Paradoxical Future of Digital Learning. Learning Inquiry, 1(1), 41-49. https://doi.org/10.1007/s11519-007-0001-5
  58. Widyastuti, D. A. R., Nuswantoro, R., Sidhi, & Purnomo, T. A. (2016). Digital Literacy in Productive Business Women in Yogyakarta Special Region. Jurnal ASPIKOM, 3(1), 1. https://doi.org/10.24329/aspikom.v3i1.95
  59. Wiley, T. G. (2008). Literacy and Biliteracy. In Encyclopedia of Bilingual Education (pp. 530-534). SAGE Publication Ltd. https://doi.org/10.4135/9781412963985
  60. Yanti, M. (2016). Determinan Literasi Digital Mahasiswa: Kasus Universitas Sriwijaya [Determinants of Students Digital Literacy: the Case of Sriwijaya University]. Buletin Pos Dan Telekomunikasi, 14(2), 79. https://doi.org/10.17933/bpostel.2016.140202
  61. Yusup, F. (2018). Test the Validity and Reliability of Quantitative Research Instruments. Jurnal Tarbiyah : Jurnal Ilmiah Kependidikan, 7(1), 17-23. https://doi.org/10.18592/tarbiyah.v7i1.2100
  62. Zettel, J. (2001). Methodological Constraints, Critics, and Technology Acceptance: An Experiment. http://www.iese.fhg.de/ISERN/technical_reports/isem-01-04.pdf.
