Development of an instrument to assess independent online learning readiness of high school students in Indonesia

Tian Belawati (Faculty of Education, Universitas Terbuka, Tangerang Selatan, Indonesia)
Daryono Daryono (Faculty of Law, Social, and Political Sciences, Universitas Terbuka, Tangerang Selatan, Indonesia)
Sugilar Sugilar (Faculty of Education, Universitas Terbuka, Tangerang Selatan, Indonesia)
Udan Kusmawan (Faculty of Education, Universitas Terbuka, Tangerang Selatan, Indonesia)

Asian Association of Open Universities Journal

ISSN: 2414-6994

Article publication date: 14 February 2023

Issue publication date: 31 May 2023




Purpose

The paper reports a study intended to develop a self-assessment instrument to measure high school students' readiness for pursuing independent online learning.


Design/methodology/approach

The instrument was developed through the following steps: (1) developing the draft, (2) checking the instrument's face validity and (3) testing the instrument's validity, reliability and discriminant capacity using PLS analysis.


Findings

The study has developed a tool for high school students to self-assess their readiness for independent online learning. The instrument consists of 36 statement items and is statistically proven to have good reliability, construct and indicator validity and discriminating power.

Research limitations/implications

The instrument items were designed to fit the context of Indonesian high school students. However, only responses from high school students in relatively urban areas were used to test the validity and reliability of the instrument. This could imply that the instrument's accuracy has been established only for urban settings.

Practical implications

As a result of the research, a tool to assess high school students' readiness for independent online learning has been created. To better prepare students for independent online learning endeavors, schools can use the results to strengthen areas that need improvement.


Originality/value

The study succeeded in developing a contextualized self-assessment tool for measuring Indonesian students' independent online learning readiness.



Belawati, T., Daryono, D., Sugilar, S. and Kusmawan, U. (2023), "Development of an instrument to assess independent online learning readiness of high school students in Indonesia", Asian Association of Open Universities Journal, Vol. 18 No. 1, pp. 34-45.



Emerald Publishing Limited

Copyright © 2023, Tian Belawati, Daryono Daryono, Sugilar Sugilar and Udan Kusmawan


Published in the Asian Association of Open Universities Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) license. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this license may be seen at


Learning opportunities and the need for continuing education and training

Online learning has grown in popularity over the last decade and has reached mainstream status in some industrialized countries. The expansion of online education praxis intensified during the COVID-19 pandemic. Physical mobility restrictions and physical distancing recommendations prompted governments around the world to implement policies that resulted in the closure of schools, college campuses and other physical educational and learning facilities. The same is true in Indonesia.

The number of Universitas Terbuka (UT) students who used the “UT online” service (i.e. online tutorial services) increased by about 63% during the pandemic, and the number of online tutorial classes grew by 293% compared to before the pandemic (Belawati et al., 2020). The COVID-19 pandemic has accelerated the growth of online learning users and learning resource providers globally. According to Class Central data from 2021, 2,800 Massive Open Online Course (MOOC) titles were added following the pandemic, and the number of new MOOC participants increased by 30% in 2020. In addition, approximately 360 more micro-credentialing opportunities are being offered by various institutions. The international community's enormous interest in the availability of free educational resources is demonstrated by the 180 million people from around the world who registered and participated in MOOCs (Shah, 2020).

These data illustrate that, in the current digital age, there are ample free and open opportunities to study and enhance human resource competencies online. This clearly presents an opportunity for Indonesia, which is striving to raise the quality of its human resources as part of sustainable national development. It has been argued that the nation's higher education system, which still places significant emphasis on in-person instruction, can no longer progressively enhance the competency of its human resources.

On the other hand, the COVID-19 pandemic also accelerated the tremendous impact of the Industrial Revolution 4.0 on how different economic sectors performed, intensifying and promoting the development of online business models. Furthermore, the business and industrial landscapes have undergone significant change because of the Industrial Revolution 4.0. The automation and digitization of numerous production processes, as well as of product marketing, significantly alter the required expertise. Many competences and skills that were once the “capital/asset” for working and job hunting become irrelevant as various forms of work typically performed by humans are replaced by machines. The World Economic Forum predicted that by 2025, more than 50% of the global labor force would need to acquire new skills in order to succeed in their “new” professions.

For Indonesia, a substantial portion of its human resources needs to receive upskilling and reskilling to be productive. In total, 70.72% of Indonesia's 270,200,000 citizens are in the productive age group (15–64 years), which is a demographic advantage (BPS, 2021a). This means that adjustments must be carried out in a manner that is sustainable for more than 190 million people to continue to adapt and contribute to national development (BPS, 2021b). Furthermore, according to data on the gross participation rate in higher education (APK Dikti), only 30.85% of those between the ages of 19 and 23 who are eligible for admission to universities can afford to attend them. This shows that there are around 29 million high school graduates who are qualified for further study but are unable to enroll in any higher education institution due to enrollment limitations and a variety of personal reasons (that make it impossible to attend lectures full-time on campuses). Therefore, it appears that higher education and training for up- and re-skilling via distance and online systems are required.

Conventional in-person education and training programs for upskilling and reskilling human resources will not be able to keep up with how quickly things change in the workplace or in industry. If Indonesia intends to compete on an equal level with other nations throughout the world, it must take advantage of the potential provided by online education and training systems. The availability of various accessible and affordable online learning resources is certainly an opportunity to be utilized. At the same time, learning online means distance learning, which necessitates the capacity for independent study. To become capable of managing themselves as independent learners who can take advantage of learning opportunities and improve their abilities using a variety of available learning resources, Indonesian human resources must be prepared before they graduate from high school.

Based on the above background, providing the most appropriate strategy for preparing high school students to become independent learners in this digital era first requires mapping their current abilities. Knowledge of high school students' current capacity to become independent learners in online settings will be the basis for developing a competency improvement program integrated with the overall learning process in schools.

In accordance with the need to map high school students' readiness for independent online learning, this study was intended to develop a reliable and validated instrument to measure students' readiness for independent online learning within the context of Indonesia. Specifically, this article reports on (1) the construction of an instrument to assess high school students' readiness for independent online learning, and (2) on the validity and reliability tests of the instrument.

Independent learning readiness measurement

Online learning is a development of distance education, an approach that relies on students' independent learning. The term “independent learning” refers to a learning approach in which students take responsibility for and control over their learning. They lead, manage and evaluate their own learning and learn through their own actions. The ability to learn independently also includes the learner's ability to determine the learning goals and the learning strategies to use, as well as to seek help when facing problems in learning (Livingston, 2012). The ability or competence of independent learning is positively correlated with learners' success in completing their learning process (Alem et al., 2016; Blayone et al., 2018; Geng et al., 2019).

Competencies for independent learning can be taught and trained. Previous literature and research have produced many instruments to measure one's independent learning competence and readiness to participate in learning programs. Indicators of independent learning ability are often approached from a person's ability to become a self-directed learner. Knowles (1975) defines a self-directed learner as someone who can take initiative (with or without the help of others), know their learning needs and goals, identify the learning resources they need, choose and apply the most appropriate learning strategies and evaluate their own learning outcomes. This notion of self-directed learners by Knowles is similar to the general notion of independent learners. The concept of readiness for independent online learning, although similar to readiness for independent learning in general, is unique because learning is carried out online. Thus, students' readiness to learn online also needs to be viewed from the aspects of students' preference for and comfort with learning online compared to face-to-face, as well as students' confidence and skills in using internet-based electronic communication media to communicate and to learn (Hung et al., 2010).

Many instruments have been developed to measure self-directed learning competence, but according to Merriam and Baumgartner (2007), the most widely used is the self-directed learning readiness scale (SDLRS) developed by Lucy M. Guglielmino (in 1977 and revised in 1991). The SDLRS is a measurement scale in a questionnaire format that can be filled out independently. The SDLRS version for adults, or SDLRS-A, also known as the learning preference assessment (LPA), has 58 statement items using a Likert scale grouped into eight dimensions, namely: (1) self-concept as an effective learner, (2) openness to learning opportunities, (3) initiative and independence in learning, (4) a sense of responsibility for one's own learning, (5) love of learning, (6) creativity, (7) the ability to use basic skills for learning and problem solving and (8) a positive orientation toward the future. The grouping is based on an analysis of responses to the 58 items in the SDLRS instrument (deBruin et al., 2001). In other words, the SDLRS instrument is designed to identify a person's perception of skills and attitudes related to self-directed learning (Darmayanti, 2008). The SDLRS questionnaire was adapted and modified to the Indonesian context and translated into Bahasa Indonesia by Darmayanti in 1993. The adaptation reduced the number of statements from 58 items in the original SDLRS to 32 items in four dimensions, namely (1) learning needs, (2) self-regulation, (3) self-autonomy and (4) learner control over learning.

In line with these developments, self-directed learning readiness instruments for online learning contexts have also been developed and reported in several studies. One of them was developed by Hung et al. (2010), who built an online learning readiness scale (OLRS) based on an instrument previously developed by McVay, which in essence looks at readiness for online learning from the dimensions of (1) self-directed learning, (2) learner control, (3) motivation for learning, (4) computer/Internet self-efficacy and (5) online communication self-efficacy (in Hung et al., 2010). Another example is the instrument developed by Pennsylvania State University (PSU), which uses five aspects that influence readiness for self-directed online study, namely: (1) self-direction, (2) learning preferences, (3) study habits, (4) learning technological skills and (5) the ability to use computers. The last two competencies of the Hung and PSU instruments relate to competence in the use of technology, also known as digital skills/competencies/literacy. Specifically in the context of Indonesia, Kusmawan (2020) developed a self-assessment questionnaire to assess UT students' readiness for online learning, named Digital and Online Competencies (DOC), which emphasizes the assessment of students' digital and technology access and competencies.

Based on the various definitions of competencies and instruments for measuring digital and online learning, there are three main competency aspects required for online learning: (1) readiness to become an independent learner, (2) motivation and love for learning and (3) ability and comfort in using technology. Furthermore, referring to Guglielmino's SDLRS dimensions, the competency dimensions of independent learning readiness, learning styles and learning habits are further broken down into self-concept, self-concept of independent learning skills and learning motivation. These aspects have the potential to reveal the competencies of online learners that institutions can cultivate to facilitate distance learning or independent online learning. Substantively, the above aspects can be summarized into six dimensions of self-directed or independent online learning readiness as presented in Table 1.

The six dimensions of independent online learning readiness in Table 1 were used as a framework for developing the instrument in this study.


Based on the construct dimensions shown in Table 1, the draft instrument was created. The statement indicators were created by combining Darmayanti's adaptation of Guglielmino's SDLRS, Hung's OLRS and Kusmawan's DOC with a number of new, pertinent statements that the study team considered necessary. The draft instrument was then examined for face validity through focus group discussion (FGD) sessions with 30 invited teachers in three different cities, to determine its suitability to the conditions of the population in this study; this relied on nonscientific expert judgment by people who grasp the characteristics of the target respondents. The draft instrument was revised in light of the comments received from those FGDs.

The improved instrument was then subjected to a series of partial least squares (PLS) analysis tests to determine its validity and reliability. The validity and reliability were specifically examined using the following tests.

  1. Construct reliability test: To assess the reliability of the latent variable constructs (the dimension constructs and the final variable construct). Construct reliability was examined through Cronbach's alpha and composite reliability, and coefficient values must be greater than 0.70 to be considered reliable.

  2. Discriminant validity test: To determine how distinct a latent construct actually is from the other constructs. A construct with a high discriminant validity rating is distinct and capable of explaining the measured phenomenon. If the square root of the average variance extracted (AVE) is higher than the correlations between the latent variables, the construct is considered distinct. In addition, the AVE itself must be at least 0.50.

  3. Convergent validity test: To evaluate indicator validity using the outer loading value of each statement item on its corresponding dimension. The outer loading value shows the degree of influence/correlation of each statement item. An indicator is said to have strong validity when its outer loading value is greater than 0.70.
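The three thresholds above can be sketched numerically. The functions below compute Cronbach's alpha from a raw response matrix, and composite reliability and AVE from standardized outer loadings; the loading values used are hypothetical and purely illustrative, not coefficients from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)           # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """Composite reliability from the standardized outer loadings of one construct."""
    error_vars = 1 - loadings ** 2                  # indicator error variances
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + error_vars.sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE: the mean of the squared standardized loadings."""
    return float((loadings ** 2).mean())

# Hypothetical outer loadings for one 4-item dimension (illustrative values only)
loadings = np.array([0.78, 0.81, 0.74, 0.85])
print(composite_reliability(loadings))        # ~0.87, above the 0.70 threshold
print(average_variance_extracted(loadings))   # ~0.63, above the 0.50 threshold
```

In practice these quantities would come from the PLS software's measurement-model output rather than being computed by hand; the sketch only makes the threshold checks concrete.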


Instrument construction

Development of instrument statement items

The first stage in constructing the instrument was to compile all statement items from the previously referenced instruments. In the interest of cohesiveness, the statements were reviewed, and where the substance of more than one statement was the same, the items from the different instruments were combined; for consistency, several statements were rephrased so that all items would have the same language style. All statement items were then mapped onto the dimensions. Several new statement items that were considered important to complete the dimensions, according to the researchers' expert judgment, were also developed and added to the list. As a result, the final draft instrument consisted of 86 items: 19 items for the dimension self-concept, 22 items for the dimension self-concept of independent learning skills, 9 items for the dimension motivation to learn, 8 items for the dimension access to technology, 11 items for the use of technology in daily activities and 17 items for digital skills/literacy for online learning.

Face validity of the instruments

The second step was refining the instrument's statements by conducting a face validity test through FGDs. The teachers invited to the FGDs came from six schools in three different cities representing west, central and east Indonesia. The schools included both schools in outlying districts and schools in the city center. The teachers were asked to evaluate the instrument's statements for readability, appropriateness of the dimensions used to measure readiness and suitability of each statement item. All teachers thought that all items and dimensions were suitable and appropriate. However, they advised editing several statement items to make them easier for high school students to understand. After assessing these recommendations, the research team developed the final statements to be included in the instrument. The revised instrument, considered Instrument Version 1, was then distributed to the students of the teachers invited to the FGDs. The instrument was fully completed by 334 students, allowing the instrument's reliability and validity to be examined on the data.

Instrument validity and reliability test

As presented in Table 2, the results of the analysis showed that the values of Cronbach's alpha coefficients were all >0.7 and even >0.8. This showed that the overall instrument's construct was quite reliable in explaining the variable “Independent Online Learning Readiness”. Likewise, the composite reliability values, all above 0.9, indicate that there was no problem in measuring the variable “Readiness for Independent Online Learning” through the developed instrument. However, looking at the average variance extracted (AVE) values, only the indicators in the dimensions “Motivation to Learn” and “Access to Technology” had AVE values >0.5. This means that indicators in the other dimensions did not have discriminating power in measuring “Independent Online Learning Readiness”. In other words, this instrument did not yet have good discriminant validity.

To reconfirm the non-discriminating values of some of the dimensions, the discriminant validity test was also carried out with a Fornell–Larcker criterion analysis. This analysis shows whether the statements “belong” to a certain dimension, indicated by the correlation coefficient within that dimension being the highest compared to the dimension's correlations with other dimensions. Table 3 shows that the correlations between the dimension “Self-Concept” and the dimensions “Self-Concept of Independent Learning Skills” and “Motivation to Learn” were greater than the correlation among the indicators within the dimension “Self-Concept” itself. Likewise, the correlation of the dimension “Motivation to Learn” with the dimension “Self-Concept of Independent Learning Skills” was greater than the correlation among the indicators within its own dimension, and the correlation of the dimension “Use of Technology for Daily Activities” with the dimension “Digital Skills/Literacy for Online Learning” was greater than the correlation among the indicators within its own dimension. The results of this analysis confirmed that the instrument did not yet have good discriminant validity.
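The Fornell–Larcker check described above reduces to a simple comparison: a construct passes when the square root of its AVE exceeds its correlation with every other construct. The sketch below flags failing pairs; all coefficient values are hypothetical placeholders, not the study's actual Table 3 figures.

```python
# Hypothetical square roots of AVE and inter-construct correlations
# (illustrative values only, not the study's actual coefficients)
sqrt_ave = {"Self-Concept": 0.68, "Motivation to Learn": 0.76, "Access to Technology": 0.81}
correlations = {
    ("Self-Concept", "Motivation to Learn"): 0.79,
    ("Self-Concept", "Access to Technology"): 0.41,
    ("Motivation to Learn", "Access to Technology"): 0.38,
}

def fornell_larcker_violations(sqrt_ave, correlations):
    """Return construct pairs whose correlation exceeds either construct's
    sqrt(AVE), i.e. pairs that fail the Fornell-Larcker criterion."""
    return [pair for pair, r in correlations.items()
            if r > min(sqrt_ave[pair[0]], sqrt_ave[pair[1]])]

print(fornell_larcker_violations(sqrt_ave, correlations))
# Here the Self-Concept/Motivation pair fails: 0.79 > sqrt(AVE) of Self-Concept (0.68)
```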

To further investigate the problem, a convergent validity test was carried out. Table 4 presents the outer loading value of each statement item on the corresponding dimension as a result of the test. Based on the outer loading values, which show the indicator validity coefficients, there were 46 statement items with values of 0.7 and below, which indicated that these 46 items were not valid indicators for their own dimensions. These items were likely the cause of the instrument's poor discriminant validity. Based on the analysis results, the indicators whose outer loading values were below 0.7 are as follows:

  1. Self-concept dimensions: 1.3, 1.6, 1.7, 1.11, 1.12, 1.14, 1.15, 1.16, 1.19 = 9 items

  2. Dimensions of self-concept of independent learning skills: 2.1, 2.6, 2.7, 2.8, 2.9, 2.11, 2.12, 2.13, 2.14, 2.15, 2.17, 2.18, 2.19, 2.20, 2.22 = 15 items

  3. Dimensions of motivation to learn: 3.3, 3.6, 3.8 = 3 items

  4. Dimensions of access to technology: 4.7 = 1 item

  5. Dimensions of the use of technology in daily activities: 5.1, 5.2, 5.3, 5.5, 5.7, 5.10, 5.11 = 7 items

  6. Dimensions of digital skills/literacy for online learning: 6.1, 6.2, 6.3, 6.5, 6.9, 6.11, 6.12, 6.14, 6.15, 6.16, 6.17 = 11 items

The solution to this problem was to remove those statement items from the instrument, leaving a second version with only 40 items (Instrument Version 2). After the items were removed, Instrument Version 2 was re-analyzed, and the results showed that the outer loading value of item 4.8, which was previously >0.7, dropped below 0.7 (0.687); item 4.8 was therefore also removed, resulting in Instrument Version 3 with 39 items. To further examine whether all indicators were valid, Instrument Version 3 was then re-analyzed once again using the convergent validity test. The results showed that the outer loading values of all indicators were already >0.7. Based on these results, a validity and reliability analysis was carried out once again to see whether Instrument Version 3 became more valid and reliable after removing the invalid indicator items. The results of the analysis are presented in Table 4.
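The iterative pruning procedure described above (drop weak indicators, re-estimate, repeat until all loadings clear the threshold) can be sketched as follows. The `toy_fit` function and its loading values are hypothetical stand-ins for re-running the PLS model; they merely mimic the situation in which item 4.8 falls below 0.7 once item 4.7 has been removed.

```python
def prune_items(fit_loadings, items, threshold=0.7):
    """Iteratively drop indicators whose outer loading is at or below the
    threshold, re-estimating loadings after every removal round."""
    while True:
        loadings = fit_loadings(items)
        weak = [i for i in items if loadings[i] <= threshold]
        if not weak:
            return items
        items = [i for i in items if i not in weak]

def toy_fit(items):
    """Hypothetical loadings: once 4.7 is removed, 4.8's loading falls to 0.687."""
    base = {"4.1": 0.81, "4.2": 0.79, "4.7": 0.62, "4.8": 0.72}
    if "4.7" not in items:
        base["4.8"] = 0.687
    return {i: base[i] for i in items}

print(prune_items(toy_fit, ["4.1", "4.2", "4.7", "4.8"]))  # ['4.1', '4.2']
```

The loop stops only when a full re-estimation yields no loading at or below 0.7, which mirrors the Version 2 → Version 3 step in the text.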

As seen in Table 4, all coefficient values of both Cronbach's alpha and composite reliability remained >0.7, which means that Instrument Version 3 with 39 statement items had good validity and reliability. In addition, the AVE values were all >0.5, which indicated that the overall instrument also had good discriminant validity. To confirm this, a test was re-conducted with the Fornell–Larcker criterion. The results, as can be seen in Table 5, still showed that the correlation among indicators within the dimension “Self-Concept” (0.754) was smaller than its correlation with the indicators in the dimension “Motivation to Learn” (0.793). This indicated that there was still a discriminant validity problem in the indicators of the dimension “Self-Concept”.

To further explore which indicators/statement items were causing the problem, a correlation analysis was carried out between the indicators in the dimension “Self-Concept” and the dimension “Motivation to Learn”. The results (Table 6) show that the correlations of students' responses to the “Motivation to Learn” statement items 3.2 and 3.4 with the “Self-Concept” item 1.17 were the same, at 0.60. This identical correlation coefficient indicated that the statements in the three items were interpreted as the same by the respondent students and were thus indistinguishable (they had no discriminating power).

Based on this, a reanalysis was carried out by omitting one or more of the three items to obtain the best discriminant validity of the instrument. The analysis of the alternative removal of one or a combination of the three indicators (items 1.17, 3.2 and 3.4) showed that only when all three items were omitted did all correlations among indicators within each dimension have coefficients greater than their correlations with the indicators of other dimensions. In other words, the instrument with 36 items (Instrument Version 4) already had good discriminant validity, as indicated by the outer loading values of all indicators being >0.7.

To see the validity and reliability of Instrument Version 4, another round of construct reliability, discriminant validity and model unidimensionality tests was conducted. Table 7 shows that all the coefficient values of Cronbach's alpha and composite reliability remained >0.7 as before, which means that the instrument with 36 statement items had good validity and reliability. In addition, all AVE values were also still >0.5, which indicated that the overall instrument also had good discriminant validity.

Once again, to confirm the above results, a test was re-conducted using the Fornell–Larcker criterion. The results in Table 8 show that all correlation values among indicators within their respective dimensions were greater than their correlations with other dimensions. This showed that the final instrument indeed had good discriminant validity.

Based on the analysis, it is concluded that the instrument's fourth iteration (Version 4), which contains 36 statement items (shown in Table 9), is statistically valid and reliable for assessing respondents' readiness for independent online learning. The final instrument is named the independent online learning readiness scale (IOLRS).

Nevertheless, it is worth mentioning that even though the final IOLRS has been shown to be statistically valid and reliable in assessing high school students' readiness for independent online learning, careful consideration must be given to its application across the country. This is because only students' responses from three cities in relatively urban areas were used to generate the data for the validity and reliability testing. The statement items, however, are formed so that responses range from “completely disagree” to “completely agree” on a scale of 1–4. Depending on their situation, it is possible that students from more rural areas will respond differently from students in more urban areas, such as the sample in this study. Even so, because the instrument is a self-report scale, its results may still be considered accurate, although its validity has so far been established only with students in more urban areas.


This study has successfully created a contextualized self-assessment tool to assess high school students' readiness for independent online learning. Based on data from 334 high school students in Class X and Class XII, the instrument (the IOLRS) has been statistically proven to have good reliability, construct and indicator validity and discriminating power. The IOLRS consists of 36 statement items in six dimensions: (1) self-concept, (2) self-concept of independent learning skills, (3) motivation to learn, (4) access to technology, (5) the use of technology for daily activities and (6) digital skills/literacy for online learning. Furthermore, because the instrument was designed as a scale, it may be assumed that the results of assessing students in more rural areas are still accurate, even though the validity of the instrument was only tested using students in more urban areas.

Having this instrument is significant for Indonesia since it allows researchers to map how prepared high school students are to pursue independent online learning. The data on this readiness can be used to formulate a training program that can be integrated into or even added to the high school curriculum. Preparing the next generation for the digital learning era during their high school years is the key to providing quality life-long learning opportunities for them and for Indonesia as a whole.

Dimensions of readiness for independent online learning

No. | Dimension | Measurement objectives
1 | Self-concept (A) | Reveals the learner's understanding of his or her own tendencies regarding new ideas, optimism, learning, etc.
2 | Self-concept of independent learning skills (B) | Uncovers learners' perceptions of their skills to learn independently
3 | Motivation to learn (C) | Reveals learners' motivation to carry out learning activities
4 | Access to technology (D) | Uncovers learners' access to technology
5 | The use of technology for daily activities (E) | Captures the intensity and comfort of learners in using technology for daily activities
6 | Digital skills/literacy for online learning (F) | Uncovers learners' digital skills for online learning purposes

Results of construct reliability test and discriminant validity of instrument version 1

Dimension | Cronbach's alpha | Composite reliability | Average variance extracted (AVE)

Note(s): a Self-concept

b Self-concept of Independent Learning Skills

c Motivation to Learn

d Access to Technology

e The Use of Technology for Daily Activities

f Digital Skills/Literacy for Online Learning

Results of discriminant validity test with Fornell–Larcker criterion


Results of construct reliability test, discriminant validity test, and model unidimensionality of instrument version 3

Dimension | Cronbach's alpha | Composite reliability | Average variance extracted (AVE)

Results of discriminant test with Fornell–Larcker criterion of instrument version 3


Correlations among indicators in the dimensions of “Self-Concept” and “Motivation to Learn”


Results of construct reliability, discriminant validity, and model unidimensionality of instrument version 4

Dimensions | Cronbach's alpha | Composite reliability | Average variance extracted (AVE)

Results of discriminant validity test using the Fornell–Larcker criterion of instrument version 4


The final instrument of independent online learning readiness scale

No | Statement in final instrument | Old numbering reference
A. Self-Concept
1 | 1.1. I am ready to accept new ideas | 1.1
2 | 1.2. I learn from mistakes to get better | 1.2
3 | 1.3. I always see a job through to the end | 1.4
4 | 1.4. I do not give up easily in the face of adversity | 1.5
5 | 1.5. I love to find answers/solutions to every problem | 1.8
6 | 1.6. I love discussing new ideas | 1.9
7 | 1.7. The more I learn, the more interesting the world becomes | 1.10
8 | 1.8. Learning new things has brought about change in my life | 1.13
9 | 1.9. I have a strong desire to realize whatever I think of | 1.18
B. Self-Concept of Online Learning Skills
10 | 2.1. I can define my own learning goals | 2.2
11 | 2.2. I can set a deadline for my study | 2.3
12 | 2.3. I can execute the study plan I make | 2.4
13 | 2.4. I know how to seek help when dealing with learning problems | 2.5
14 | 2.5. I take responsibility for what I learn | 2.10
15 | 2.6. I know what I want to learn | 2.16
16 | 2.7. I know how to learn something | 2.21
C. Motivation to Learn
17 | 3.1. I am always motivated to learn | 3.1
18 | 3.2. I can find different ways to learn something new | 3.5
19 | 3.3. I can learn effectively, alone or in groups | 3.7
20 | 3.4. I, not anyone else, am responsible for my learning success | 3.9
D. Access to Technology
21 | 4.1. I have reliable personal devices (laptop, tablet, PC, smartphone) | 4.1
22 | 4.2. I have Internet access with fairly good speeds | 4.2
23 | 4.3. I have high-speed private Wi-Fi where I live | 4.3
24 | 4.4. My personal devices (laptop, tablet, PC, smartphone) have anti-virus software that is regularly updated | 4.4
25 | 4.5. My personal devices (laptop, tablet, PC, smartphone) can run video conferencing (such as Skype, Zoom, etc.) just fine | 4.5
26 | 4.6. My personal devices (laptop, tablet, PC, smartphone) can be used to access various information in multimedia formats (video and audio) | 4.6
E. Use of Technology for Daily Activities
27 | 5.1. Every day I use online communication media (email, social media) | 5.4
28 | 5.2. I am active and comfortable using the Internet to search for and share information with others | 5.6
29 | 5.3. I have personal accounts and am active on various social media | 5.8
30 | 5.4. I actively follow groups on social media such as Facebook, Instagram, etc. | 5.9
F. Digital Skills/Literacy for Online Learning
31 | 6.1. I confidently use online tools (email, discussion forums) to communicate with others | 6.4
32 | 6.2. I am confident in using the basic functions of Microsoft programs such as MS Word, MS Excel and MS PowerPoint | 6.6
33 | 6.3. I am confident in my abilities and skills in using various online learning software | 6.7
34 | 6.4. I am confident in using the Internet to search for and gather information in online learning | 6.8
35 | 6.5. My digital and online skills are already good | 6.10
36 | 6.6. I am comfortable surfing the Internet | 6.13
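The final scale groups its 36 items into the six dimensions listed above (9, 7, 4, 6, 4 and 6 items respectively). One straightforward way to score a completed self-assessment is to average responses within each dimension; the 1–5 Likert response format below is an assumption for illustration, as the response scale is not restated here.

```python
# Items per dimension, taken from the final instrument table above
DIMENSIONS = [
    ("A. Self-concept", 9),
    ("B. Self-concept of online learning skills", 7),
    ("C. Motivation to learn", 4),
    ("D. Access to technology", 6),
    ("E. Use of technology for daily activities", 4),
    ("F. Digital skills/literacy for online learning", 6),
]

def dimension_scores(responses):
    """Average a student's 36 responses (assumed 1-5 Likert) per dimension."""
    assert len(responses) == 36, "expected one response per item"
    scores, start = {}, 0
    for name, n_items in DIMENSIONS:
        block = responses[start:start + n_items]
        scores[name] = sum(block) / n_items
        start += n_items
    return scores

# A student answering 4 ("agree") to every item scores 4.0 on each dimension
print(dimension_scores([4] * 36))
```

Low dimension averages would point a school to the specific readiness areas needing support before students undertake independent online learning.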




This research was funded by Universitas Terbuka.

Corresponding author

Tian Belawati can be contacted at:
