This report explores enriched notions and dimensions of quality massive open online courses (QMOOCs). The purpose of this paper is to visualize the quality measures associated with MOOCs and to understand the distinctive outlooks from which they can be approached. It was also of interest to envisage how, and in what ways, those notions and dimensions interrelate.
An exploratory design was employed, first establishing the conceptual and operational frameworks qualitatively through review processes and focus-group discussions. QMOOCs were reflected by four dimensions: scientifically provable, technically feasible, economically beneficial and socio-culturally adaptable. In addition, QMOOCs involved six notions (6P: presage, process, product, practicability, prospective and power) and affected knowledge, skills and professionalism (KSP). Quantitatively, QMOOCs, 6P and KSP were the moderating, independent and dependent variables, respectively. The associated data were accumulated through a survey, distributing 600 questionnaires randomly to the 708 Universitas Terbuka faculty members; 299 were completed.
Nine hypotheses were scrutinized utilizing a structural-equation model, and eight were validated by the analysis. It was statistically inferred that product was the prime notion underpinning QMOOCs, followed by process, practicability, presage and power; prospective was excluded. Professionalism, knowledge and skill were influenced by QMOOCs. Importance-performance analysis (IPA) and the customer-satisfaction index were adapted and applied to quantify respondents' opinions and the relevance degree of the notions and dimensions engaged. The IPA revealed four prominent notions (corresponding, functional, well-defined and learner-focused) and one dimension (technically feasible).
The qualitative framework was only partially confirmed by the quantitative upshot. Further inquiry is needed to search for a plausible explanation of why this outcome was marginally distinctive, in conjunction with authenticating QMOOCs.
Emerald Publishing Limited
Copyright © 2018, Maximus Gorky Sembiring
Published in the Asian Association of Open Universities Journal. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Quality is comparatively not objective; it is commonly a measure for a specific purpose. Any discussion of quality, primarily in education, is therefore challenging since quality is not a constant construct. The quality issue in education becomes still more complex when it comes to defining the quality of MOOCs. In this context, the quality measure is even more relative. No absolute threshold benchmark can positively be set, and there is no definitive list of specific or fixed criteria against which MOOCs can be precisely measured (Hood and Littlejohn, 2016). Although MOOCs, as summarized by Gemage et al. (2015), share a common nature in the form of short videos, quizzes, peer-based and/or self-assignments and online forums, there remain pedagogical differences between courses, even on the same platform. Thus, any discussion of quality here must dynamically take into account the diversity amongst MOOCs and the various frames of reference of stakeholders, which often differ from one another.
In conventional formal education, there has tended to be consensus among the parties involved as to the intended purpose of a specific course or program. Nonetheless, there often remains debate over the definitions and measures of quality as part of the evaluation approach in higher education (Tran, 2015). MOOCs potentially disrupt many of the existing conventions and assumptions of formal education, in both offline and online modes of delivery. Their unique features challenge the parameters of learning and even of education, raising new questions about their purposes and the roles they can play from a lifelong learning perspective. In consequence, MOOC quality can be viewed and measured in various different ways. The quality measures employed, and the nature of the data accumulated in each case, act to privilege a specific facet of MOOCs, such as instructional design issues, quality of media and/or learner performance. Establishing a robust understanding of quality in MOOCs should consider the activities and components that make up the learner experience; this is a crucial consideration in measuring the range of notions and dimensions of quality (Sumner, 2000).
This inquiry in essence adopts a relativist approach, emphasizing the importance of context. It implies that the identification of variables (notions and dimensions) and their structure, unified into the conceptual and operational frameworks following Biggs (1996), does not necessarily provide a concrete answer to the conundrum of assessing quality. At this stage, we are not establishing a conclusive set of measures that can be employed to measure quality absolutely, even though some of the measures emerged from both the literature and expert judgement. It is expected that the ideas introduced here might help related stakeholders to think critically about quality in the frame of MOOCs. It is also anticipated that they can highlight areas of research that might propose new instruments and different ways of approaching and measuring the quality of MOOCs. Most importantly, the critical success factors (notions and dimensions) affecting the quality of MOOCs need to be investigated by and from real participants of MOOCs to make the results more appropriate (Gemage et al., 2015). This is to ensure that the influences of interactivity, collaborativeness, pedagogy and technology are considered from user perspectives, so that the MOOCs are really effective.
MOOCs in the Universitas Terbuka ambiance were institutionally introduced and made accessible to everyone only a few years ago. As of 2016, for instance, seven programs were officially offered by the university: Marketing Management (Faculty of Economics); Public Speaking; ASEAN Studies (Faculty of Law, Social and Political Science); Assorted Food Processing; Distance Education; Introduction to Moodle (Faculty of Mathematics and Natural Sciences); and Parenting (Faculty of Education and Teacher Training). The number of hits on those programs was promising, up to 435,706. The data showed, however, that only 1,673 and 1,308 learners participated (registered) and only 182 and 137 completed (graduated) in the first and second semesters of 2016, respectively, while the student body was approximately 297,000 (Universitas Terbuka, 2017). This implies that the participation rate was extremely low considering that the total population of Indonesia is more than 250m, most of whom now objectively need continuous professional development through a self-directed learning mode (MOOCs). What is more, Universitas Terbuka is one of the few institutions in the country offering MOOCs without any restriction. The participation rate and demand for MOOCs in the Indonesian context (through the Universitas Terbuka tradition) should presumably be high.
Referring to those factual numbers, there are two prime possibilities as to why the participation rate and demand were considered so low. Externally, MOOCs may not be well received by society. Internally, the programs offered might not be in harmony with demand (not market driven) on the one hand, and/or the quality of the available programs might not yet meet the standard or expectation held by learners as users on the other. This study therefore focuses on the latter case, the quality issues. The programs might lack quality owing to the notions of presage, process and/or product. All the same, the university strongly insists on providing MOOCs to everyone nationally, regionally and even globally, since this is one of the main missions of the university: the dissemination of knowledge. Searching methodically for the reasons why the programs are not well received thus becomes important. This is supported by the fact that there are still questions on the pedagogical aspect, reflected in dropout rates related to learning activities, learning resources and organizational aspects (Jansen et al., 2017). In dealing with these essential issues, two fundamental factors should be taken into account cautiously: requirements for users and recommendations for designers (Stracke, 2017).
The purpose of this exploratory inquiry is therefore to identify quality measures of MOOCs and to highlight some of the tensions surrounding notions and dimensions of quality in more detail. It is also of interest to envisage the need for new ways of thinking on and approaching agreeable quality massive open online courses (QMOOCs). The inquiry draws upon the literature on both MOOCs and quality in education more generally. It is expected to provide a practical framework for analytically considering program quality across different variables (notions and dimensions); these concerns must be considered when conceptualizing plausible quality conundrums in MOOCs.
The conceptual framework
Conceptually, the exploratory framework of the study starts with general perspectives on MOOCs in the Universitas Terbuka context in Indonesia. This is the basis for the university to provide broader opportunities, in line with its tagline of making higher education open to all, associated with improving the knowledge, skill and professionalism of every Indonesian citizen worldwide as upheld by the faculty members of the university (Figure 1).
The conceptual or exploratory framework (Figure 1) is then utilized as a tool for weighing up QMOOCs and their inferences, as seen from the perspective of Universitas Terbuka, the only university in the country operating a single mode of delivery through open distance learning (ODL). This would let the university modify important operational aspects to accommodate learners' needs. It might focus institutional directions on fulfilling learners' needs and expectations, so that the university is able to maintain and make progress on the size and growth of QMOOCs as projected and officially stated in the university's formal documents. In other words, this is how the university searches for a proper and adequate orientation to maintain its main role and function in widening access to quality education provision (Universitas Terbuka, 2014).
Before introducing an operational framework for this study, it is worth noting that quality MOOCs were determined by six notions, here called the 6P model. The 6P is an extension of the 3P model (presage, process and product) introduced by Biggs (1993) and later elaborated further by Hood and Littlejohn (2016). In addition, as highlighted by Gemage et al. (2015), this conceptual framework addresses the essential issues of understanding what drives motivation and interest, as well as changing the culture from have to learn into want to learn, so that the pedagogic aspect remains as strong as in face-to-face interaction in the traditional teaching and learning milieu.
In Universitas Terbuka, especially for this study, quality measures for MOOCs were determined by six main factors, called the 6P model (an enrichment of the 3P model). Each variable is elaborated into notions and/or dimensions along with their attributes related to QMOOCs. In addition, QMOOCs lead to knowledge, skill and professionalism. For ease of naming, the variables with their related notions, dimensions and indicators are arranged in Table I.
Conceptual definitions: QMOOCs were defined as a manifestation of presage, process, product, practicability, prospective and power, while at the same time influencing knowledge, skill and professionalism. Presage referred to the resources and factors related to the teaching and learning process, including learners, instructors, institution and platform. Process referred to the course of actions associated with the presage variable, including instructional design issues, pedagogical approach and various learning resource supports. Product was the outputs and/or outcomes of the total educational process. Practicability referred to the ease with which learners can access and use the products in terms of their operation and continuation. Prospective was defined as learners' perceptions of the innovation and connectedness of the program in relation to current circumstances. Power was expressed as an influential force nurturing learners as users to utilize the programs and take advantage of them (Downes, 2013; Lin et al., 2015; Littlejohn et al., 2016; Margaryan et al., 2015; Hood and Littlejohn, 2016).
Operational definitions: In terms of their dimensions, QMOOCs should be scientifically provable, technically feasible, economically beneficial and socio-culturally adaptable (Sembiring, 2008). Additionally, factors close to the platform, well-defined, methodical and natural outlooks were specified as the notions of presage. Factors concerning pedagogy, inclusive, systematic and functional outlooks were specified as the notions of process. Factors resembling learner-focused, well-presented, appealing and superior outlooks were specified as the notions of product. Factors reminiscent of innovative, advantageous, affable and manageable outlooks were specified as the notions of practicability. Factors indicative of novelty, corresponding, insightful and universal outlooks were specified as the notions of prospective. Factors symptomatic of encouraging, inspiring, satisfying and maintainable outlooks were specified as the notions of power.
In addition, features in the sense of conceptual and operational outlooks were identified as the notions of knowledge. Features in the sense of hard-skill and soft-skill outlooks were identified as the notions of skill. Features in terms of being creative and having grit were identified as the notions of professionalism.
Design, operational framework and hypotheses
This study utilizes mixed methods, i.e. an exploratory design (Creswell and Clark, 2011). It is conducted under a qualitative approach first, followed by a quantitative sequence. Two instruments were established: a list of unified questions for the processes of review, interview and/or focus-group discussion (FGD) series (for the qualitative purpose), and questionnaires for accumulating the related data (for the quantitative purpose). The qualitative approach was entailed to construct the conceptual framework and eventually led to proposing the hypothesis arrangement after constructing the operational framework. To accomplish this stage, queries were first developed from the related literature reviews (for the subsequent FGD sessions) and finally utilized as the basis for establishing the hypotheses to be examined statistically. With respect to the related literature study conducted, a further qualitative approach was used to form the basic elements of the conceptual framework by conducting FGDs and interviews with three assigned experts (designers) from the university, as well as five selected active learners.
Designers and learners, as resource persons for this research, were interviewed around five main questions to attain deeper insight into the context: What would be the predictive notions and dimensions of accepted MOOCs, with QMOOCs as a moderating (M) variable? What would be the potential indicators associated with each notion and dimension engaged? How would QMOOCs influence learners' ability (performance) in terms of knowledge, skills and professionalism, and how are these interrelated? How would designers rate the MOOCs provided by the university? And how would learners rate the MOOCs delivered by the university and directly experienced by them?
Having completed all the qualitative series and amalgamated the results derived from the reviews and FGD sessions, they were systematically summarized, as can be seen in Table I. The table is utilized as the basis for developing the required and relevant instruments as well as for establishing the initial operational framework, as illustrated in Figure 2.
The variables engaged are explored through a questionnaire inspired by Tjiptono and Chandra (2011) and Shahzavar and Tan (2011). Questions in the quantitative phase of the questionnaires (referring to I11–I64 and M1–M4) were answered twice by respondents: the first and second answers measure the respondents' opinion level and importance degree, respectively, regarding QMOOCs. The rest (referring to D11–D32) were answered once, to view the impact of QMOOCs on learner knowledge, skill and professionalism, plus an extra closing question on the future of MOOCs with good quality measures in the Universitas Terbuka context. A survey was implemented to accumulate the required data from respondents (Fowler, 2014). Purposive (qualitative) and simple random (quantitative) sampling techniques were chosen to select eligible respondents (Cochran, 1977). For the qualitative series, three designers and five learners were involved. For the quantitative series, 600 questionnaires were provided and randomly distributed to all academic staff (708 in total). Importance-performance analysis and the customer-satisfaction index (IPA–CSI) were adapted and simultaneously employed to measure opinion along with relevance degree regarding QMOOCs (Silva and Fernandes, 2010; Sembiring, 2016). A structural-equation model (SEM) was finally used to identify plausible relations among all the variables involved (Marks et al., 2005; Wijayanto, 2008; Hair et al., 2009).
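The IPA–CSI step can be sketched as follows. This is a minimal illustration, not the study's implementation: the attribute scores are invented, and the CSI formula shown (importance-weighted mean performance as a percentage of the scale maximum) is one common variant of the index.

```python
def ipa_quadrants(perf, imp):
    """Classify attributes into the four IPA quadrants by comparing each
    attribute's mean performance and importance against the grand means."""
    p_bar = sum(perf.values()) / len(perf)
    i_bar = sum(imp.values()) / len(imp)
    labels = {}
    for a in perf:
        if perf[a] < p_bar and imp[a] >= i_bar:
            labels[a] = "Q1 Concentrate Here"
        elif perf[a] >= p_bar and imp[a] >= i_bar:
            labels[a] = "Q2 Maintain Performance"
        elif perf[a] < p_bar and imp[a] < i_bar:
            labels[a] = "Q3 Low Priority"
        else:
            labels[a] = "Q4 Possible Overkill"
    return labels

def csi(perf, imp, scale_max=5):
    """Customer-satisfaction index: importance-weighted mean performance
    expressed as a percentage of the scale maximum (one common variant)."""
    total_imp = sum(imp.values())
    weighted = sum(imp[a] * perf[a] for a in perf)
    return 100 * weighted / (total_imp * scale_max)

# Hypothetical example: four attribute codes from Table I with invented
# mean scores on a 5-point scale (not the study's actual data).
perf = {"I23": 2.1, "I31": 4.4, "I53": 2.2, "I12": 4.3}  # opinion (performance)
imp  = {"I23": 4.8, "I31": 4.7, "I53": 2.0, "I12": 2.1}  # relevance (importance)
quadrants = ipa_quadrants(perf, imp)
```

With these invented scores, I23 falls into Q1, I31 into Q2, I53 into Q3 and I12 into Q4, mirroring the quadrant logic used in Figure 3.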
Figure 2 also describes the features affecting QMOOCs (M) leading to learners' knowledge (D1), skill (D2) and professionalism (D3). The dimensions of QMOOCs should be scientifically provable (M1), technically feasible (M2), economically beneficial (M3) and socio-culturally adaptable (M4). QMOOCs (M) were assessed by perceiving the notions and attributes of presage (I1), process (I2), product (I3), practicability (I4), prospective (I5) and power (I6). The instrument (questionnaire) consisted of 2×28 questions related to the respondents' opinion of QMOOCs and the level of their importance, plus six additional questions to validate knowledge, skill and professionalism (whether or not they were affected by and relatable to QMOOCs) and one closing question on the future of MOOCs in the Universitas Terbuka tradition (63 questions in total). These results will afterwards be unified and compared with the qualitative framework established earlier.
This approach then statistically scrutinizes the following hypotheses (Figure 2):
H1: QMOOCs are directly influenced by presage.
H2: QMOOCs are directly influenced by process.
H3: QMOOCs are directly influenced by product.
H4: QMOOCs are directly influenced by practicability.
H5: QMOOCs are directly influenced by prospective.
H6: QMOOCs are directly influenced by power.
H7: Learner knowledge is directly influenced by QMOOCs.
H8: Skill is directly influenced by QMOOCs.
H9: Professionalism is directly influenced by QMOOCs.
Results and discussions
Before turning to the final upshots, it is valuable to present the respondent characteristics (Table II). This will enrich perspectives on the outcomes obtained afterwards. Other elaborative analyses are further detailed in the subsequent clarification (Table III, Figures 3 and 4).
Before describing the final results on the relational power amongst the notions and dimensions engaged and how they interrelate, it is worth revealing the level of respondent opinion on QMOOCs and its relevance degree resulting from the IPA–CSI chart. The analysis places opinion spots into the related quadrants (Q) to comprehend the degree of their relevance (Figure 3). The figure has four Qs: Q1 (Concentrate Here), Q2 (Maintain Performance), Q3 (Low Priority) and Q4 (Possible Overkill), following Wong et al. (2011).
Q1 has two attributes that should be carefully noted by the university: I23 (systematic process) and I64 (maintainable). Q1 indicates that respondent opinion is at a low level whereas the degree of relevance is high. Here, the university must pay attention to this evidence and give these attributes top priority so that the provision of MOOCs with high-quality measures might be fulfilled and the program itself (quality MOOCs) is more likely to be well received by most learners as users, being encouraging and advantageous.
Q2 includes 11 points that should be cautiously recognized by the university: I52 (corresponding); I24 (functional); I22 (inclusive); I31 (learner-focused); I21 (pedagogy aspect); I51 (novelty); I34 (superior product); M2 (technically feasible); I33 (appealing); I11 (the platform) and I32 (well-presented). Q2 signifies that respondent opinion and relevance degree are both at a high level. The university must take care of these 11 points so that more learners benefit and will pursue their programs through MOOCs with intent. All the attributes falling into this quadrant are the strengths and pillars of QMOOCs in the Universitas Terbuka context; these are the critical points for authenticating the quality measures of MOOCs.
Q3 has 13 points that should also be noted by the university: I53 (insightful); I43 (affable); I61 (encouraging); M1 (scientifically provable); I63 (satisfying); I42 (advantageous); I44 (manageable); I62 (inspiring); I41 (innovative); M3 (economically beneficial); M4 (socio-culturally adaptable); I14 (natural) and I13 (methodical). Q3 indicates that both respondent opinion and relevance degree are in a low category. The university should classify them all as the next focus after concentrating on the critical points found in Q1 and maintaining all the points in Q2. Any attribute falling into this quadrant is less important and poses no direct threat with respect to assuring MOOCs with good quality measures.
Finally, Q4 has two points: I12 (well-defined) and I54 (universal). Q4 indicates that these two points are rated highly by respondents but are considered of low relevance to approaching QMOOCs. Here, again, attention to the attributes in this quadrant might be reduced, so the university can save costs by redirecting resources to take up the crucial points in Q1 and maintain all the fundamental aspects found in Q2, satisfying learners' needs in the provision of QMOOCs.
Now let us observe the hypothesis analysis and the loading-factor outcomes from the examined model, as illustrated in Figure 4, to witness the real interrelations amongst the notions and dimensions involved as well as the power of their relations to one another.
Figure 4 clearly shows that one of the nine hypotheses examined was statistically not validated by the analysis (H5=1.42, namely prospective to QMOOCs), as its t-value is below the critical 1.96 for α=0.05. Contrariwise, the rest were all directly and positively validated by the analysis: H1=2.21 (presage to QMOOCs), H2=4.46 (process to QMOOCs), H3=6.61 (product to QMOOCs), H4=2.49 (practicability to QMOOCs), H6=2.01 (power to QMOOCs), H7=11.14 (QMOOCs to knowledge), H8=9.69 (QMOOCs to skill) and H9=14.26 (QMOOCs to professionalism), their t-values all being ⩾1.96 for α=0.05.
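The screening rule applied above can be stated compactly: a path is retained when its reported statistic exceeds the two-tailed critical value of 1.96 at α=0.05. A minimal sketch using the values reported in Figure 4:

```python
# Hypothesis screening against the two-tailed critical value at alpha = 0.05.
# The statistics are the values reported in Figure 4 of the study.
T_CRIT = 1.96

t_values = {
    "H1 presage -> QMOOCs": 2.21,
    "H2 process -> QMOOCs": 4.46,
    "H3 product -> QMOOCs": 6.61,
    "H4 practicability -> QMOOCs": 2.49,
    "H5 prospective -> QMOOCs": 1.42,
    "H6 power -> QMOOCs": 2.01,
    "H7 QMOOCs -> knowledge": 11.14,
    "H8 QMOOCs -> skill": 9.69,
    "H9 QMOOCs -> professionalism": 14.26,
}

# A hypothesis is validated when its statistic meets or exceeds the threshold.
validated = {h: t >= T_CRIT for h, t in t_values.items()}
rejected = [h for h, ok in validated.items() if not ok]
```

Running this check reproduces the paper's conclusion: only H5 (prospective to QMOOCs) fails the threshold, leaving eight validated paths.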
Having scrutinized the hypotheses and arranged all the notions and dimensions into their appropriate quadrants, we are now in a position to relate the loading factors of the tested quantitative framework, observing the power of their relations under the SEM approach to work out the end results. Figure 4 explicitly reveals five prime remarks that need to be elaborated and highlighted in further detail:
The first relates to the five main variables that directly and positively influence QMOOCs. They are ranked in order as: product (0.35), process (0.27), practicability (0.12), presage (0.10) and power (0.08). Note carefully that the most critical aspect here is the product of the MOOCs itself. Consulting the initial 3P model, extended into the 6P version for this study, the most influential factor remains one from the initial model, namely product (Biggs, 1993). This means that MOOCs are considered to have good quality measures as observed from the product itself first.
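The ordering above follows directly from sorting the standardized path coefficients reported in Figure 4, as a trivial sketch shows:

```python
# Standardized path coefficients toward QMOOCs, as reported in Figure 4.
paths = {
    "presage": 0.10,
    "process": 0.27,
    "product": 0.35,
    "practicability": 0.12,
    "power": 0.08,
}

# Rank the notions by coefficient, largest first.
ranked = sorted(paths, key=paths.get, reverse=True)
# ranked -> ["product", "process", "practicability", "presage", "power"]
```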
The second relates to the rank of the attributes associated with the five validated notions as the independent variables, namely:
Product (I3): learner-focused (I31=0.84), superior (I34=0.83), well-presented (I32=0.82) and appealing (I33=0.78). This implies that learner-focus is the most important aspect of the product of QMOOCs according to the staff, as compared to the other notions and dimensions in this setting. This result was also acknowledged by both designers and learners in the qualitative approach conducted beforehand.
Process (I2): pedagogical notion (I21=0.85), functional (I24=0.84), systematic (I23=0.81) and inclusive (I22=0.79). Here, pedagogy is the most critical aspect held by the staff. This was the main concern in assuring the quality of MOOCs, as also highlighted by both Gemage et al. (2015) and Jansen et al. (2017).
Practicability (I4): innovative (I41=0.87), affable (I43=0.85), manageable (I44=0.73) and advantageous (I42=0.71). Here, innovativeness is very important in providing QMOOCs; it is one way of making MOOCs more attractive from the learner's standpoint.
Presage (I1): platform (I11=0.88), methodical (I13=0.84), well-defined (I12=0.72) and natural (I14=0.67). In this part, the platform is still the most valuable measure according to most respondents.
Power (I6): maintainable (I64=0.86), satisfying (I63=0.78), encouraging (I61=0.76) and inspiring (I62=0.71). QMOOCs, in the frame of their power, are critically related to maintaining the substance experienced by learners.
The third concerns the order of the MOOCs dimensions viewed from a quality-measures outlook. It was discovered successively as follows: technically feasible (M2=0.80), scientifically provable (M1=0.78), economically beneficial (M3=0.68) and socio-culturally adaptable (M4=0.66). It is good to see that respondents placed the technical issue as the prime concern with respect to providing MOOCs with good quality measures. This also implies that what is crucial for them, as novices in the MOOCs movement in the Universitas Terbuka tradition, is the technical issues rather than the scientific and economic aspects, or even the social and cultural notions. In other words, they are more interested in knowing the technical issues for the future, so that learners have wider opportunities for a better future after completing their MOOCs programs.
The fourth concerns the relational power of QMOOCs towards the dependent variables. Figure 4 evidently confirmed that MOOCs with accepted quality measures had significant effects on professionalism (D3=0.35), followed by knowledge (D1=0.31) and skill (D2=0.23). This implies that most respondents strongly believed that QMOOCs are able to assure learners' professionalism and knowledge more than improve their skill; this is understandable given that the mode of delivery utilizes learning media and is fully ICT-based. That is, according to most respondents, acquiring skill involving practical work, through laboratory work for example, is more difficult than acquiring knowledge or even professionalism (indicated by becoming more creative and gritty).
The fifth concerns the rank within each dependent variable. For professionalism, in order: grit (D32=0.80) followed by creative (D31=0.63). For knowledge: operational (D12=0.83) and conceptual (D11=0.72). For skill: hard skill (D21=0.80) and soft skill (D22=0.80). It is important here that respondents are willing to endorse the MOOCs movement in the Universitas Terbuka milieu, as MOOCs are good for improving determination and creativity as part of professionalism in this fast-growing environment. This is rightly positive in relation to educating the nation without time and geographical constraints.
Before amalgamating the qualitative and quantitative results, it is worth considering the goodness-of-fit analysis of the tested operational framework. The analysis shows that all indices are in the good-fit category (Table III). This implies that the quantitative result is statistically reliable to be used as a point of reference for drawing inferential closing remarks to be unified with the qualitative construct established beforehand. It is important to bear in mind that, under the statistical approach, one of the independent variables (prospective) was excluded by the analysis as an influencer of QMOOCs. Since the tested framework was categorized as good fit, this upshot is reliable. All the same, the qualitative approach had previously established prospective as one of the main factors in QMOOCs. Further inquiry needs to be done to see how this result ends up with this slight difference.
Having collected and aggregated the outcomes accomplished by the quantitative and qualitative inquiries, three major validities need to be noticed and elaborated thoughtfully. The first concerns the conceptual and operational frameworks (Figures 2 and 4 and Table I). The second concerns the IPA–CSI chart (Figure 3). The third concerns the chosen methodological property (qualitative vs quantitative approaches).
It was quantitatively understood that professionalism was confirmed as the primary aspect, followed by knowledge and skill, as a result of MOOCs with good quality measures. This result differs slightly from the qualitative inquiry previously obtained from the literature, interviews and FGD series. Moreover, in terms of their order, the quantitative effects also showed slight dissimilarity. Nevertheless, they varied only at the level and order of the attributes of the dependent and independent variables. This is a good sign: it implies that the results obtained under the quantitative approach are still in the same sphere, with quite a low contradictory effect compared to the qualitative structure. In the qualitative structure, QMOOCs positively lead to knowledge first, followed by skill and professionalism. In the quantitative upshot, professionalism comes first, followed by knowledge and skill.
The quantitative outcomes here partially excluded the prospective variable with its notions, as compared to the qualitative framework; supplementary explanation is clearly needed for this difference. Why? MOOCs were initially intended to open more access and/or opportunities for the lifelong learning movement and continuous professional development at the same time, implying that the future of MOOCs is important. The quantitative end shows that the prospect of MOOCs was not included as one of the main important predictive factors, whereas in the qualitative structure it was considered by both designers and learners as a crucial element of QMOOCs. From Table II, it was detected that most respondents are novices in the MOOCs movement in the university environment. This implies that most of them have less experience of the MOOCs movement as a new approach to developing human resources from an Indonesian perspective. It also implies that the vast majority of them are more sensitive to searching for learning resources with a superior-quality product and practicability, rather than the prospect of the material in terms of its novelty and even appropriateness. In short, this is the validation of how and why the prospective notion was statistically excluded by the analysis. The rest of the quantitative outcomes are consistent with the qualitative marks.
The IPA–CSI chart results were reinforced quantitatively by the SEM outputs. Combined, these results will objectively direct the university in formulating alternative courses of action for future needs in anticipating and authenticating MOOCs with appropriate quality measures. The qualitative inquiry here was in line with the quantitative conclusion. It is a common phenomenon that most universities are limited by tangible resources, the 5-M (man, money, material, machine and method), in exploring new ways of improving the quality of any service, especially in developing and maintaining QMOOCs. Given this constraint, according to Sembiring (2016), it is right to re-formulate new ideas for effectively redirecting resources so that sufficient effort and support are available to deal with the aspects in Q1 while maintaining the critical aspects in Q2, as also suggested by Tileng et al. (2013). It should be cautiously taken into account that the maintainable attribute (a notion of power, I64) and the systematic attribute (a notion of process, I23) are two critical notions in developing QMOOCs: both performed at a low level yet remain highly relevant to providing QMOOCs.
This result will be incredibly useful in re-formulating what the university should put at the top of its priorities to fulfill the quality measures of MOOCs while satisfying learners' needs. The attributes that fell into Q1 (the Concentrate Here quadrant: the systematic and maintainable notions) should be managed astutely. At the same time, the 11 notions that fell into Q2 (maintaining performance) should be continuously preserved, as they are the pillars of QMOOCs, presuming the university intends to pursue good quality measures in MOOCs. By all means, the notions in Q1 should be moved into Q2. This will improve the likelihood of learners being satisfied, and the more satisfied learners are, the more likely they are to access and utilize the program (MOOCs). This implies that the university will be able to approach its vision through its three missions, namely enlarging access, developing the system and disseminating science and technology through the ODL mode of delivery, as initially projected in the strategic plan (Universitas Terbuka, 2017).
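The quadrant split used in the IPA discussion can be sketched in code. The attribute labels below reuse the study's notions, but the importance and performance scores are hypothetical, for illustration only; the study's actual means are not reproduced here.

```python
# Illustrative IPA quadrant classification (scores are hypothetical).
# Q1 (Concentrate Here): high importance, low performance.
# Q2 (Keep Up the Good Work): high importance, high performance.
def ipa_quadrant(importance, performance, imp_mean, perf_mean):
    if importance >= imp_mean and performance < perf_mean:
        return "Q1: Concentrate Here"
    if importance >= imp_mean and performance >= perf_mean:
        return "Q2: Keep Up the Good Work"
    if importance < imp_mean and performance < perf_mean:
        return "Q3: Low Priority"
    return "Q4: Possible Overkill"

# Hypothetical (importance, performance) scores on a 5-point scale
attrs = {
    "systematic (I23)": (4.6, 3.1),
    "maintainable (I64)": (4.5, 3.0),
    "learner-focused (I31)": (4.6, 4.2),
    "novelty (I51)": (3.5, 3.2),
}
imp_mean = sum(i for i, _ in attrs.values()) / len(attrs)
perf_mean = sum(p for _, p in attrs.values()) / len(attrs)
for name, (i, p) in attrs.items():
    print(name, "->", ipa_quadrant(i, p, imp_mean, perf_mean))
```

With these illustrative scores, the systematic and maintainable notions land in Q1 and the learner-focused notion in Q2, mirroring the pattern described above.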
Looking at the third effect, the mixed method used in this study (exploratory design) appears reliable even though slight, minor differences in the end results did take place. Those differences occurred insignificantly in the hypothesis testing (the relations remained positive), not in the conceptual outlook of the dependent variables, and they do not indicate a high intensity of contradiction. It can then be inferred that the differences that took place basically amplify perspectives on the context, should a comparable study be arranged in the near future.
At the end, respondents were asked a closing question: what is your perception and expectation of the MOOCs movement and the likelihood of its successful operation through the Universitas Terbuka experience? The answers indicate quite robust acceptance that, in the future, the university will be able to accomplish its initial plan of providing and maintaining the MOOCs movement. The distribution was: completely disagree, 3 percent; disagree, 12 percent; agree, 39 percent; strongly agree, 37 percent; and extremely agree, 9 percent. Up to 85 percent of respondents thus believed the university is on the right path to uphold its righteous missions in providing MOOCs for all, a good indication for the university.
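The arithmetic behind the 85 percent figure can be checked in a few lines; the percentages are exactly those reported for the closing question.

```python
# Closing-question Likert distribution, as reported in the text (percent).
dist = {"completely disagree": 3, "disagree": 12, "agree": 39,
        "strongly agree": 37, "extremely agree": 9}
assert sum(dist.values()) == 100  # the reported shares are exhaustive

# The 85 percent acceptance figure aggregates the three agreement levels.
agreement = dist["agree"] + dist["strongly agree"] + dist["extremely agree"]
print(f"Agreement share: {agreement}%")  # prints "Agreement share: 85%"
```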
Having discussed all the related notions and dimensions of QMOOCs, it is important to note that MOOCs hold the promise of transforming and enhancing future learning practices. The need has been identified for new strategies and measures to deal with high attrition rates by looking at the validated five main factors previously discussed, in conjunction with meeting learners' requirements and intentions. It has been suggested that quality indicators for MOOCs must take into account and evaluate individual goals and intentions as viewed from learners' attitudes (Stracke, 2017). The design has to address and enable the variety of individual motivations and intentions, as well as the attainments learners pursue. All these aspects can be recognized by probing personal objectives and their reflection, offering personalization of learning pathways, and measuring the success of MOOCs against actual individual intentions.
The research has created both qualitative and quantitative frameworks of quality measures for MOOCs, with their notions, dimensions and interrelations, in the Universitas Terbuka milieu, extended from a comprehensive analysis of the educational perspective and staff attitudes. The frame was validated using SEM by assessing empirical survey data from 299 respondents (Universitas Terbuka faculty members). The study ascertains that QMOOCs lead to professionalism first, followed by the knowledge and skill characteristics. Additionally, QMOOCs are affected by product, process, practicability, presage and power; the three main variables introduced by Biggs (1993) significantly influence QMOOCs. Under the IPA–CSI procedure, the two notions that should be cautiously watched in anticipating and fulfilling learners' satisfaction in using MOOCs were the maintainable and systematic notions, while eleven notions should be continuously highlighted at the same time.
Further research, however, is crucial: it should explore quality notions and dimensions beyond those included in the framework tested here, searching for the reasons behind the slight differences previously disclosed. The scope of the study should also be broadened beyond Universitas Terbuka faculty members. This would put forward more comprehensive perspectives on the professionalism, knowledge and skill variables with reference to QMOOCs in meeting the needs of open and distance learners. This, in turn, should improve learners' completion rates in MOOCs offered through Universitas Terbuka, in line with Prena et al. (2014).
Referring to the preceding analysis, and as also underlined by Stracke (2017), consequences (for learners) and recommendations (for designers) need to be drawn regarding the provision of QMOOCs. Both learners and designers are required to recognize the objective, realization and achievement aspects of involvement in MOOCs. Consequently, learners should be aware of their different individual goals, various learning strategies and different intentions to achieve in MOOCs on the one hand; on the other, designers should be more sensitive in asking for individual learning objectives and their reflection, offering personalization of learning pathways, and measuring the success of MOOCs against individual goals or intentions.
In short, this will provide an opportunity for the university to be more contributive in supporting the government of Indonesia in removing restraints on access to quality higher education. If this awareness is emblematic worldwide, as indicated by Zhenghao et al. (2015), management and academics elsewhere are also well advised to ruminate on the notions and dimensions of QMOOCs explored here, so as to prolong the continued existence of their institutions in providing MOOCs of supreme quality that are, more importantly, learner-centered and well presented.
For the nation, through the Universitas Terbuka experience, professionalism, knowledge and skill can be attained through the provision of MOOCs with the utmost quality measures. This means the university is on the right path in pursuing its righteous mission of making higher education open to all and protecting the nation through flexible quality education. The university will be in harmony with realizing the vision of becoming a world-quality institution preparing world-quality graduates (Universitas Terbuka, 2014, 2017; Sembiring, 2015, 2016).
Variables, dimensions and remarks
| No. | Variable | Dimensions |
| --- | --- | --- |
| 1 | Presage (I1) | I11: platform |
| 2 | Process (I2) | I21: pedagogy |
| 3 | Product (I3) | I31: learner-focused |
| 4 | Practicability (I4) | I41: innovative |
| 5 | Prospective (I5) | I51: novelty |
| 6 | Power (I6) | I61: encouraging |
| 7 | Quality MOOCs (M) | M1: scientifically provable; M2: technically feasible; M3: economically beneficial; M4: socio-culturally adaptable |
| 8 | Knowledge (D1) | D11: conceptual level; D12: operational level |
| 9 | Skill (D2) | D21: hard skill; D22: soft skill |
| 10 | Professionalism (D3) | D31: being creative; D32: being grit |

Notes: I, M and D stand for independent, moderating and dependent variables, respectively (I1–I6, M and D1–D3). Each independent variable (I) has four dimensions and four questions. These questions were answered twice concurrently: the first part of each question measured the respondent's opinion level and the second part its importance degree. M was influenced by I1–I6; D1–D3 are determined by M, and questions on these three variables were answered once only. Total questions: 63 ((2×28)+(1×6)+1)
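The total item count in the table notes can be reproduced arithmetically. The decomposition of the 28 dual-part items into 24 independent-variable questions plus the 4 QMOOCs dimensions is an inference from the notes, not stated verbatim.

```python
# Reproducing the questionnaire item count from the table notes.
independent_items = 6 * 4   # six I variables x four questions each (inferred split)
moderating_items = 4        # QMOOCs dimensions M1-M4 (inferred split)
dual_items = independent_items + moderating_items  # 28 items answered twice
dependent_items = 3 * 2     # D1-D3, two dimensions each, answered once
closing_question = 1
total = 2 * dual_items + dependent_items + closing_question
print(total)  # prints 63, matching (2x28)+(1x6)+1
```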
| Characteristic | Distribution (%) |
| --- | --- |
| Faculty (299 respondents) | Education: 31; Social science: 26; Economics: 21; Math: 22 |
| Echelon | One: 0; Two: 1; Three: 4; Four: 9; Non-echelon: 86 |
| Experience (years) | 1–5: 2; 6–10: 10; 11–15: 21; 16–20: 26; 21+: 41 |
| Age (years) | <29: 4; 30–39: 12; 40–49: 52; 50–59: 24; 60+: 8 |
| Involved in MOOCs (years) | <1: 43; 1–2: 49; 3–4: 8; 5–6: 0; 7+: 0 |
| Credential | Professor: 0; Senior lecturer: 8; Lecturer: 89; Assistant lecturer: 3 |
| Education | Doctoral: 6; Master: 92; Bachelor: 2; Diploma: 0 |
Notes: Three designers and five learners were involved in the qualitative approach. In addition, 299 completed questionnaires (out of 600 distributed to the 708 academic staff) were finally processed. It is worth noting that most staff are relatively inexperienced with MOOCs despite being senior in terms of working experience and age; the vast majority hold a master's degree and are non-echelon staff. To a certain extent, they can be categorized as novices in the MOOCs movement within the Universitas Terbuka tradition
Goodness of fit of the model
| Goodness of fit | Cut-off value | Result | Remark |
| --- | --- | --- | --- |
| Root mean square residual (RMSR) | ⩽0.05 or ⩽0.1 | 0.071 | Good fit |
| Root mean square error of approximation (RMSEA) | ⩽0.08 | 0.056 | Good fit |
| Goodness of fit index (GFI) | ⩾0.90 | 0.970 | Good fit |
| Adjusted goodness of fit index (AGFI) | ⩾0.90 | 0.960 | Good fit |
| Comparative fit index (CFI) | ⩾0.90 | 0.980 | Good fit |
| Normed fit index (NFI) | ⩾0.90 | 0.960 | Good fit |
| Non-normed fit index (NNFI) | ⩾0.90 | 0.980 | Good fit |
| Incremental fit index (IFI) | ⩾0.90 | 0.980 | Good fit |
| Relative fit index (RFI) | ⩾0.90 | 0.950 | Good fit |
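As an illustrative cross-check, the reported values can be compared with the cut-offs programmatically. The values and thresholds are exactly those in the table; this sketch is not part of the original analysis.

```python
# Checking each reported fit statistic against its cut-off from the table.
cutoffs = {
    "RMSR":  (0.071, lambda v: v <= 0.10),  # <=0.05 (strict) or <=0.1
    "RMSEA": (0.056, lambda v: v <= 0.08),
    "GFI":   (0.970, lambda v: v >= 0.90),
    "AGFI":  (0.960, lambda v: v >= 0.90),
    "CFI":   (0.980, lambda v: v >= 0.90),
    "NFI":   (0.960, lambda v: v >= 0.90),
    "NNFI":  (0.980, lambda v: v >= 0.90),
    "IFI":   (0.980, lambda v: v >= 0.90),
    "RFI":   (0.950, lambda v: v >= 0.90),
}
for name, (value, meets) in cutoffs.items():
    print(f"{name}: {value} -> {'good fit' if meets(value) else 'poor fit'}")
```

Every index clears its threshold, consistent with the "Good fit" remarks in the table.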
Biggs, J. (1993), “From theory to practice: a cognitive systems approach”, Higher Education Research & Development, Vol. 12 No. 1, pp. 73-85, available at: http://dx.doi.org/10.1080/0729436930120107 (accessed March 2, 2017).
Biggs, J. (1996), “Enhancing teaching through constructive alignment”, Higher Education, Vol. 32 No. 3, pp. 347-364.
Cochran, W.G. (1977), Sampling Techniques, 3rd ed., John Wiley & Son, New York, NY.
Creswell, J.W. and Clark, V.L.P. (2011), Designing and Conducting Mixed Methods Research, 2nd ed., Sage Publication, Los Angeles, CA.
Downes, S. (2013), “The quality of massive open online courses”, available at: http://mooc.cfquel.org/files/2013/05/week2-The-quality-of-massive-open-online-courses-StephenDownes.pdf (accessed April 24, 2013).
Fowler, F.J. Jr (2014), Survey Research Methods, 5th ed., Sage Publication, Los Angeles, CA.
Gamage, D., Fernando, S. and Perera, I. (2015), “Quality of MOOCs: a review of literature on effectiveness and quality aspects”, 8th IEEE Ubi-Media Conference, August, Colombo, pp. 224-229, doi: 10.1109/UMEDIA.2015.7297459.
Hair, J.F. Jr, Black, W.C., Babin, B.J. and Anderson, R.E. (2009), Multivariate Data Analysis with Readings, 7th ed., Prentice Hall, NJ.
Hood, N. and Littlejohn, A. (2016), Quality in MOOCs: Surveying the Terrain, Commonwealth of Learning, Burnaby.
Jansen, D., Rosewell, J. and Kear, K. (2017), “Quality frameworks for MOOCs”, in Jemni, M. and Kinshuk, M.K. (Eds), Open Education: From OERs to MOOCs, Lecture Notes in Educational Technology, Springer, Heidelberg and Berlin, pp. 261-281.
Lin, Y.-L., Lin, H.-W. and Hung, T.-T. (2015), “Value hierarchy for massive open online courses”, Computers in Human Behavior, Vol. 53 No. C, pp. 408-418.
Littlejohn, A., Hood, N., Milligan, C. and Mustain, P. (2016), “Learning in MOOCs: motivations and self-regulated learning in MOOCs”, Internet and Higher Education, Vol. 29, pp. 40-48.
Margaryan, A., Bianco, M. and Littlejohn, A. (2015), “Instructional quality of massive open online courses (MOOCs)”, Computers & Education, Vol. 80, pp. 77-83.
Marks, R.B., Sibley, S.D. and Arbaugh, J.B. (2005), “A structural equation model of predictors for effective online learning”, Journal of Management Education, Vol. 29 No. 4, pp. 531-563.
Prena, L., Ruby, A., Boruch, R., Wang, N., Scull, J., Ahmad, S. and Evans, C. (2014), “Moving through MOOCs: understanding the progression of users in massive open online courses”, Educational Researcher, Vol. 43 No. 9, pp. 421-432.
Sembiring, M.G. (2008), The Art of Great Teaching Series: Menjadi Guru Sejati (of Becoming the Great Teacher), Galang Press, Yogyakarta.
Sembiring, M.G. (2015), “Validating student satisfaction related to persistence, academic performance, retention and career advancement within open distance learning perspectives”, Open Praxis, Vol. 7 No. 4, pp. 311-323.
Sembiring, M.G. (2016), “Exposing academic excellence and satisfaction related to persistence perceived by open distance learning graduates”, paper presented at the Educational Technology World Conference, Bali, July 31–August 3, available at: www.seminars.unj.ac.id/etwc (accessed August 5, 2016).
Shahzavar, Z. and Tan, B.H. (2011), “Developing a questionnaire to measure students’ attitude toward the course blog”, Turkish Online Journal of Distance Education, Vol. 13 No. 1, pp. 200-210.
Silva, F. and Fernandes, O. (2010), “Using importance-performance analysis in evaluating higher education: a case study”, International Conference on Education and Management Technology, IEEE, pp. 121-123.
Stracke, C.M. (2017), “The quality of MOOCs: how to improve the design of open education and online courses for learners?”, in Zaphiris, P. and Ioannou, A. (Eds), Learning and Collaboration Technology. Novel Learning Ecosystems, Lecture Notes in Computer Science, Vol. 10295, Springer, Cham, pp. 285-293.
Sumner, J. (2000), “Serving the system: a critical history of distance education”, Open Learning, Vol. 15 No. 3, pp. 267-285.
Tileng, M.Y., Wiranto, H.U. and Latuperissa, R. (2013), “Analysis of service quality using servqual method and IPA in population department, Tomohon City, South Sulawesi”, International Journal of Computer Applications, Vol. 70 No. 19, pp. 23-30.
Tjiptono, F. and Chandra, G. (2011), Service, Quality and Satisfaction, Penerbit Andi, Yogyakarta.
Tran, N.D. (2015), “Reconceptualisation of approaches to teaching evaluation in higher education”, Issues in Educational Research, Vol. 25 No. 1, pp. 50-61, available at: www.iier.org.au/iier25/tran.html
Universitas Terbuka (2014), Strategic and Operational Planning of Universitas Terbuka 2014–2021, Universitas Terbuka, Tangerang Selatan.
Universitas Terbuka (2017), Rector Office Yearly Report of 2016, Universitas Terbuka, Tangerang Selatan.
Wijayanto, S.H. (2008), Structural Equation Modeling – Lisrel 8.80, Graha Ilmu Press, Yogyakarta.
Wong, M.S., Hideki, N. and George, P. (2011), “The use of importance-performance analysis in evaluating Japan’s e-government services”, Journal of Theoretical and Applied Electronic Commerce Research, Vol. 6 No. 2, pp. 17-30, available at: www.jtaer.co (accessed March 9, 2016).
Zhenghao, C., Alcorn, B., Christensen, G., Eriksson, N., Koller, D. and Emanuel, E. (2015), “Who’s benefiting from MOOCs, and why”, Harvard Business Review, available at: https://hbr.org/2015/09/whos-benefiting-from-moocs-and-why