Although student-centered learning (SCL) has been encouraged in higher education for decades, the extent to which instructors actually practice SCL strategies remains in question. The purpose of this paper is to investigate university faculty’s understanding and perceptions of SCL, along with their current instructional practices, in Qatar.
A mixed-methods research design was employed, combining quantitative data from a survey in which faculty reported their current instructional practices with qualitative data from interviews with 12 instructors on how they define SCL and perceive their current practices. Participants of the study were mainly from science, technology, engineering and mathematics (STEM) fields.
Study results show that these instructors hold rather inclusive definitions of SCL, ranging from lectures to student interactions via problem-based teamwork. However, a gap between the instructors’ perceptions and their actual practices was identified. Although student activities are generally perceived as effective teaching strategies, the interactions observed were mainly student–content or student–teacher, while student–student interactions were limited. The prevailing assessment methods are summative, while formative assessment is rarely practiced. Faculty attributed this gap between how SCL could and should be practiced and the reality of their classrooms to external factors, including students’ lack of maturity and motivation, which they associated with Middle Eastern culture, and institutional constraints such as class time and class size.
The study is limited in a few ways. First, regarding methodology, the data collection relied mainly on the faculty’s self-reporting. Second, the limited number of participants restricts the study’s generalizability: the survey was administered on a volunteer basis, and the small number of interview participants makes it difficult to establish clear patterns. Third, researching faculty members raises concerns in the given context, wherein extensive faculty assessments are regularly conducted.
A list of recommendations is provided here as inspiration for institutional support and faculty development activities. First, faculty need a deep understanding of SCL, gained through experiences as learners, so that they can become genuine believers in and implementers of it. Second, faculty need the autonomy to adopt assessment methods that are aligned with their pedagogical objectives and delivery methods. Input on how faculty can adapt an instructional innovation to the local context is very important for its long-term effectiveness (Hora and Ferrare, 2014). Third, an inclusive approach to faculty evaluation that encourages faculty from STEM backgrounds to engage in research on their own instructional practice will not only sustain the practice of innovative pedagogy but also enrich the research profiles of STEM faculty and their institutions.
The faculty’s understanding and perceptions of implementing student-centered approaches were closely linked to their prior experiences – experiencing SCL as a learner may better shape the understanding and guide the practice of SCL as an instructor.
SCL is not a new topic; however, the reality of its practice is constrained by social and cultural contexts. This study contributes original and valuable insights into the gap between ideology and reality in the implementation of SCL in a Middle Eastern context.
Saed Sabah and Xiangyun Du (2018), "University faculty’s perceptions and practices of student centered learning in Qatar: Alignment or gap?", Journal of Applied Research in Higher Education, Vol. 10 No. 4, pp. 514-533.
Emerald Publishing Limited
Copyright © 2018, Saed Sabah and Xiangyun Du
Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
In general, higher education (HE) faces challenges in providing students with reasoning and critical thinking skills, problem formulation and solving skills, collaborative skills and the competencies required to cope with the complexity and uncertainty of modern professions (Henderson et al., 2010; Seymour and Hewitt, 1997; Martin et al., 2007; Smith et al., 2009). HE research often reports that traditional lecture-centered education does not provide satisfactory solutions to these challenges (Du et al., 2013; Smith et al., 2009), thereby failing to facilitate students’ meaningful learning of their subjects (Henderson et al., 2010). In some cases, it has resulted in a deficit of university graduates from certain fields, in particular, science, technology, engineering and mathematics (STEM) fields (Graham et al., 2013; Seymour and Hewitt, 1997; Watkins and Mazur, 2013). A change in instructional practices is believed to be necessary to provide students with the requisite skills and competencies, and could potentially serve as a retention strategy in these particular fields such as STEM (Graham et al., 2013; Seymour and Hewitt, 1997; Watkins and Mazur, 2013). Therefore, it is essential to innovate the pedagogical methods and practices used in these fields (American Association for the Advancement of Science (AAAS), 2013; Henderson et al., 2010).
Instructional change has resulted in a variety of pedagogical reform initiatives that have been encouraged in STEM classroom practices, including active learning, inquiry-based learning, collaborative learning in teams, interactive learning, technology-enhanced learning and peer instruction. A substantial body of literature has reported research results regarding how these innovative instructional strategies affect student learning (Graham et al., 2013; Henderson et al., 2010; Watkins and Mazur, 2013). Despite a worldwide trend in instructional change toward student-centered learning (SCL), to what extent university instructors are implementing these strategies and how they perceive this change is still in question. The international literature has reported that lecture remains the prevailing instructional practice in STEM classrooms despite the waves of pedagogical innovation encouraged at an institutional level (Hora and Ferrare, 2014; Froyd et al., 2013; Prince and Felder, 2006; Walczyk and Ramsey, 2003). In addition, STEM faculty may discontinue their practice of certain types of instructional innovation at certain stages of innovation diffusion for various reasons, including institutional challenges such as a heavy workload and large class sizes, and a lack of individual interest (Henderson and Dancy, 2009). Furthermore, the fidelity of the implementation of SCL approaches is also in question (Borrego et al., 2013). Therefore, this study aims to investigate how faculty who work as instructors in STEM undergraduate programs report their instructional practices and how they perceive the implementation of SCL instructional strategies in their situated contexts.
2. Literature review
Over the past few decades, a global movement has emerged calling for a new model of learning for the twenty-first century, highlighting key elements that include solving complex problems, communication, collaboration, critical thinking, creativity, responsibility, empathy and management, among others (NEA, 2010; Scott, 2015). Following this trend, university teaching and learning have transformed from being lecture based and teacher centered to focusing more on engaging and enhancing student learning (Barr and Tagg, 1995; Kolmos et al., 2008; Slavich and Zimbardo, 2012). In the process of this transformation, SCL has become a well-used concept. Defined as an approach that “allows students to shape their own learning paths and places upon them the responsibility to actively participate in making their educational process a meaningful one” (Attard et al., 2010, p. 9), SCL is focused on providing an active-learning environment in flexible curricula with the use of learning outcomes to understand student achievements (pp. 10-12). Rooted in a constructivist approach that moves beyond mere knowledge transmission, such learning is conceived as a process whereby learners search for meaning and generate meaningful knowledge based on prior experiences (Biggs and Tang, 2011; Dewey, 1938).
In the STEM fields, instructional practices of instructors are changing from teacher-directed approaches to student-centered approaches to improve the quality of undergraduate education (Justice et al., 2009). A substantial number of studies have reported the positive effects of a variety of approaches to student-centered pedagogy in STEM HE, such as active learning (Felder et al., 2000; Freeman et al., 2014), small-group learning (Felder et al., 2000; Freeman et al., 2014; Springer et al., 1999; Steinemann, 2003), and inquiry-based pedagogy (Anderson, 2002; Curtis and Ventura-Medina, 2008; Duran and Dökme, 2016; Ketpichainarong et al., 2010; Martin et al., 2007; Simsek and Kabapinar, 2010). Furthermore, problem- and project-based pedagogy has been well documented as an effective way to help students not only construct subject knowledge meaningfully, but also develop the skills necessary for many professions, including critical thinking, problem solving, communication, management and collaboration (Bilgin et al., 2015; Du et al., 2013; He et al., 2017; Kolmos et al., 2008; Lehmann et al., 2008; Steinemann, 2003; Zhao et al., 2017).
Definitions of these terminologies vary and the term SCL in particular is not always used with consistent meaning. However, a few points of agreement can be summarized (Rogers, 2002): who the learners are, the context, the learning activities and the processes. Weimer (2002) identifies five key areas for change in the process of transformation from teacher-centered to learner-centered classrooms: the balance of power, the function of content, the role of the teacher, the responsibility for learning, and the purpose and process of evaluation. In relation to the practice and implementation of a student-centered approach, Brook (1999) provides a list of guiding principles for the development of constructivist teachers who prioritize SCL strategies in HE. These are: using problems that are relevant to students, constructing learning around principal concepts, eliciting and appreciating students’ perspectives, adjusting the curriculum and syllabus to address students’ prior experience, and linking assessment to learning goals and student learning.
A wide range of perspectives has been addressed in previous studies on SCL in HE. Brook (1999), Rogers (2002) and Weimer (2002) provide a synthesis of guiding principles suggesting three dimensions of focus: instructors (how they understand and perceive the instructional innovation they are expected to adopt), student activity and interaction, and assessment.
The instructor represents an important and challenging aspect of instructional change, particularly regarding innovative pedagogy and SCL (Ejiwale, 2012; Kolmos et al., 2008; Weimer, 2002). In a teacher-centered environment, instructors play the dominant role in defining objectives, content, student activities and assessment, whereas in an SCL environment, instructors facilitate learning by providing opportunities for students to be involved in decision-making regarding goals, content, activities and assessment. Nevertheless, in the reality of instructional practice, instructors face the dilemma of, on the one hand, giving students the freedom to make decisions on their own and, on the other hand, retaining control of classroom activities (Du and Kirkebæk, 2012). In addition, how instructors handle the changes in their relationships with students is a determining factor in the extent to which SCL can be established. In a meta-analysis of student–teacher relationships in student-centered environments, Cornelius-White (2007) suggests that positive teaching relationship variables, such as empathy, warmth, encouragement and motivation, are associated with learner participation, critical thinking, satisfaction, drop-out prevention, positive motivation and social connection. In their proposal for developing pedagogical change strategies in STEM, Henderson et al. (2010) emphasize that the beliefs and behaviors of individual instructors should be targeted because they are essential to any strategy for changing classroom practices and environments. In general, the existing literature agrees that for pedagogical change strategy development, it is essential to work with instructors and to understand their current instructional practices as well as their perceptions of the change.
A student-centered approach emphasizes providing students with opportunities to participate and engage in activities while interacting with the subject matter, the teacher and each other. Student responsibility and ownership of their own learning is regarded as essential in facilitating classroom interactions. Self-governance of the interactions can be enhanced through collaborative group work when students are expected to negotiate and reach consensus on how to work and learn together. Instead of meeting an objective set by the instructors, students should take responsibility for organizing learning activities in order to reach goals they themselves set (Du et al., 2013; Weimer, 2002). The function of teaching content lies in aiding students in learning how to learn, rather than in the transmission of factual knowledge (Du and Kirkebæk, 2012).
Student-centered instructional strategies and practices require a change of assessment methods. Formative assessment, which refers to assessment methods intended to generate feedback on learner performance to improve learning, is often used to facilitate self-regulated learning (Nicol and Macfarlane-Dick, 2006). In their review of formative assessment, Black and Wiliam (1998) summarize the effectiveness of this method in relation to different types of outcomes, educational levels and disciplines. As they emphasize, the essential aspect that defines the success of formative assessment is the quality of the feedback provided to learners, both formally and informally. Furthermore, in formative assessment, the process of learning through feedback and dialogue between teachers and students and among students is highly accentuated. Various formative assessment methods have been reported as additional or alternative methods to the prevailing summative assessment methods in STEM in order to align assessment constructively with the implementation of SCL (Downey et al., 2006; Prince and Felder, 2006).
To plan and implement meaningful initiatives for improving undergraduate instruction, it is important to collect data on instructors’ instructional practices (Williams et al., 2015). Nevertheless, the existing literature has mainly focused on students’ attitudes, performance and feedback on SCL. A limited number of studies have examined the outcomes of faculty development activities that encourage research-based instructional strategies for SCL. These studies report a good level of faculty knowledge and awareness of various alternative instructional strategies in the fields of physics education (Dancy and Henderson, 2010; Henderson et al., 2012) and engineering and science education (Brawner et al., 2002; Borrego et al., 2013; Froyd et al., 2013). However, instructors’ adoption of teaching strategies varies according to individual preferences and beliefs, disciplinary contexts and institutional policy (Borrego et al., 2013; Froyd et al., 2013), and both their persistence in the adoption and current use of these strategies (Hora and Ferrare, 2014; Henderson and Dancy, 2009; Walter et al., 2016) and the fidelity of implementation (how closely it follows the original plan) (Borrego et al., 2013) remain in question.
Therefore, there is a need for additional studies addressing instructors’ understanding, beliefs and perceptions about practicing SCL that impact their instructional design for classroom interactions, and how they construct assessment methods to align with their adoption of instructional strategies. Further research should examine how instructors perceive their roles and experiences in the process of instructional change.
3. Present study
The state of Qatar has the vision of transforming itself into a knowledge-producing economy (General Secretariat for Development Planning, 2008; Rubin, 2012). Accordingly, advancement in the fields of science and technology is a critical goal, as is promoting pedagogical practices that support engagement in science and technology education (Dagher and BouJaoude, 2011). Qatar University (QU) is the country’s foremost institution of HE and aims to become a leader in economic and social development in Qatar. In its strategic plan for 2013–2016 (Qatar University (QU), 2012), the leadership of QU has called for instructional innovation toward SCL by developing “the skills necessary in the 21st century such as leadership, teamwork, communications, problem-solving, and promoting a healthy lifestyle” (QU, 2012, p. 13). It is expected that these initiatives will be implemented at the university level, particularly in the STEM fields.
Research on general university instructional practices in Qatar remains sparse, with little information available on current instructional practices and the extent to which student-centered teaching and learning strategies are being implemented. In a recent study, the first on university instructional practices in Qatar, Al-Thani et al. (2016) reported that, across disciplines, instructors prioritized lecture-based and teacher-centered instructional practices. For example, most participants stressed lecture and content clarity as the most important and effective practices. In contrast, student–student interaction, the integration of technology and instructional variety received less attention in participants’ accounts. However, little is known about either actual classroom practices or instructors’ perceptions of SCL, particularly in STEM fields.
To develop feasible change strategies that could be applied in the Qatar context with the aim of facilitating innovation in HE in general and STEM education in particular, it is essential to understand current instructional practices, how instructors perceive SCL, and what strategies are being implemented (Henderson et al., 2010). Therefore, this study aims to investigate STEM faculty’s perceptions and instructional practices of SCL in Qatar. The purpose is to generate knowledge on the research-based evaluation of STEM faculty’s instructional practices. The study formulates the following research questions:
What are the instructional practices of STEM faculty in Qatar?
To what extent are instructors’ current practices student-centered?
How do STEM faculty perceive SCL, possibilities for implementation and challenges in classroom practice?
4. Research methods
4.1 Research design
Ideally, the study of STEM instructional practices involves the use of multiple techniques. The methods commonly used to investigate university teaching practices include interviews with instructors and students, portfolios written by instructors, surveys of instructors and students, and observations in educational settings (AAAS, 2013). However, in reality, research conditions limit the choice of data collection methods (Creswell, 2013). Although classroom observation and portfolios are widely practiced in schools and can be a potential method for improving university teaching and learning, they rarely occur in practice except in cases of faculty promotion, evaluation or professional development requests (AAAS, 2013). In addition, peer and protocol-based observations demand significant resources in terms of human labor, materials, equipment and physical conditions, which makes them challenging to implement on a larger scale (Walter et al., 2016). Therefore, a mixed-methods research design combining the strengths of quantitative and qualitative data – surveys and interviews – was employed as the major data generation method in this study (Creswell, 2002).
An open invitation was sent to the entire faculty in the science, engineering, mathematics and health sciences fields, asking them to consider participating on a voluntary basis. A sample of 65 faculty members (23.4 percent female and 76.4 percent male) completed the questionnaire.
4.3 Data generation methods
Survey and instruments
A self-reported questionnaire survey is one of the most efficient ways to gain information due to its accessibility, convenience of administration and relative time efficiency (AAAS, 2013, p. 7). Despite the common concern that faculty may inaccurately self-report their teaching practices, recent literature reports that some aspects of instruction can be accurately reported by instructors (Smith et al., 2014); this approach helps to identify instructional practices that are otherwise difficult to observe (Walter et al., 2016).
The Postsecondary Instructional Practices Survey (PIPS) (Walter et al., 2016) is a newly developed instrument aimed at investigating university teaching practices cross-disciplinarily from the perspective of instructors. The PIPS was developed on the basis of a conceptual framework constructed from a critical analysis of existing survey instruments (Walter et al., 2015), the observation codes of the Teaching Dimensions Observational Protocol (Hora et al., 2012), and the Reformed Teaching Observation Protocol (Piburn et al., 2000). The PIPS has been proven to be valid and reliable while providing measurable variables, and results from initial studies have shown that PIPS self-reported data are compatible with the results of several Teaching Dimensions Observational Protocol codes (Walter et al., 2016).
The PIPS includes 24 items for statements and reports regarding instructional practice and demographic questions on items such as gender, rank and academic titles. An intuitive, proportion-based scoring convention is used to calculate the scores. Two models are used for the supporting analysis – a two-factor or five-factor solution. Factors in the five-factor model include: six items for student–student interactions, four items for content delivery, four items for formative assessment, five items for student–content engagement and four items for summative assessment. Factors in the two-factor model include: nine items for instructor-centered practices and 13 items for student-centered practices. The responses from participants were coded as (0) not at all descriptive of my teaching, (1) minimally descriptive of my teaching, (2) somewhat descriptive of my teaching, (3) mostly descriptive of my teaching and (4) very descriptive of my teaching.
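To make the scoring concrete, the sketch below computes a factor score under one plausible reading of a proportion-based convention: the sum of a factor’s 0–4 item responses divided by the maximum attainable sum. The function and the item values are illustrative assumptions for this article, not the published PIPS scoring code.

```python
def factor_score(responses):
    """Proportion-based factor score: sum of 0-4 item responses
    divided by the maximum attainable sum (4 per item).
    Illustrative only; not the official PIPS scoring code."""
    if not all(0 <= r <= 4 for r in responses):
        raise ValueError("responses must be on the 0-4 scale")
    return sum(responses) / (4 * len(responses))

# e.g. a four-item factor answered 3, 4, 3 and 2:
score = factor_score([3, 4, 3, 2])  # 12 / 16 = 0.75
```

Under this reading, scores near 1 would indicate that a factor’s practices are “very descriptive” of an instructor’s teaching, and scores near 0 that they are “not at all descriptive.”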
An interview can provide opportunities to explore teaching practices through interactions with the participants. It can also provide space for in-depth questions on specific teaching practices as well as perceptions, beliefs, opinions and potentially unexpected findings (Creswell, 2013). During the interviews (see the Appendix for the interview guidelines), participants were asked about their understanding of and past experiences with SCL, their perceptions of the effectiveness of practicing SCL in general and in their current environments in particular, what challenges and barriers they had experienced, and what institutional support is needed.
The questionnaire was sent to all participants in early spring 2017 and was administered via Qualtrics. An explanation of the goals of the survey, namely, to understand participants’ current practices with no intention of assessment, was provided. A pilot test was conducted with several colleagues who were not participants to ensure that the questions were unambiguous and addressed the goals.
The 65 respondents were from the schools of sciences, health sciences, pharmacy and engineering. Their average HE teaching experience was 14.5 years. About 15.6 percent of participants were full professors, 39.1 percent were associate professors, 31.3 percent were assistant professors and 14 percent were instructors or lecturers. About 58.6 percent of the participants did not have a leadership role (e.g. head of department, chair of curriculum committee).
In total, 12 of the 65 faculty members who completed the questionnaire (four female and eight male) responded positively to the individual interview request. The interview participants included a representative range of STEM faculty by academic title (three professors, three associate professors and six assistant professors). Table I shows details of the interview participants’ background information.
5. Analyses and findings
5.1 Quantitative data analysis and results
To answer the first research question, the mean and standard deviation of each item were calculated to identify the practices that best describe STEM faculty teaching in the given context. The grand mean for each factor was also calculated. The descriptive statistics for participants’ responses to the PIPS are presented in Table II.
The participants reported that the items of factor 2 (F2), content-delivery practices, were mostly descriptive of their teaching (M=3.14). That is, the items stating that their syllabus contains the specific topics that will be covered in every class session (M=3.58), that they structure the class session to give students good notes (M=3.18), and that they guide students as they listen and take notes (M=2.89) were mostly descriptive of their content delivery.
The grand mean of student–content engagement (F4) was relatively high (M=3.07). This means that, for example, instructors frequently ask students to respond to questions during class time (M=3.49) and frequently structure problems so that students are able to consider multiple approaches to finding a solution.
As to the student–student interaction factor (F1), the grand mean (M=2.18) was relatively low compared to the other factors. The item means ranged from 1.9 to 2.51, with the maximum possible value being 4. Compared with the other items of this factor, item P13 (“I structure class so that students constructively criticize one another’s ideas”) had the lowest mean (M=1.9), which indicates that this practice is somewhat, but not mostly or very much, descriptive of instructors’ practices. The item concerning structuring the class so that students discuss the difficulties they have with the subject matter with other students also had a low mean (M=2.06).
The formative assessment factor (F3) also had a relatively low grand mean (M=2.62). The mean of item P20 was 1.82, indicating that providing feedback on student assignments without assigning a formal grade was not very descriptive of QU instructors’ practices. The means for the rest of the items ranged from 2.7 to 2.98. Using student comments and questions to determine the direction of classroom discussions (M=2.95) and using student assessment results to guide the direction of instruction (M=2.98) were mostly descriptive of QU instructors’ practices, as reported by participants.
The summative assessment factor (F5) had a low grand mean (M=2.35). This relatively low mean was greatly impacted by item P24 (“I adjust student scores [e.g. curve] when necessary to reflect a proper distribution of the grades”). In the given context, instructors are not allowed to adjust student scores, so the result of this item reflects university policy rather than individual instructors’ preferences. An analysis excluding item P24 shows a different picture: the mean of the summative assessment factor without item P24 becomes 2.84. Thus, the student–student interaction factor and the formative assessment factor represent the lowest means in this study.
To answer the second research question, a paired samples t-test was conducted to compare the mean of student-centered items (P02, P04, P06-10, P12-16, P18-20) with the mean of the instructor-centered items (P01, P03, P05, P11, P17, P21-24). The mean of the student-centered factors is 2.69 and the mean of the instructor-centered factors is 2.76. The results of the paired samples t-test found no statistically significant difference (α=0.05) between student-centered mean and instructor-centered mean (t=−1.00, df=64). However, when item 24 is excluded, the mean of the instructor-centered items becomes 2.99. A significant difference (α=0.05) was found between the student-centered mean and the new (excluding item 24) instructor-centered mean (t=−4.15, df=64).
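The paired-samples comparison above can be sketched as follows. The `paired_t` helper and the five response pairs are invented for illustration (they are not the study’s data); the sketch only shows how the t statistic and its degrees of freedom (n − 1) are obtained from each respondent’s two factor means.

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d holds the per-respondent differences; df = n - 1."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# hypothetical per-respondent factor means (illustrative data only)
student_centered = [2.5, 2.8, 2.6, 2.9, 2.7]
instructor_centered = [2.7, 2.9, 2.8, 3.0, 2.9]
t, df = paired_t(student_centered, instructor_centered)
```

The sign convention matches the study’s report: a negative t means the first (student-centered) mean is lower than the second.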
An alignment was identified between the results of the five-factor model analysis and the two-factor model analysis. Quantitative analysis results did not show a correlation between instructional practices and demographic factors such as academic rank or years of teaching. However, the results identified significant differences in using student-centered instructional practices according to the gender of the participant. Based on the data reported by participants, the mean of using student-centered instructional practices was 2.81 for male participants and 2.37 for female participants. A one-way ANOVA found a statistically significant difference (α=0.05) between the student-centered mean of male participants and that of female participants (F=7.64, p=0.008).
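The gender comparison uses a one-way ANOVA, whose F statistic is the between-group mean square divided by the within-group mean square. The sketch below uses invented group data, not the study’s; with exactly two groups, F equals the square of the corresponding independent-samples t statistic.

```python
from statistics import mean

def one_way_f(groups):
    """One-way ANOVA F statistic: between-group mean square
    over within-group mean square."""
    values = [v for g in groups for v in g]
    grand, k, n = mean(values), len(groups), len(values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical student-centered means for two groups (illustrative only)
group_a = [2.9, 2.7, 2.8, 2.8]
group_b = [2.4, 2.3, 2.4, 2.5]
F = one_way_f([group_a, group_b])
```

A large F relative to its critical value (here for 1 and n − 2 degrees of freedom) indicates that the group means differ more than within-group variability would explain.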
5.2 Qualitative data analysis and results
The qualitative analysis provides answers to the third research question. All interviews were transcribed before being coded and analyzed. The analysis used an integrated approach combining the guiding principles on SCL by Brook (1999), Rogers (2002) and Weimer (2002) with Kvale and Brinkmann’s (2009) meaning condensation method. The analysis identified emerging themes from instructors’ accounts of their opinions, experiences and reflections.
Instructors’ definitions and perceptions of their roles in SCL
Although all interviewed instructors believed they were using SCL strategies in their classrooms, they defined the term SCL in various ways. Three categories of definitions were identified; these are explained below. The interview data also revealed consistency between instructors’ definitions and their perceptions of their roles in an SCL environment:
Category 1: three instructors (one professor and two assistant professors, all male) believed lecturing to be the best way of teaching and learning. According to them, a good lecturer is keen to motivate and encourage students to be free thinkers. When students choose to enter a university, they should be sufficiently mature and willing to work hard enough to progress through their education. Therefore, the university “should be student-centric by definition” (Burhan). This view was supported by the following remark:
I believe that in our university every instructor is doing SCL in their own way […] but instead of standing there reading slides, I think it makes it more student-centered by providing an interesting lecture so that when they leave the room you will hear them say, “Wow, this is inspiring and interesting.”
All three of the instructors interviewed conceived of their role as to “inspire and attract students.” As Abdullah commented:
It is the responsibility of the instructors to find a way to bring in highly interesting lectures to make students interested […] to do that, we should prioritize research, so we have something really interesting to bring to the class.
Category 2: instructors in this category included one female associate professor, one male assistant professor, two male associate professors and one male professor. They believed that in an SCL environment, the instructor should provide activities for students to learn hands-on skills and relate theories to certain practices, and that students should acquire deep knowledge in the field by working together actively on classroom activities. As Ihab commented, “[I]t is so boring to just fill the class with me talking and lecturing. It is fun to plan some activities so students can work in a team so that they can practice the theories; students like these [activities].” In such an environment, the instructor should play the role of “providing” activities and “guiding” students to learn the required, relevant knowledge through these activities, as most of them suggested.
Category 3: this category included two female assistant professors, one female professor and one male assistant professor. They believed students should work in small groups, with no more than ten people per team, on certain targets, such as solving a problem. Students should be responsible for organizing study activities and should make decisions on their own to prepare for the requirements of their future professions. They should also be allowed to make mistakes and should receive help with reflecting on these mistakes in order to improve. As Faris commented:
I did not like my own student experiences, which were filled with lectures and lab work. I appreciated my past experience of working in a more student-centered learning environment, which offered me tools to provide what I think is a better learning environment now for my students.
These four instructors used a few different metaphors to describe their roles: “leaders” – “leading students to work towards their targets” (Sara and Iman), “observers” – “observing students from a distance and only interfering when they got off-track” (Faris), and “facilitators” – “having patience when students made mistakes” (Faris), “providing rich resources to students in need of help and redirecting students when they were in trouble” (Sara and Iman), “assisting students to be able to make their own decisions on learning goals, what to learn and how to learn it, and critically evaluate and reflect on their own learning” (Duaa).
The interview data did not reveal any patterns in teachers’ definitions and perceptions according to their academic ranking or gender. However, past experiences with SCL seemed to make a difference in their understanding and choice of strategies. For example, participants from category 1 mainly experienced lectures as the major source of learning and form of teaching in their past student and teaching experiences. Those from category 2 experienced different types of SCL environments due to their previous work experiences but not during their student experiences. Two participants from category 3 experienced SCL in the form of problem-based learning (PBL) in their past student experiences, and the other two participants had experiences with SCL both as learners and as instructors prior to their current jobs. A participant’s past experiences, particularly as a learner, seem to have a close link to their current instructional practice. As Sara remarked:
Having experienced problem-based learning in my college time, I truly believe it is the best way to learn. Working in a team offered us great opportunities to help and support each other. This means a lot, in particular for us females in Arabic culture. We never went out to talk with others before, and in such an environment we learned how to interact with others and how to behave professionally […] we increased our self-confidence and it was very empowering.
Although all three groups mentioned that students should take responsibility for their own learning, when asked to what level students should be involved in deciding what to learn and how to learn it, and even how to assess what they have learned, only one instructor (Ali) said it would be ideal to involve students in these decisions. However, he had neither experienced this himself nor had he observed any such practice in his immediate environment. Out of the 12 interviewees, 10 believed that instructors should decide which activities to provide, what materials to use and how to structure student activity time and form, and should also ensure students reach “the correct” answers.
While the data are too broad to draw any strong conclusions, most of the classroom activities that the interviewed instructors described involved students working in groups to complete an assignment designed by the instructor, or students answering the instructor’s questions in a one-to-one teacher–student format. The roles described by all the instructors involved offering direction and structure. As most instructors mentioned, given the time pressure to deliver all the required content for their courses, they had to ensure students progressed through the mandated learning checkpoints.
The interviewed instructors agreed that assessment plays an essential role in evaluating student learning. One instructor said an exam “is the best way to engage them to learn because they work so hard just before it” (Ibrahim). With the exception of one instructor, the respondents named multiple-choice questions plus short-answer questions as the major forms of assessment they used. However, their opinions diverged on what should be included in and what should be the focus of assessment. Examples provided by the instructors included: “To prepare [students] for their future profession, exams in universities should focus on lots of hands-on skills” (Alia); “More writing skills are needed for the exam” (Amin); and “Students need to be posed exams that can question their thinking skills” (Faris).
Two major reasons for the choice of assessment were provided. First, the assessment committee within a college or across colleges defined the assessments as exams for some undergraduate courses, particularly general courses. This limited the options for instructors to design exams different from the common exams used in these classes. Second, when instructors did have the freedom to design exams for their courses, it was most convenient to use assessment forms that can “examine the knowledge students have mastered” and are the “least time-consuming” for grading purposes, as 8 out of the 12 interview participants expressed. As one participant said, “It takes a few hours to grade multiple-choice question exams. With the busy schedule we have, you don’t want to spend several days to grade and provide feedback for a few hundred essays” (Ihab).
Two of the interviewed instructors (Faris and Duaa) expressed views on how formative assessment should be further enhanced in order to better facilitate SCL, but only one of them did so in daily practice. As Duaa commented:
Real SCL should involve students not only in deciding on what activities they take in the classroom, but also in defining assessment methods, but I can see the students are shocked when I invite them to give opinions on how they should be assessed […] it will take more time before more people understand that involving students in defining assessment is to motivate them to be more responsible instead of cheating.
Given this challenge, this instructor mainly relied in practice on asking students to identify and structure their own projects and problems.
The majority of the instructors believed that students are the most challenging factor in implementing ideal SCL in the given context. A major reason cited for this was Arabic culture. Of the 12 interviewed instructors, 11 believed that most students were raised in Arabic families deeply rooted in Middle Eastern culture, where family plays an important role in daily life, meaning that most teenagers do not have opportunities to live alone and make decisions independently. In addition, their high school experiences did not help them become independent learners: in that setting, they were accustomed to lectures, to completing assigned tasks without asking any “why” questions, and to exams mostly in the form of multiple-choice questions that test memorization. Students are familiar with being provided with information and instruction and having their time arranged for them, and they even prefer it that way. As one instructor said, “This is how the students grow up; they are used to it and they cannot take responsibilities on their own. They are not motivated to do things independently; no matter how hard the instructor works to push them, they are not really ready for a true SCL” (Alia).
Large classroom sizes were identified as another major challenge for implementing student–student activities because the students easily slip into a chaotic and “out of control” mode, according to some teachers. Interestingly, this was used as an argument for “offering a really interesting lecture as an effective approach to provide SCL,” as Abdullah commented.
Finally, the busy schedule of university faculty remains a limiting factor: “if we don’t have so much teaching load, we may have more time to do what could have been more student-centered strategies such as letting students identify problems and learning needs on their own” (Ali). Although teaching plays an important role in the appraisal system at QU, research products, such as publications, remain the major tool for evaluating faculty performance. Ibrahim mentioned: “when we apply for promotion, which is particularly crucial for assistant professors, all that is evaluated is the publication in one’s own field; as long as we can prove we are able to teach, it is not highly critical how we teach.”
Three participants expressed their desire for an institutionalized approach to changing the assessment system, allowing for more faculty autonomy to design assessment methods that are appropriate for their courses. Most of the suggestions for support referred to actions focusing on faculty and students. In total, 11 participants suggested more workshops and training sessions for faculty to gain the necessary skills to facilitate SCL. Five participants suggested student tutoring programs to help first-year undergraduate students learn personal responsibility and to “grow up by following suggestions from experienced students” (Faris). One participant even suggested that attention to students should be reduced for now because “We give too much attention to the students, nearly like spoon-feeding, worrying too much about whether they are happy or not in studying […] students should stand on their own feet, and sometimes they learn by being thrown into the deep sea” (Burhan).
In this section, we compare the qualitative data findings and the quantitative study results and discuss them in relation to the three dimensions of focus in SCL previously summarized in this paper: instructors’ perceptions and roles, student activity and interaction, and assessment. This is followed by a discussion of STEM instructors’ views on challenges to implementation.
6.1 STEM instructors’ understanding and perceptions of SCL
Improving the quality of teaching and learning in the STEM fields necessitates exploring the conceptions that faculty members hold regarding the learning environment and the context of teaching, since teaching approaches are strongly influenced by the underlying beliefs of the teacher (Kember, 1997). The participants in this study hold different beliefs about and attitudes toward SCL strategies. Connections can be identified between the participants’ understandings and perceptions of SCL and their prior experiences with it. Those who had experienced SCL as learners tended to make more of an effort to implement the strategies effectively in their own teaching practice. This finding echoes previous studies suggesting that, in order to maximize their capability of facilitating PBL, faculty should be provided with opportunities to experience PBL as learners (Kolmos et al., 2008).
Comparing results from the quantitative and qualitative data, this study identifies gaps between what the instructors consider to be SCL and what they actually practice. As suggested by Paris and Combs (2006), the broad and wide-ranging definitions of SCL legitimize the instructors’ actual practices. This gap can serve as an alert when a large-scale change initiative is being implemented in the given context. As Henderson et al. (2011) note, awareness and knowledge of SCL strategies cannot guarantee their actual practice.
6.2 Student activity and interaction
This study found that instructors have a general awareness of using student-centered strategies. Student activities are regarded as essential in instructional practices. Nevertheless, this study also shows that, in the given context, most classroom interactions take the form of student–content and student–teacher interactions, whereas student–student interactions remain limited. In practice, a generally low level of SCL implementation can be inferred, according to the PIPS instrument (Walter et al., 2016) and the definitions of SCL in previous studies (Brook, 1999; Rogers, 2002; Weimer, 2002). Student interaction with the content and instructor may be directly related to the conventional concept of instruction and may reflect a lecture-centered pedagogical approach. This finding is in line with a previous study reporting that instructors in Qatar tend to focus on content delivered through lectures as an efficient way of teaching (Al-Thani et al., 2016). Previous studies (Borrego et al., 2013; Henderson and Dancy, 2009; Walter et al., 2016) also report that levels of implementation of instructional practices vary; for example, STEM faculty reported limited use of certain strategies, such as group work and collaborative problem solving, in daily practice despite their high levels of knowledge and awareness. Instructors’ lack of a professional vision of collaborative group work can lead to a lack of practice (Modell and Modell, 2017). An often-reported reason is that instructors prioritize content delivery due to limited class time (Hora and Ferrare, 2014; Walter et al., 2016). Another explanation may be instructors’ lack of confidence in letting students take full responsibility for organizing their own learning activities outside of the instructor’s control (Du and Kirkebæk, 2012).
Student–student interaction received relatively less attention and consideration from the participants in this study. Previous studies have found that class length and class size are often the most important barriers to implementing student-centered instructional practices (Froyd et al., 2013). In the context of this study, this may be one of the factors limiting the use of student interaction in the classroom. In the undergraduate programs, classes last 1 hour and 15 minutes, which is counted as a two-study-hour class. This limits instructors’ confidence in their ability to deliver heavy curriculum content while also providing opportunities to engage students in interactive activities. Another possible reason is bias in the instructors’ knowledge of SCL strategies; some instructors believe it is sufficient to deliver SCL simply by asking students to do something different from listening to a lecture (Paris and Combs, 2006; Shu-Hui and Smith, 2008). Linking the instructors’ definitions of SCL to their perceived teaching roles, as described in the interviews, the instructors also lack the belief that interactive student activities can lead to actual learning. Participants consider it important that instructors maintain control of classroom activities. For example, Borrego et al. (2013) found a strong correlation between instructors’ beliefs regarding problem solving and the time students spent on collaborative activities, such as discussing problems.
6.3 Unaligned assessment
Although the participants demonstrated a general awareness of SCL and a willingness to implement certain SCL strategies, they reported limited critical reflection on the assessment systems in the given context. Their limited understanding and practice of formative assessment impedes the effective practice of SCL through the alignment of instruction and assessment. Instructional innovation demands changes not only in classroom practices but also, more importantly, in assessment methods. Williams et al. (2015) noted that formative assessment is a factor often ignored or forgotten, even by many of the researchers who have developed instruments to describe instructional practices. This study similarly found that the prevailing summative-oriented assessment methods at the university level remain unchallenged by the instructors. This may be due to their lack of knowledge and experience of formative assessment, or at least in part to the convenience of using what they are asked to use and what they are accustomed to. Changing teaching methods without constructive alignment with assessment methods will limit the effectiveness of any instructional innovation (Biggs and Tang, 2011).
6.4 Factors that make a difference
Previous studies (Dancy and Henderson, 2007, 2010; Froyd et al., 2013; Henderson and Dancy, 2009; Henderson et al., 2012) have reported that a faculty member’s use of student-centered strategies is often related to demographic factors such as gender, academic rank and years of teaching. This study identified a correlation between instructional practices and gender only. In contrast to previous findings that female instructors tend to use student-centered methods more often than male instructors and that younger instructors tend to show more interest in adopting new pedagogical initiatives, the quantitative data of this study showed that male participants reported higher levels of employing student-centered approaches than female participants, and revealed no patterns regarding academic rank or years of teaching. A major reason may be the small number of participants in this study. A possible reason for the gender difference may be the imbalanced gender ratio among the participants (the proportion of female participants was 23.4 percent). Nevertheless, the qualitative data did not identify any patterns by gender or academic rank, but rather identified a connection between instructors’ prior experience with SCL and their understanding, perceptions and practices, as previously discussed.
Two categories of instructor concerns and barriers to their sustainable use of instructional innovation were identified. Students’ lack of maturity, motivation and responsibility was considered the major challenge by most of the interviewed participants, except for those who had experienced SCL as students. Regarding students as the source of the problem and blaming students for their own poor performance can be seen as another symptom of a lecturer-centered approach.
Another major challenge is institutional constraints, such as insufficient classroom time. Instructors tend to have different opinions regarding the amount of time it takes to include interactive student–student activities. Large class sizes are often a barrier for instructors hoping to use interactive student–student activities. Previous studies have found that female and younger faculty members show higher rates of adopting and continuing innovative instruction.
As previous studies (Froyd et al., 2013) have suggested, when an instructional strategy is adopted at a low level, it is either not yet mature or may never achieve full adoption. Institutionalized faculty development and support are essential for the further implementation of innovative instructional strategies and for the persistence and continuation of that implementation; as Dancy and Henderson (2007) pointed out, institutional barriers can limit instructional innovations when structures have been set up to function well with traditional instruction. The following recommendations are provided as inspiration for institutional support and faculty development activities:
First, faculty members need to develop a deep understanding of SCL through experiences as learners so that they can become true believers and implementers.
Second, autonomy is needed for faculty to adopt appropriate assessment methods that are aligned with their pedagogical objectives and delivery methods. Input on how faculty can adapt instructional innovation to tailor it to the local context is very important for its long-term effectiveness (Hora and Ferrare, 2014).
Third, an inclusive approach to faculty evaluation by encouraging faculty from STEM backgrounds to be engaged in research on their instructional practice will not only sustain the practice of innovative pedagogy but will also enrich the research profiles of STEM faculty and their institutes.
This study examined university STEM instructors’ understanding and perceptions of SCL as well as their self-reported current practices. The results provide insights into how institutional strategies for instructional change are practiced. The study identified a lack of alignment between instructors’ perceptions and their actual practices of SCL. Despite agreeing that SCL is an effective teaching strategy, the instructors’ actual practices prioritize content delivery, the teacher’s role in classroom control, the definition of student learning activities, and summative assessment. Student–student interactions and formative assessment are limited. The participants tended to blame the limited use of SCL on students’ lack of motivation and readiness and on institutional constraints. Another explanation for this gap may be the diverse yet inclusive definitions of SCL espoused by faculty, which tend to legitimate their practices, reflecting a rather low level of implementation compared with the literature. This study also suggests that faculty’s understanding and perceptions of implementing student-centered approaches were closely linked to their prior experiences: experiencing SCL as a learner may better shape one’s understanding and guide one’s practice of SCL as an instructor. Finally, recommendations are provided for faculty development activities at an institutional level for sustainable instructional innovation.
The study has a few limitations. First, regarding methodological justification, the data collection methods chosen in this study relied mainly on the faculty’s self-reporting. Although such methods are frequently employed for studying faculty beliefs, perceptions and instructional practices (Borrego et al., 2013), other data sources, such as observation, may offer new perspectives for instructional development (Henry et al., 2007). Second, the limited number of participants restricts this study’s generalizability: the survey was administered in a volunteer-based manner, and the limited number of interview participants makes it difficult to establish clear patterns. Third, researching faculty members raises concerns in the given context, wherein extensive faculty assessments are regularly conducted. Although special ethical considerations were taken in this study – for example, participants were provided with a clear explanation of the goals and consequences of the study and were assured that it had no relation to the university’s annual faculty performance assessment – the potential sensitivity may have caused a certain amount of reservation among participants about sharing further information, which may have limited the results of the study.
In conclusion, the results reported in this paper provide a first impression of present instructional practices in the STEM field in the context of Qatar. The findings, although limited to the given context, may have implications for other countries in the Gulf region and Arabic-speaking contexts, and potentially for even broader contexts, since instructional change toward SCL in STEM classrooms remains a general challenge worldwide (Hora and Ferrare, 2014; Froyd et al., 2013). The results imply that more attention should be given to faculty development programs to enhance instructor awareness, knowledge and skills related to student–student interaction and formative assessment. This study contributes to further implementation of instructional change by introducing a roadmap toward change at broader levels, such as strategies of institutional change for instructional innovation, as well as toward the establishment of a research-based and evidence-based approach to faculty development and institutional change.
Interview participants’ background information
| Name | Gender | Academic rank | Previous pedagogical experiences |
| --- | --- | --- | --- |
| Abdullah | Male | Assistant Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based learning environment |
| Mohammad | Male | Assistant Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based learning environment |
| Burhan | Male | Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based learning environment in 4 countries |
| Amin | Male | Associate Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based learning environment in 2 countries |
| Ali | Male | Associate Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based and active-learning environments |
| Ibrahim | Male | Assistant Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based and inquiry-based learning environments |
| Ihab | Male | Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based and project-based learning environments |
| Alia | Female | Associate Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based and project-based learning environments |
| Faris | Male | Assistant Professor | Student experiences in lecture-based learning environment; teaching experiences in lecture-based and problem-based learning environments |
| Sara | Female | Assistant Professor | Student experiences in lecture-based and problem-based learning environments; teaching experiences in problem-based learning environment |
| Iman | Female | Assistant Professor | Student experiences in lecture-based and problem-based learning environments; teaching experiences in problem-based learning environment |
| Duaa | Female | Professor | Student experiences in lecture-based and problem-based learning environments; teaching experiences in lecture-based and problem-based learning environments in 3 countries |

Note: All names are pseudonyms
The descriptive statistics for participants’ responses to the PIPS survey – five-factor model analysis

| Item | Mean | SD |
| --- | --- | --- |
| Factor 1: student–student interaction | | |
| P10. I structure class so that students explore or discuss their understanding of new concepts before formal instruction | 2.51 | 1.03 |
| P12. I structure class so that students regularly talk with one another about course concepts | 2.77 | 1.02 |
| P13. I structure class so that students constructively criticize one another’s ideas | 2.06 | 1.07 |
| P14. I structure class so that students discuss the difficulties they have with this subject with other students | 2.06 | 1.11 |
| Grand mean of factor 1 | 2.18 | 0.82 |
| Factor 2: content delivery practices | | |
| P01. I guide students through major topics as they listen and take notes | 2.89 | 0.99 |
| P03. My syllabus contains the specific topics that will be covered in every class session | 3.58 | 0.77 |
| P05. I structure my course with the assumption that most of the students have little useful knowledge of the topics | 2.89 | 0.94 |
| P11. My class sessions are structured to give students a good set of notes | 3.18 | 0.83 |
| Grand mean of factor 2 | 3.14 | 0.55 |
| Factor 3: formative assessment | | |
| P06. I use student assessment results to guide the direction of my instruction during the semester | 2.98 | 0.86 |
| P08. I use student questions and comments to determine the focus and direction of classroom discussion | 2.95 | 0.86 |
| P18. I give students frequent assignments worth a small portion of their grade | 2.70 | 1.22 |
| P20. I provide feedback on student assignments without assigning a formal grade | 1.82 | 1.33 |
| Grand mean of factor 3 | 2.62 | 0.69 |
| Factor 4: student–content engagement | | |
| P02. I design activities that connect course content to my students’ lives and future work | 3.11 | 0.95 |
| P07. I frequently ask students to respond to questions during class time | 3.49 | 0.77 |
| P09. I have students use a variety of means (models, drawings, graphs, symbols, simulations, etc.) to represent phenomena | 2.92 | 1.04 |
| P16. I structure problems so that students consider multiple approaches to finding a solution | 2.94 | 0.88 |
| P17. I provide time for students to reflect about the processes they use to solve problems | 2.88 | 0.87 |
| Grand mean of factor 4 | 3.07 | 0.53 |
| Factor 5: summative assessment | | |
| P21. My test questions focus on important facts and definitions from the course | 3.03 | 1.00 |
| P22. My test questions require students to apply course concepts to unfamiliar situations | 2.58 | 1.16 |
| P23. My test questions contain well-defined problems with one correct solution | 2.91 | 1.13 |
| P24. I adjust student scores (e.g. curve) when necessary to reflect a proper distribution of grades | 0.89 | 1.19 |
| The new grand mean of factor 5, excluding P24 | 2.84 | 0.80 |
How do you understand/define SCL? What are important characteristics of SCL in your opinion?
What are your past experiences of using SCL?
How do you see the role of instructor in an SCL environment, and in which ways is this role descriptive of your current practice?
What are your preferred assessment methods within your current teaching practices and why?
What should be the ideal assessment methods in an SCL environment?
What are the challenges of practicing SCL in your current environment?
In your opinion, what institutional supports are needed to implement SCL in Qatar?
Al-Thani, A.M., Al-Meghaissib, L.A.A.A. and Nosair, M.R.A.A. (2016), “Faculty members’ views of effective teaching: a case study of Qatar University”, European Journal of Education Studies, Vol. 2 No. 8, pp. 109-139.
American Association for the Advancement of Science (AAAS) (2013), “Describing and measuring STEM teaching practices: a report from a national meeting on the measurement of undergraduate science, technology, engineering, and mathematics (STEM) teaching”, American Association for the Advancement of Science, Washington, DC, available at: http://ccliconference.org/files/2013/11/Measuring-STEM-Teaching-Practices.pdf (accessed November 15, 2006).
Anderson, R.D. (2002), “Reforming science teaching: what research says about inquiry”, Journal of Science Teacher Education, Vol. 13 No. 1, pp. 1-12.
Attard, A., Di Loio, E., Geven, K. and Santa, R. (2010), Student Centered Learning: An Insight into Theory and Practice, Partos Timisoara, Bucharest.
Barr, R.B. and Tagg, J. (1995), “From teaching to learning: a new paradigm for undergraduate education”, Change: The Magazine of Higher Learning, Vol. 27 No. 6, pp. 12-26.
Biggs, J.B. and Tang, C. (2011), Teaching for Quality Learning at University: What the Student Does, McGraw-Hill Education, Berkshire.
Bilgin, I., Karakuyu, Y. and Ay, Y. (2015), “The effects of project-based learning on undergraduate students’ achievement and self-efficacy beliefs towards science teaching”, Eurasia Journal of Mathematics, Science & Technology Education, Vol. 11 No. 3, pp. 469-477.
Black, P. and Wiliam, D. (1998), “Assessment and classroom learning”, Assessment in Education: Principles, Policy & Practice, Vol. 5 No. 1, pp. 7-74.
Borrego, M., Froyd, J.E., Henderson, C., Cutler, S. and Prince, M. (2013), “Influence of engineering instructors’ teaching and learning beliefs on pedagogies in engineering science courses”, International Journal of Engineering Education, Vol. 29 No. 6, pp. 1456-1471.
Brawner, C.E., Felder, R.M., Allen, R. and Brent, R. (2002), “A survey of faculty teaching practices and involvement in faculty development activities”, Journal of Engineering Education, Vol. 91 No. 4, p. 393.
Brooks, J.G. and Brooks, M.G. (1999), In Search of Understanding: The Case for Constructivist Classrooms, Association for Supervision & Curriculum Development, Alexandria.
Cornelius-White, J. (2007), “Learner-centered teacher-student relationships are effective: a meta-analysis”, Review of Educational Research, Vol. 77 No. 1, pp. 113-143.
Creswell, J.W. (2002), Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, Pearson Education, Upper Saddle River, NJ.
Creswell, J.W. (2013), Qualitative Inquiry and Research Design: Choosing among Five Approaches, Sage, Thousand Oaks, CA.
Curtis, R. and Ventura-Medina, E. (2008), An Enquiry-Based Chemical Engineering Design Project for First-Year Students, University of Manchester, Centre for Excellence in Enquiry-Based Learning, Manchester.
Dagher, Z. and BouJaoude, S. (2011), “Science education in Arab states: bright future or status quo?”, Studies in Science Education, Vol. 47, pp. 73-101.
Dancy, M. and Henderson, C. (2007), “Framework for articulating instructional practices and conceptions”, Physical Review Special Topics: Physics Education Research, Vol. 3 No. 1, pp. 1-12.
Dancy, M. and Henderson, C. (2010), “Pedagogical practices and instructional change of physics faculty”, American Journal of Physics, Vol. 78 No. 10, pp. 1056-1063.
Dewey, J. (1938), Experience and Education, Collier and Kappa Delta Pi, New York, NY.
Downey, G.L., Lucena, J.C., Moskal, B.M., Parkhurst, R., Bigley, T., Hays, C. and Lehr, J.L. (2006), “The globally competent engineer: working effectively with people who define problems differently”, Journal of Engineering Education, Vol. 95 No. 2, pp. 107-122.
Du, X.Y. and Kirkebæk, M.J. (2012), “Contextualizing task-based PBL”, Exploring Task-Based PBL in Chinese Teaching and Learning, pp. 172-185.
Du, X.Y., Su, L. and Liu, J. (2013), “Developing sustainability curricula using the PBL method in a Chinese context”, Journal of Cleaner Production, Vol. 61 No. 15, pp. 80-88.
Duran, M. and Dökme, İ. (2016), “The effect of the inquiry-based learning approach on students’ critical-thinking skills”, Eurasia Journal of Mathematics, Science & Technology Education, Vol. 12 No. 12.
Ejiwale, J.A. (2012), “Facilitating teaching and learning across STEM fields”, Journal of STEM Education: Innovations and Research, Vol. 13 No. 3, pp. 87-94.
Felder, R.M., Woods, D.R., Stice, J.E. and Rugarcia, A. (2000), “The future of engineering education II. Teaching methods that work”, Chemical Engineering Education, Vol. 34 No. 1, pp. 26-39.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H. and Wenderoth, M.P. (2014), “Active learning increases student performance in science, engineering, and mathematics”, Proceedings of the National Academy of Sciences, Vol. 111 No. 23, pp. 8410-8415.
Froyd, J., Borrego, M., Cutler, S., Henderson, C. and Prince, M. (2013), “Estimates of use of research-based instructional strategies in core electrical or computer engineering courses”, IEEE Transactions on Education, Vol. 56 No. 4, pp. 393-399.
General Secretariat for Development Planning (2008), Qatar National Vision 2030, General Secretariat for Development Planning, Doha, available at: http://qatarus.com/documents/qatar-national-vision-2030/ (accessed November 15, 2016).
Graham, M.J., Frederick, J., Byars-Winston, A., Hunter, A.B. and Handelsman, J. (2013), “Increasing persistence of college students in STEM”, Science, Vol. 341 No. 6153, pp. 1455-1456.
He, Y., Du, X., Toft, E., Zhang, X., Qu, B., Shi, J. and Zhang, H. (2017), “A comparison between the effectiveness of PBL and LBL on improving problem-solving abilities of medical students using questioning”, Innovations in Education and Teaching International, Vol. 55 No. 1, pp. 44-54, doi: 10.1080/14703297.2017.1290539.
Henderson, C. and Dancy, M. (2009), “The impact of physics education research on the teaching of introductory quantitative physics in the United States”, Physical Review Special Topics: Physics Education Research, Vol. 5 No. 2, pp. 1-15.
Henderson, C., Beach, A. and Finkelstein, N. (2011), “Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature”, Journal of Research in Science Teaching, Vol. 48 No. 8, pp. 952-984.
Henderson, C., Dancy, M. and Niewiadomska-Bugaj, M. (2012), “The use of research-based instructional strategies in introductory physics: where do faculty leave the innovation-decision process?”, Physical Review Special Topics – Physics Education Research, Vol. 8 No. 2, pp. 1-9.
Henderson, C., Finkelstein, N. and Beach, A. (2010), “Beyond dissemination in college science teaching: an introduction to four core change strategies”, Journal of College Science Teaching, Vol. 39 No. 5, pp. 18-25.
Henry, M.A., Murray, K.S. and Phillips, K.A. (2007), Meeting the Challenge of STEM Classroom Observation in Evaluating Teacher Development Projects: A Comparison of Two Widely Used Instruments, Henry Consulting, St Louis, MO.
Hora, M.T. and Ferrare, J.J. (2014), “Remeasuring postsecondary teaching: how singular categories of instruction obscure the multiple dimensions of classroom practice”, Journal of College Science Teaching, Vol. 43 No. 3, pp. 36-41.
Hora, M.T., Oleson, A. and Ferrare, J.J. (2012), Teaching Dimensions Observation Protocol (TDOP) User’s Manual, Wisconsin Center for Education Research, University of Wisconsin–Madison, Madison, WI.
Justice, C., Rice, J., Roy, D., Hudspith, B. and Jenkins, H. (2009), “Inquiry-based learning in higher education: administrators’ perspectives on integrating inquiry pedagogy into the curriculum”, Higher Education, Vol. 58 No. 6, pp. 841-855.
Kember, D. (1997), “A reconceptualisation of the research into university academics’ conceptions of teaching”, Learning and Instruction, Vol. 7 No. 3, pp. 255-275.
Ketpichainarong, W., Panijpan, B. and Ruenwongsa, P. (2010), “Enhanced learning of biotechnology students by an inquiry-based cellulose laboratory”, International Journal of Environmental & Science Education, Vol. 5 No. 2, pp. 169-187.
Kolmos, A., Du, X.Y., Dahms, M. and Qvist, P. (2008), “Staff development for change to problem-based learning”, International Journal of Engineering Education, Vol. 24 No. 4, pp. 772-782.
Kvale, S. and Brinkmann, S. (2009), Interviews: Learning the Craft of Qualitative Research, SAGE, Thousand Oaks, CA.
Lehmann, M., Christensen, P., Du, X. and Thrane, M. (2008), “Problem-oriented and project-based learning (POPBL) as an innovative learning strategy for sustainable development in engineering education”, European Journal of Engineering Education, Vol. 33 No. 3, pp. 283-295.
Martin, T., Rivale, S.D. and Diller, K.R. (2007), “Comparison of student learning in challenge-based and traditional instruction in biomedical engineering”, Annals of Biomedical Engineering, Vol. 35 No. 8, pp. 1312-1323.
Modell, M.G. (2017), “Instructors’ professional vision for collaborative learning groups”, Journal of Applied Research in Higher Education, Vol. 9 No. 3, pp. 346-362.
NEA (2010), “Preparing 21st Century students for a global society: an educator’s guide to ‘the four Cs’”, National Education Association, Washington, DC, available at: www.nea.org/tools/52217 (accessed December 20, 2017).
Nicol, D.J. and Macfarlane-Dick, D. (2006), “Formative assessment and self-regulated learning: a model and seven principles of good feedback practice”, Studies in Higher Education, Vol. 31 No. 2, pp. 199-218.
Paris, C. and Combs, B. (2006), “Lived meanings: what teachers mean when they say they are learner-centered”, Teachers & Teaching: Theory and Practice, Vol. 12 No. 5, pp. 571-592.
Piburn, M., Sawada, D., Falconer, K., Turley, J., Benford, R. and Bloom, I. (2000), Reformed Teaching Observation Protocol (RTOP), Arizona Collaborative for Excellence in the Preparation of Teachers, Tempe.
Prince, M.J. and Felder, R.M. (2006), “Inductive teaching and learning methods: definitions, comparisons, and research bases”, Journal of Engineering Education, Vol. 95 No. 2, pp. 123-138.
Qatar University (QU) (2012), “Qatar university strategic plan 2013–2016”, available at: www.qu.edu.qa/static_file/qu/About/documents/qu-strategic-plan-2013-2016-en.pdf (accessed June 10, 2017).
Rogers, A. (2002), Teaching Adults, 3rd ed., Open University Press, Philadelphia, PA.
Rubin, A. (2012), “Higher education reform in the Arab world: the model of Qatar”, available at: www.mei.edu/content/higher-education-reform-arab-world-model-qatar (accessed December 15, 2016).
Scott, L.C. (2015), “The futures of learning 2: what kind of learning for the 21st century?”, UNESCO Educational Research and Foresight Working Papers, available at: http://unesdoc.unesco.org/images/0024/002429/242996E.pdf (accessed December 22, 2017).
Seymour, E. and Hewitt, N.M. (1997), Talking About Leaving: Why Undergraduates Leave the Sciences, Westview, Boulder, CO.
Chang, S.H.H. and Smith, R.A. (2008), “Effectiveness of interaction in a learner-centered paradigm distance education class based on student satisfaction”, Journal of Research on Technology in Education, Vol. 40 No. 4, pp. 407-426.
Simsek, P. and Kabapinar, F. (2010), “The effects of inquiry-based learning on elementary students’ conceptual understanding of matter, scientific process skills and science attitudes”, Procedia-Social and Behavioral Sciences, Vol. 2 No. 2, pp. 1190-1194.
Slavich, G.M. and Zimbardo, P.G. (2012), “Transformational teaching: theoretical underpinnings, basic principles, and core methods”, Educational Psychology Review, Vol. 24 No. 4, pp. 569-608.
Smith, K.A., Douglas, T.C. and Cox, M. (2009), “Supportive teaching and learning strategies in STEM education”, in Baldwin, R. (Ed.), Improving the Climate for Undergraduate Teaching in STEM Fields. New Directions for Teaching and Learning, Vol. 117, Jossey-Bass, San Francisco, CA, pp. 19-32.
Smith, M.K., Vinson, E.L., Smith, J.A., Lewin, J.D. and Stetzer, K.R. (2014), “A campus-wide study of STEM courses: new perspectives on teaching practices and perceptions”, CBE Life Sciences Education, Vol. 13, pp. 624-635.
Springer, L., Stanne, M.E. and Donovan, S.S. (1999), “Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis”, Review of Educational Research, Vol. 69 No. 1, pp. 21-51.
Steinemann, A. (2003), “Implementing sustainable development through problem-based learning: pedagogy and practice”, Journal of Professional Issues in Engineering Education and Practice, Vol. 129 No. 4, pp. 216-224.
Walczyk, J.J. and Ramsey, L.L. (2003), “Use of learner-centered instruction in college science and mathematics classrooms”, Journal of Research in Science Teaching, Vol. 40 No. 6, pp. 566-584.
Walter, E.M., Beach, A.L., Henderson, C. and Williams, C.T. (2015), “Measuring postsecondary teaching practices and departmental climate: the development of two new surveys”, in Weaver, G.C., Burgess, W.D., Childress, A.L. and Slakey, L. (Eds), Transforming Institutions: Undergraduate STEM Education in the 21st Century, Purdue University Press, West Lafayette, IN, pp. 411-428.
Walter, E.M., Henderson, C.R., Beach, A.L. and Williams, C.T. (2016), “Introducing the Postsecondary Instructional Practices Survey (PIPS): a concise, interdisciplinary, and easy-to-score survey”, CBE – Life Sciences Education, Vol. 15 No. 4, pp. 1-11.
Watkins, J. and Mazur, E. (2013), “Retaining students in science, technology, engineering, and mathematics (STEM) majors”, Journal of College Science Teaching, Vol. 42 No. 5, pp. 36-41.
Weimer, M. (2002), Learner-Centered Teaching: Five Key Changes to Practice, Jossey-Bass, San Francisco, CA.
Williams, C.T., Walter, E.M., Henderson, C. and Beach, A.L. (2015), “Describing undergraduate STEM teaching practices: a comparison of instructor self-report instruments”, International Journal of STEM Education, Vol. 2 No. 18, pp. 1-14, doi: 10.1186/s40594-015-0031-y.
Zhao, K., Zhang, J. and Du, X. (2017), “Chinese business students’ changes in beliefs and strategy use in a constructively aligned PBL course”, Teaching in Higher Education, Vol. 22 No. 7, pp. 785-804, doi: 10.1080/13562517.2017.1301908.
The authors would like to thank the participants in this study, their colleagues, for their support.