Does students’ performance in the formative CLIPP examination predict their scores in the NBMEPSE?

Lolowa Almekhaini (Department of Pediatrics, UAE University, Al Ain, United Arab Emirates)
Ahmad R. Alsuwaidi (Department of Pediatrics, UAE University, Al Ain, United Arab Emirates)
Khaula Khalfan Alkaabi (Department of Pediatrics, UAE University, Al Ain, United Arab Emirates)
Sania Al Hamad (Department of Pediatrics, UAE University, Al Ain, United Arab Emirates)
Hassib Narchi (Department of Pediatrics, UAE University, Al Ain, United Arab Emirates)

Arab Gulf Journal of Scientific Research

ISSN: 1985-9899

Article publication date: 11 April 2023

Issue publication date: 26 March 2024


Abstract

Purpose

The Computer-Assisted Learning in Pediatrics Program (CLIPP) and the National Board of Medical Examiners Pediatric Subject Examination (NBMEPSE) are used to assess students' performance during the pediatric clerkship. The International Foundations of Medicine (IFOM) assessment is organized by the NBME and taken before graduation. This study explores the ability of the CLIPP assessment to predict students' performance in the NBMEPSE and IFOM examinations.

Design/methodology/approach

This cross-sectional study assessed the correlations among students' CLIPP, NBMEPSE and IFOM scores. Students' perceptions of the NBMEPSE and CLIPP were collected through a self-administered survey.

Findings

Among the 381 students enrolled, CLIPP, NBME and IFOM scores did not differ significantly between genders. The correlation between CLIPP and NBMEPSE scores was positive in both the junior (r = 0.72) and senior (r = 0.46) clerkships, with a statistically significant relationship between them in a univariate model. Similarly, there was a statistically significant relationship between CLIPP and IFOM scores. In a multiple linear regression model adjusted for gender, CLIPP scores remained significantly associated with NBME and IFOM scores, and male gender was a significant predictor in this model. The survey results reflected students' satisfaction with both the NBMEPSE and CLIPP examinations.

Originality/value

Although students did not perceive a positive relationship between their performance in the CLIPP and NBMEPSE examinations, this study demonstrates the predictive value of formative CLIPP examination scores for future performance in both the summative NBMEPSE and the IFOM. Students who perform poorly in the CLIPP are therefore likely to benefit from feedback and remediation in preparation for these summative assessments.


Citation

Almekhaini, L., Alsuwaidi, A.R., Alkaabi, K.K., Al Hamad, S. and Narchi, H. (2024), "Does students’ performance in the formative CLIPP examination predict their scores in the NBMEPSE?", Arab Gulf Journal of Scientific Research, Vol. 42 No. 2, pp. 359-369. https://doi.org/10.1108/AGJSR-11-2022-0255

Publisher: Emerald Publishing Limited

Copyright © 2023, Lolowa Almekhaini, Ahmad R. Alsuwaidi, Khaula Khalfan Alkaabi, Sania Al Hamad and Hassib Narchi

License

Published in Arab Gulf Journal of Scientific Research. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

The College of Medicine and Health Sciences (CMHS) at the United Arab Emirates University (UAEU), a public federal university, offers a 6-year training program consisting of a 2-year premedical program and a 4-year MD program, the latter divided into two 2-year phases of preclinical and clinical training organized around the different specialties. An 8-week pediatric clerkship is offered in year five (junior clerkship), and a further 4-week clerkship in the final year (senior clerkship). Each clerkship follows a structured teaching program that includes bedside teaching in affiliated teaching hospitals. At the end of each clerkship, students undergo an assessment that includes mini-clinical assessments (MiniCLEx) and the National Board of Medical Examiners Pediatrics Subject Examination (NBMEPSE).

The National Board of Medical Examiners (NBME) is an independent, nonprofit, web-based American organization established to develop countrywide examination standards that allow medical accrediting authorities to evaluate candidates for licensure. The NBME subject examinations are used internationally to assess and train medical students (Wright & Baston, 2017). The NBME offers standardized, objective examinations in various disciplines, and these are used throughout all phases of medical studies, particularly in the final assessments after clerkships (Johnson, Khalil, Peppler, Davey, & Kibble, 2014; National Board of Medical Examiners, 2016). Various studies have identified the NBME subject examination as the best intervention to evaluate low-performing students before graduation (Johnson et al., 2014; Guiot & Franqui-Rivera, 2018). It also serves as a global resource and a model for assessment approaches and appraisals in medicine, with a wide array of assessments well suited to medical schools (Guiot & Franqui-Rivera, 2018). These assessments allow medical schools to appraise their students' achievement and compare their performance with that of a sizable, global and representative group of medical students at the same training stage (Bakoush, Al Dhanhani, Alshamsi, Grant, & Norcini, 2019). They assess both basic and clinical science knowledge (National Board of Medical Examiners, 2016) and comprise 100 multiple-choice questions (MCQs), each constructed as a clinical vignette with one best answer and focusing on the application and integration of knowledge (National Board of Medical Examiners, 2019, 2020). Each examination evaluates students' comprehension of normal development (5%–10% of the questions) as well as of a majority of cardiovascular, neurological, renal and other diseases, with each domain accounting for 15% of the questions. These examinations also assess the important tasks of physicians, including health maintenance and promotion (5%–10% of questions), comprehension of disease mechanisms (25%–30% of questions), establishment of a diagnosis (40%–45% of questions) and the application of management principles (10%–15% of questions). The content areas are classified as foundation, diagnosis, management and prevention. Scores are compared across examinations, schools and examinee performance records, and individual performance profiles of percentage-correct scores are reported. This web-based examination is administered in a secure setting at our college. In the final clerkship assessment, the NBMEPSE score accounts for 30% of the total mark (Narchi, 2013, 2014). In this study, NBME 1 refers to the examination offered at the end of the junior clerkship (Year 1), while NBME 2 denotes the assessment administered at the end of the senior clerkship (Year 2). Students are provided with a score interpretation guide and a performance profile to facilitate their self-assessment (National Board of Medical Examiners, 2019, 2020).

In 2017, the Computer-Assisted Learning in Pediatrics Program (CLIPP) was added to both the junior and senior pediatric clerkships as a voluntary midterm formative assessment. It is a validated online assessment of 100 MCQs created by the Council on Medical Student Education in Pediatrics (COMSEP), an independent organization devoted to pediatric medical student education (Schifferdecker, Berman, Fall, & Fischer, 2012; Fall et al., 2005; Sox et al., 2018; COMSEP, 2020). The examination is delivered via a secure browser; it complements the assessment of students' performance during their clerkship, enhances their self-directed learning and helps develop appropriate remediation when needed (Aquifer pediatrics, 2020). It also helps students prepare for the summative NBMEPSE at the end of their clerkship training. Students receive their CLIPP reports, with their respective scores, well before the NBMEPSE. The CLIPP reports also help ascertain the effectiveness of the curriculum and of the teaching techniques employed by instructors. In this study, CLIPP 1 refers to the examination administered during the junior clerkship (Year 1) and CLIPP 2 to the one administered during the senior clerkship (Year 2).

At the end of their final year of study, just before graduation, students sit the International Foundations of Medicine (IFOM) examination, which, unlike the NBME subject examination and the CLIPP, covers all aspects of clinical medicine. The IFOM examination comprises 160 MCQs on clinical science in the following areas: pediatrics, medicine, family medicine, surgery, psychiatry, and obstetrics and gynecology (IFOM, 2019a, b). It also allows the comparison of CMHS-UAEU students' performance prior to graduation with that of final-year medical students around the world. The CMHS-UAEU therefore uses the IFOM Clinical Science Examination as an essential part of assessing students prior to graduation. In this study, IFOM 2 refers to the examination administered in the final year prior to graduation.

The objectives of this study were (1) to evaluate, in each clerkship, the predictive value of the formative CLIPP assessment for students' performance in the summative NBMEPSE as well as in the final IFOM assessment and (2) to survey the students' satisfaction with, and perceptions of, the benefits of both evaluation tools. The null hypothesis was that the formative CLIPP scores do not predict the NBMEPSE or IFOM scores. If that hypothesis were rejected for both independent summative assessments (NBMEPSE and IFOM) in both clerkships, the ability of the formative CLIPP assessment to predict these scores would be firmly established. This would suggest that identifying students performing poorly in the CLIPP assessment could pave the way for remediation measures that improve their future performance in both the NBMEPSE and the IFOM.

Methods

This study, undertaken at the CMHS-UAEU, is a retrospective observational review of the marks obtained between October 2017 and June 2020 by all junior and senior students, as well as recent graduates, in the voluntary formative CLIPP and the mandatory summative NBMEPSE administered at the end of each of the two pediatric clerkships.

In addition, at the end of each NBMEPSE, students were asked to complete a 5-min online, self-administered questionnaire about their perceptions of the value of the CLIPP examination and any correlation they thought it had with the NBMEPSE (the complete survey is available as Supplemental Digital Appendix 1). The survey was anonymous and had no impact on the students' progression through their medical studies. By completing the online survey, the students confirmed their voluntary consent to participate in this study. Participants' confidentiality was maintained, and no identifying data were included in the analysis. Participants were informed that they had the right to withdraw at any stage of the study without penalty.

Statistical analysis

After testing for normality (Shapiro-Wilk test), the examination scores were expressed as mean and standard deviation (SD). Group mean scores were compared using the unpaired t-test, and the correlation between the students' CLIPP and NBMEPSE scores was tested using the Pearson product-moment correlation. The relationship between the CLIPP and NBMEPSE scores in each clerkship, and between the CLIPP and IFOM scores, was modeled with both a simple (unadjusted) linear regression and a multivariable linear regression model adjusted for gender. The survey results were expressed as the percentages of students selecting each option on a 5-point Likert scale. All statistical tests were performed with Stata version 15 (StataCorp, Texas, USA), and a two-sided p value < 0.05 defined statistical significance.
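For illustration only, the minimal sketch below shows how such an analysis could be run in Python with scipy and statsmodels; it is not the authors' code, and the DataFrame layout and column names (clipp, nbme, gender) are assumptions made for the example.

```python
# A minimal sketch (not the authors' code) of the analysis workflow described above.
# The DataFrame layout and the column names (clipp, nbme, gender) are hypothetical.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def analyse(df: pd.DataFrame) -> None:
    """Assumes one row per student with numeric 'clipp' and 'nbme' scores and a 'gender' column ('M'/'F')."""
    # Shapiro-Wilk test of normality for each score distribution
    for col in ("clipp", "nbme"):
        w, p = stats.shapiro(df[col].dropna())
        print(f"{col}: Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")

    # Unpaired t-test comparing mean NBME scores between genders
    males = df.loc[df["gender"] == "M", "nbme"].dropna()
    females = df.loc[df["gender"] == "F", "nbme"].dropna()
    t, p = stats.ttest_ind(males, females)
    print(f"NBME by gender: t = {t:.2f}, p = {p:.3f}")

    # Pearson product-moment correlation between CLIPP and NBME scores
    paired = df[["clipp", "nbme"]].dropna()
    r, p = stats.pearsonr(paired["clipp"], paired["nbme"])
    print(f"CLIPP vs NBME: r = {r:.2f}, p = {p:.3f}")

    # Simple (unadjusted) and gender-adjusted linear regression of NBME on CLIPP
    unadjusted = smf.ols("nbme ~ clipp", data=df).fit()
    adjusted = smf.ols("nbme ~ clipp + C(gender)", data=df).fit()
    print(unadjusted.params, adjusted.params, sep="\n")
```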

Results

Participants

A total of 381 students (299 females, 78.5%) were enrolled during the study period. Of these, 286 students had completed both the junior and senior clerkships. In the junior clerkship, the CLIPP 1 assessment was taken by 200 students and the NBME 1 by 353 students, with 200 taking both examinations. In the senior clerkship, the CLIPP 2 and NBME 2 assessments were taken by 223 and 286 students, respectively, with 223 taking both examinations. The IFOM 2 examination was taken by 167 senior students (43.8%). In total, only 128 junior and senior students completed the online survey.

Assessment scores

There were no significant differences in the CLIPP, NBME and IFOM scores based on gender (Table 1). Similarly, there were no significant differences in the NBME and IFOM scores between the students who voluntarily took the CLIPP assessment and those who did not (Table 2).

Correlation between the CLIPP and NBME scores

While the correlation between the two scores was strong (r = 0.72) in both genders in the junior clerkship, it was more modest (r = 0.46) in the senior clerkship, where it was weaker in females (r = 0.44) but remained strong in males (r = 0.73) (Table 3).

Modeling of the relationship between the CLIPP and NBME scores

In both clerkships, there was a statistically significant linear relationship between the two scores (Table 4). In the junior clerkship (Year 1), each 1-mark increase in the CLIPP score was associated with a 0.82-mark increase in the NBME score, regardless of gender (Table 4). In the senior clerkship (Year 2), each 1-mark increase in the CLIPP score was associated with a 0.54-mark increase in the NBME score, with an additional effect of gender: the NBME scores of male students were expected to be 8.78 marks higher than those of female students with the same CLIPP score (Table 4).

Modeling of the relationship between the CLIPP and IFOM scores

In the unadjusted simple linear regression model, CLIPP 2 scores were significantly associated with the final IFOM 2 scores. In the adjusted linear regression model, both the CLIPP 2 score and gender remained significantly associated with the IFOM 2 score. In the senior clerkship, each extra mark in the CLIPP 2 score was associated with a 3.82-mark increase in the IFOM 2 score, with an additional 43.15 marks in male students compared with female students (Table 4).
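As a concrete illustration of how the fitted models reported in Table 4 translate into predicted scores, the short sketch below applies the gender-adjusted senior-clerkship coefficients to a hypothetical CLIPP 2 score; the example student is invented, and the snippet is not part of the authors' analysis.

```python
# Worked illustration of the gender-adjusted prediction equations reported in Table 4.
# The CLIPP 2 score used below is hypothetical.

def predict_senior_scores(clipp2: float, male: bool) -> tuple:
    """Apply the adjusted senior-clerkship models: NBME 2 and IFOM 2 predicted from a CLIPP 2 score."""
    nbme2 = 0.54 * clipp2 + 30.48 + (8.78 if male else 0.0)
    ifom2 = 3.82 * clipp2 + 369.31 + (43.15 if male else 0.0)
    return round(nbme2, 2), round(ifom2, 2)

# A hypothetical student scoring 40 on CLIPP 2:
print(predict_senior_scores(40, male=False))  # (52.08, 522.11)
print(predict_senior_scores(40, male=True))   # (60.86, 565.26)
```

Rounded to two decimals, these values simply restate the prediction equations given below Table 4; they are not new results.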

CLIPP survey

Most students were satisfied with the setting and venue. They also rated positively the content of the scenarios and questions, the authenticity and clarity of the questions, the optimal use of the software and the examination duration (Figure 1 and Table 5). However, many students did not feel that the CLIPP prepared them well for the NBME or helped predict their performance in it. They also felt that the feedback they received after the CLIPP assessment did not help them prepare for the NBME.

Discussion

This study confirms that both the summative NBME and IFOM scores increase in proportion to the formative CLIPP score, corroborating that both can be predicted from formative CLIPP scores. The significant relationship between formative CLIPP scores and summative NBMEPSE scores is not surprising, as both examinations assess knowledge in pediatrics; it was also found between the CLIPP assessments, which are exclusively in pediatrics, and the final summative IFOM examination, which evaluates knowledge in all branches of medicine. This reinforces the strong predictive value of the formative CLIPP assessment for all summative examinations. This is not unexpected, because all of these examinations assess students' knowledge, and it is consistent with previous studies demonstrating a positive correlation between formative self-assessment scores and summative NBME scores (Morrison et al., 2010). In addition, the use of CLIPP patient case simulations helps improve teaching quality (Leong, Baldwin, & Adelman, 2003). Identifying poorly performing students in the formative CLIPP assessment and giving them appropriate feedback and remediation should, therefore, help them improve their performance in the summative NBME and IFOM assessments.

Although this study used the CLIPP as a formative pediatric assessment, it is surprising that it also predicted the final IFOM examination, which encompasses all branches of medicine. Improving students' knowledge in pediatrics, as demonstrated in the CLIPP results, would not be expected to enhance their knowledge in all other branches of medicine as tested in the IFOM. We can therefore speculate that the students' scores in both the CLIPP and IFOM examinations reflect their own inherent learning abilities across a wide range of domains, rather than attributing the enhancement of their IFOM scores directly and solely to their pediatric CLIPP experience.

Surprisingly, the students did not perceive any value of the CLIPP feedback in helping them prepare for the NBME. This could be because they currently receive only their CLIPP scores, without any detailed feedback on their performance, making it impossible to identify their weaknesses and take the necessary measures to improve their future performance. The participants believed that the CLIPP examination is easier than the NBME examination because it focuses on a limited range of clearly defined pediatric conditions, whereas the NBME examination covers a broad range of pediatric ailments and requires deeper and more advanced knowledge. This is perhaps why the students felt that the CLIPP examination had no significant role in improving their readiness for the NBME examination. However, they acknowledged that taking the CLIPP examination earlier in their clerkship helped them objectively benchmark their knowledge and identify their strengths and weaknesses, which encouraged them to improve their knowledge.

In contrast to the plethora of reports in the literature evaluating the NBME and the United States Medical Licensing Examination (USMLE) (Boscardin, Earnest, & Hauer, 2020; Torre et al., 2020; Rosenthal et al., 2019), the paucity of publications on the relationship between the CLIPP assessment and other examinations in pediatric clerkships does not allow any comparisons to be made (Schifferdecker et al., 2012; Sox et al., 2018). This is probably because the CLIPP is an exclusively pediatric assessment, whereas the NBME subject examinations cover all medical disciplines.

This study has some limitations. In addition to the small sample size, not all students took the two formative CLIPP assessments, the two summative NBMEs or the final IFOM assessment. Furthermore, the effect of the remediation provided to some students after their CLIPP assessment reports, and before the summative assessments, was not analyzed because no systematic documentation existed. Finally, because the CLIPP examination was an optional, voluntary formative assessment, its outcomes could not be assessed in all students. However, this study found no significant differences in the NBME and IFOM scores between students who voluntarily took the CLIPP assessment and those who did not. This implies that the proportionality between the CLIPP scores and both summative assessment scores could be attributed to the students' own intellectual capabilities rather than to any benefit of the feedback received from the formative assessment.

Future studies should consider using a larger sample size and conducting a formal analysis of the role of remediation measures implemented after the CLIPP on the summative assessment scores. Including the students’ NBME performance in other medical disciplines (which do not have CLIPP assessments) in the analysis of the final IFOM scores would also help better delineate the specific contribution of the CLIPP and NBMEPSE pediatric assessments to the IFOM.

Conclusion

This study reveals that the CLIPP examination is a good predictor of students' performance both in the NBMEPSE assessments during the pediatric clerkships and in the IFOM assessment prior to graduation. Giving students detailed feedback on their performance in the CLIPP examination is therefore essential and should be an integral part of their educational experience. Appraising students' assessment tools provides robust, evidence-based information for improving their performance. Future studies should include a larger sample size to evaluate the roles of pre-assessment preparation, post-assessment feedback and remedial measures in improving future performance. Introducing and examining a new assessment tool matched with appropriate elements of instructional design is imperative. These elements may include gamification, interface design and the density of the content presented, in addition to the learners' learning and cognitive styles.

Figures

Figure 1. CLIPP survey results

Table 1. Examination scores of students in the two clerkships

|         | Males        | Females      | p value* | All students |
|---------|--------------|--------------|----------|--------------|
| CLIPP 1 | 39.7 ± 11.9  | 39.1 ± 11.0  | 0.7      | 39.3 ± 11.2  |
| NBME 1  | 51.0 ± 13.0  | 52.6 ± 1.9   | 0.2      | 52.3 ± 12.2  |
| CLIPP 2 | 38.2 ± 13.5  | 42.6 ± 13.4  | 0.06     | 41.8 ± 13.5  |
| NBME 2  | 56.1 ± 14.4  | 54.4 ± 14.5  | 0.4      | 54.8 ± 14.5  |
| IFOM 2  | 529.6 ± 95.8 | 522.8 ± 82.6 | 0.6      | 524.3 ± 85.4 |

Note(s): n = 200 in the junior clerkship and 223 in the senior clerkship; scores are expressed as mean ± standard deviation (SD). Assessments 1 and 2 represent the junior and senior pediatric clerkships, respectively

*unpaired t-test

Source(s): Authors’ work

Table 2. Comparison of NBME and IFOM examination scores between students who voluntarily took the CLIPP assessment and those who did not

|                | CLIPP taken  | CLIPP not taken | p value* |
|----------------|--------------|-----------------|----------|
| Total students | 200          | 143             |          |
| NBME 1         | 53.0 ± 12.7  | 51.2 ± 11.3     | 0.17     |
| Total students | 223          | 63              |          |
| NBME 2         | 54.6 ± 14.9  | 55.5 ± 13.1     | 0.63     |
| Total students | 151          | 16              |          |
| IFOM 2         | 526.6 ± 86.4 | 502.3 ± 74.2    | 0.27     |

Note(s): The results are expressed as mean ± standard deviation (SD), unless stated otherwise. Assessments 1 and 2 represent the junior and senior pediatric clerkships, respectively

*unpaired t-test

Source(s): Authors’ work

Table 3. Pearson's r correlation between the NBME and CLIPP scores in each of the junior and senior clerkships

|                            | Males | Females | All students |
|----------------------------|-------|---------|--------------|
| Junior clerkship (Year 1)  | 0.72  | 0.72    | 0.72         |
| Senior clerkship (Year 2)  | 0.73  | 0.44    | 0.46         |

Source(s): Authors’ work

Table 4. Multivariate linear regression modeling of NBME and IFOM scores from the obtained CLIPP scores, corrected for gender, in each of the junior and senior clerkships

Unadjusted models*

| Outcome         | Students n | Coefficient for CLIPP (95% CI) | Constant | Adjusted R² | p value |
|-----------------|------------|--------------------------------|----------|-------------|---------|
| NBME 1 (junior) | 200        | 0.82 (0.71, 0.93)              | 20.79    | 0.52        | <0.001  |
| NBME 2 (senior) | 223        | 0.51 (0.38, 0.63)              | 33.23    | 0.21        | <0.001  |
| IFOM 2 (senior) | 151        | 3.71 (2.9, 4.5)                | 380.13   | 0.36        | <0.001  |

Adjusted models§

| Outcome         | Coefficient for CLIPP (95% CI) | Constant | Adjusted R² | p value | Coefficient for gender (p value) |
|-----------------|--------------------------------|----------|-------------|---------|----------------------------------|
| NBME 1 (junior) | 0.82 (0.71, 0.93)              | 20.4     | 0.52        | <0.001  | 1.72 (0.22)                      |
| NBME 2 (senior) | 0.54 (0.41, 0.66)              | 30.48    | 0.26        | <0.001  | 8.78 (<0.001)                    |
| IFOM 2 (senior) | 3.82 (3.0, 4.6)                | 369.31   | 0.39        | <0.001  | 43.15 (0.006)                    |

Note(s): * Simple linear univariate regression; § multivariate linear regression model adjusting for gender

Predicted NBME 1 score = (0.82 × CLIPP 1 score) + 20.4 (p < 0.001)

Predicted NBME 2 score = (0.54 × CLIPP 2 score) + 30.48 + 8.78 if male student (p < 0.001)

Predicted IFOM 2 score = (3.82 × CLIPP 2 score) + 369.31 + 43.15 if male student (p < 0.001)

Source(s): Authors’ work

Table 5. CLIPP examination survey results expressed as Likert score mean ± SD

| Survey item                                                                                      | Mean ± SD |
|--------------------------------------------------------------------------------------------------|-----------|
| The CLIPP examination adequately assesses the paediatric clerkship's learning objectives          | 3.5 ± 1.1 |
| It adequately matches your academic-year level                                                    | 3.5 ± 1.1 |
| Its contents, including the scenarios and answer options, are clear, well written and up-to-date  | 3.7 ± 1.2 |
| The cases are authentic and realistic                                                             | 3.9 ± 1.0 |
| It provides adequate preparation for the pediatric NBME                                           | 2.9 ± 1.2 |
| It is a useful experience within the pediatric clerkships                                         | 3.5 ± 1.2 |
| The feedback helped you predict your future NBME performance                                      | 2.8 ± 1.2 |
| The feedback helped guide your learning objectives plan                                           | 3.2 ± 1.3 |
| The feedback is a useful tool in preparing for the pediatrics NBME                                | 2.9 ± 1.3 |
| It is important to include the examination results and the feedback in the pediatric curriculum   | 3.1 ± 1.2 |
| Its timing was appropriate to support your preparation for the NBME assessment                    | 3.0 ± 1.3 |
| The allocated time was sufficient to answer all questions                                         | 3.7 ± 1.2 |
| The examination software was used optimally                                                       | 3.9 ± 1.1 |
| The scenarios, questions and answer options were visually clear and suitable                      | 4.0 ± 0.9 |
| The CLIPP examination setting/venue was appropriate                                               | 4.1 ± 1.0 |

Source(s): Authors’ work

Supplemental Digital Appendix 1

References

Aquifer pediatrics (2020). Available from: https://aquifer.org/courses/aquifer-pediatrics/#assessment (accessed 10 October 2020).

Bakoush, O., Al Dhanhani, A., Alshamsi, S., Grant, J., & Norcini, J. (2019). Does performance on United States national board of medical examiners reflect student clinical experiences in United Arab Emirates? MedEdPublish, 8(4). doi:10.15694/mep.2019.000004.1.

Boscardin, C. K., Earnest, G., & Hauer, K. E. (2020). Predicting performance on clerkship examinations and USMLE step 1: What is the value of open-ended question examination? Academic Medicine, 95(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 59th Annual Research in Medical Education Presentations), S109-S113.

Council on Medical Student Education in Pediatrics (2020). COMSEP. Available from: https://www.comsep.org/assessment-and-evaluation-collaborative (accessed 8 August 2020).

Fall, L. H., Berman, N. B., Smith, S., White, C. B., Woodhead, J. C., & Olson, A. L. (2005). Multi-institutional development and utilization of a computer-assisted learning program for the pediatrics clerkship: The CLIPP project. Academic Medicine, 80(9), 847-855.

Guiot, H. M., & Franqui-Rivera, H. (2018). Predicting performance on the United States medical licensing examination step 1 and step 2 clinical knowledge using results from previous examinations. Advances in Medical Education and Practice, 9, 943-949.

International Foundations of Medicine (IFOM) (2019a). Clinical science examination content outline. Available from: https://www.nbme.org/IFOM_CSE_Public_Outline_Udate- 2016-2020.pdf (accessed 10 October 2019).

International Foundations of Medicine (IFOM) (2019b). Subject examination. Content outlines and sample items. Available from: https://www.nbme.org/2020-SE_ContentOutlineandSampleItems.pdf (accessed 10 October 2019).

Johnson, T. R., Khalil, M. K., Peppler, R. D., Davey, D. D., & Kibble, J. D. (2014). Use of the NBME comprehensive basic science examination as a progress test in the preclerkship curriculum of a new medical school. Advances in Physiology Education, 38(4), 315-320.

Leong, S. L., Baldwin, C. D., & Adelman, A. M. (2003). Integrating web-based computer cases into a required clerkship: Development and evaluation. Academic Medicine, 78(3), 295-301.

Morrison, C. A., Ross, L. P., Fogle, T., Butler, A., Miller, J., & Dillon, G. F. (2010). Relationship between performance on the NBME comprehensive basic sciences self-assessment and USMLE Step 1 for U.S. and Canadian medical school students. Academic Medicine, 85(10 Suppl), S98-S101.

Narchi, H. (2013). Pediatric examinations content validity comparison: In-house versus NBME examination. Medical Science Education, 23(2), 250-258.

Narchi, H. (2014). Predictive validity of an in-house pediatric examination towards the NBME Pediatrics Subject examination. Journal of Contemporary Medicine Education, 2(1), 23-29.

National Board of Medical Examiners (2016). Characteristics of clinical clerkships. Available from: http://www.nbme.org/PDF/SubjectExams/Clerkship_Survey_Summary.pdf (accessed 25 October 2016).

National Board of Medical Examiners (2019). Guide to the subject examination program. Available from: https://www.nbme.org/2020/Guide_Subject_Exams.pdf (accessed 20 May 2019).

National Board of Medical Examiners (2020). Subject examination. Content outlines and sample items. Available from: https://www.nbme.org/2020_SE_ContentOutlineandSampleItems.pdf (accessed 8 August 2020).

Rosenthal, S., Russo, S., Berg, K., Majdan, J., Wilson, J., Grinberg, C., & Veloski, J. (2019). Identifying students at risk of failing the USMLE step 2 clinical skills examination. Family Medicine, 51(6), 483-499.

Schifferdecker, K. E., Berman, N. B., Fall, L. H., & Fischer, M. R. (2012). Adoption of computer-assisted learning in medical education: The educators’ perspective. Medical Education, 46(11), 1063-1073.

Sox, C. M., Tenney-Soeiro, R., Lewin, L. O., Ronan, J., Brown, M., King, M., … Dell, M. (2018). Efficacy of a web-based oral case presentation instruction module: Multicenter randomized controlled trial. Academic Pediatrics, 18(5), 535-541.

Torre, D. M., Dong, T., Schreiber-Gregory, D., Durning, S. J., Pangaro, L., Pock, A., & Hemmer, P. A. (2020). Exploring the predictors of post-clerkship USMLE step 1 scores. Teaching and Learning in Medicine, 32(3), 330-336.

Wright, W. S., & Baston, K. (2017). Use of the National Board of Medical Examiners® comprehensive basic science exam: Survey results of US medical schools. Advances in Medical Education and Practice, 8, 377-383.

Acknowledgements

The authors are grateful to their students for their voluntary participation in this study.

Corresponding author

Lolowa Almekhaini can be contacted at: lolwa.mukini@uaeu.ac.ae
