Evolving Assessment in Medical Education: Exploring the Role of Open-Book Examinations

Submitted: 22 April 2024; Posted: 25 April 2024
Abstract
The landscape of medical education has witnessed significant transformations over the past decades, particularly with the advent of active teaching methodologies. Despite these advancements, however, traditional theoretical assessment methods have remained largely unchanged. This lack of evolution poses a challenge, as assessment methods must evolve in tandem with teaching approaches to ensure a comprehensive and effective learning process in medical education. This paper reviews the integration and effectiveness of open-book examinations (OBEs) in medical education, reflecting their growing significance. An integrative review of the literature was conducted, drawing on relevant publications from the last decade sourced from the PubMed, Web of Science, Scopus, and ERIC databases. The inclusion criteria focused on full-text articles in English, with search terms including "medicine," "assessment," "open book examination," "open book exam," and "open book assessment," combined using Boolean operators. Thirteen publications were selected and critically appraised using the Critical Appraisal Skills Programme (CASP) checklist. The analysis identified three primary thematic categories: "Teaching Strategy for Pandemic and Challenging Conditions," "Tool of Learning & Educational Impact," and "Operational Challenges & Future Directions." These themes were explored to understand the role and impact of OBEs in medical education. The findings indicate that open-book examinations are a crucial component in the evolving landscape of medical education. While certain reservations remain, OBEs have shown significant potential in fostering critical thinking, argumentation skills, and lifelong learning among medical students. They reflect the ongoing evolution of knowledge in the medical field and contribute to the development of professionals adept at navigating and applying complex information. Further research is recommended to solidify these findings and to expand the understanding of OBEs in medical education.
Subject: Medicine and Pharmacology - Other

Introduction & Background

The landscape of medical education is in a state of continuous evolution, adapting to the ever-changing demands of healthcare and the advances in educational theory. A critical aspect of this transformation is the assessment strategies employed to evaluate medical students. Traditional examinations, predominantly closed-book in nature, have long been the cornerstone of academic evaluation [1]. However, in recent decades, there has been a paradigm shift towards more innovative assessment methods, one of which is the open-book examination (OBE) [2].
An OBE is an assessment method in which students are allowed to consult approved resources while attempting the examination [3]. The approved resources may include class notes, textbooks, primary or secondary readings, and/or internet access when answering questions. The inception of OBEs in medical education marks a significant departure from conventional memorization-based assessments. Unlike closed-book exams, which often prioritize rote learning and recall, open-book exams encourage students to understand, analyze, and apply knowledge, mirroring the realities of clinical practice where resources and references are readily available [4]. This shift acknowledges the immense body of medical knowledge, continuously expanding and evolving, which makes it impractical and unnecessary for students to memorize vast quantities of information [1].
Critically, OBEs aim to assess a student's ability to efficiently locate and utilize information, skills that are essential for practising physicians who must navigate a plethora of clinical data and evidence-based guidelines in their daily practice [2,5,6]. This approach aligns assessment methods with the practical demands of modern healthcare, emphasizing the application of knowledge rather than its mere recall [6]. It also reflects a broader educational philosophy that values lifelong learning and adaptability, skills that are indispensable in a profession characterized by rapid advancements and constant change [7].
However, the implementation of OBEs in medical education is not without challenges. Questions arise regarding their ability to rigorously assess student competence, the potential for academic dishonesty, and the need for carefully crafted questions that probe deeper levels of understanding. Additionally, there is a debate on how well OBEs prepare students for high-stakes, closed-book examinations like licensing and board certification tests, which remain a reality in the medical profession [8].
The efficacy of OBEs also hinges on the pedagogical framework within which they are employed. They demand a curriculum that fosters independent learning, critical thinking, and efficient information management. Instructors play a pivotal role in crafting questions that test knowledge application and problem-solving skills, mitigate the risk of superficial learning, and encourage a thorough understanding of core concepts [9].
In exploring the role of OBEs, it is imperative to consider their impact on students' learning experiences, study behaviours, and academic performance. Research suggests that OBEs can reduce examination-related anxiety, promote a deeper engagement with the material, and encourage a more strategic approach to learning [10]. However, there is also a need to understand the perspectives of both students and educators on the effectiveness of this assessment method, its impact on learning outcomes, and the challenges encountered in its implementation.
As the medical field continues to advance, so must the methods we use to educate and assess future physicians. Open-book examinations represent a progressive step in aligning medical education with the realities of clinical practice and the principles of adult learning theory. By embracing this innovative assessment approach, medical education can foster a generation of doctors who are not only knowledgeable but also adept at navigating the vast landscape of medical information, which is critical in making informed, evidence-based decisions in patient care [11].
The exploration of open-book examinations in medical education offers valuable insights into how assessment strategies can evolve to better prepare students for the demands of modern medical practice. This exploration is not just about changing how we test but also about redefining what and how we teach, ensuring that medical education remains relevant, effective, and responsive to the needs of students and the healthcare systems they will serve.
Assessment in medical schools during times of crisis, such as the COVID-19 pandemic, war, and natural disasters, poses unique challenges that can significantly impact the quality and effectiveness of medical education. Most recently, the COVID-19 pandemic significantly affected both the delivery of education and the implementation of fair, effective evaluation methods, highlighting the need to evaluate our students with the utmost quality and reliability [12,13]. Social distancing requirements and the challenges associated with ensuring rigorous examination conditions have, in numerous instances, made the administration of conventional closed-book exams impossible. The experience gained during this period should be consolidated and scientifically grounded so that it can be applied more effectively, with greater confidence and knowledge.
Thus, this study aimed to systematically investigate the integration of open-book examinations as an educational assessment method in the medical curriculum, with a focus on understanding the pedagogical approaches, challenges, and learning outcomes associated with their implementation.

Methods

Ethics statement
This was a literature-based study; therefore, neither approval from the institutional review board nor informed consent was required.
Design
This was an integrative review, described in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.
The following research question guided this review: How do open-book examinations impact the learning process and academic performance of medical students compared to traditional closed-book examinations?
To structure this review and answer this research question, we used Whittemore and Knafl's framework for integrative reviews [14]. An integrative review is a form of research that reviews, critiques, and synthesises representative literature on a topic in an integrated way such that new frameworks and perspectives on the topic are generated. It presents a unique approach by combining and synthesizing data from diverse research designs, including qualitative, quantitative, and mixed methods, to gain a more comprehensive understanding of the phenomenon of interest [15,16].
Eligibility criteria
The PICOS (population, intervention, comparison, outcome, study design) framework was used to develop the search terms for the systematic review and was the foundation for the study inclusion and exclusion criteria.
Information sources
A comprehensive systematic search of the literature was conducted using four electronic databases, including PubMed, Web of Science, Scopus, and ERIC.
The databases were checked for English-language studies published from 2013 to 2023.
Search strategy.
Two reviewers (Y.A., S.K.) independently conducted a systematic search for studies evaluating open-book examination in medical education.
We utilized a group of descriptors combined using the Boolean operators "AND" and "OR." The following key search terms were used across databases: "medicine," "assessment," "open book examination," "open book exam," "open book test," and "open book assessment." Using these 6 descriptors, we performed a total of 30 search combinations.
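As an illustration of how this combination set can be enumerated, the following minimal Python sketch (hypothetical; no scripting is described in the review itself) pairs each of the 15 unordered descriptor pairs with both Boolean operators, reproducing the 30 combinations:

```python
from itertools import combinations

# The six descriptors listed above.
descriptors = [
    "medicine", "assessment", "open book examination",
    "open book exam", "open book test", "open book assessment",
]

# 15 unordered pairs x 2 Boolean operators = 30 search combinations.
queries = [
    f'"{a}" {op} "{b}"'
    for a, b in combinations(descriptors, 2)
    for op in ("AND", "OR")
]

print(len(queries))  # 30
print(queries[0])    # "medicine" AND "assessment"
```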
Selection process.
The process began by defining the theme, in this case, open-book examinations in medical education. This allowed us to identify specific descriptors or keywords relevant to the subject.
Data management for this systematic review was performed using Zotero version 6.0.30 (Corporation for Digital Scholarship, Vienna, VA, USA).
Titles and abstracts were initially screened by 2 researchers (YA, SK) independently to identify potentially eligible articles. Subsequently, the full articles were thoroughly reviewed by both reviewers against the eligibility criteria, and the research team reached consensus on the final list of articles to be included. Two further researchers (MW, MA) resolved any disagreements between the 2 reviewers during study selection. Given the diverse range of research methodologies, the Critical Appraisal Skills Programme (CASP) tool was adapted; this adaptation closely followed the well-documented modifications introduced by Halcomb et al. [17,18]. Finally, to prevent data loss, the reference lists of the included studies were screened manually.

Results

Study selection.
A total of 6,720 articles were identified from the electronic databases. Initially, 249 duplicate records were identified and removed from the dataset to ensure data integrity. The comprehensive search thus yielded 6,471 records relevant to the research question, which underwent the initial screening process. After careful screening of titles and abstracts, most records (n = 6,267) were found not to meet the content inclusion criteria and were excluded from further consideration. Further scrutiny was applied to the remaining records, leaving 204 full-text articles to be assessed for eligibility. After a thorough full-text screening, 191 articles were excluded for not meeting the inclusion criteria. Subsequently, a total of 13 articles were deemed eligible and included in the final analysis for this systematic review. The detailed literature search is presented in Figure 1.
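As a consistency check on the selection flow reported above: 6,720 − 249 = 6,471 records screened; 6,471 − 6,267 = 204 full-text articles assessed; 204 − 191 = 13 studies included.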
Abbreviations: PubMed, United States National Library of Medicine; ERIC, Education Resources Information Center.
This review identified 13 publications that met the eligibility criteria (Table 2).
Study characteristics.
The findings from the 13 publications reviewed are categorised into three primary categories, providing a structured framework for discussing the central evidence identified in the publications. These categories, "Teaching Strategy for Pandemic and Challenging Conditions," "Tool of Learning and Educational Impact in Medical Education," and "Operational Challenges and Future Directions," will be used to guide our discussion of the findings.
Table 3 provides a structured framework for understanding the three primary categories identified in the study, each encompassing various aspects related to open-book examinations in medical education.

Discussion

This discussion commences by underscoring the adaptability and effectiveness of OBEs during the unprecedented challenges of the COVID-19 pandemic, highlighting their role in maintaining educational continuity and integrity. Conditions similar to the COVID-19 pandemic that would rationalize the use of OBEs include any situation in which traditional, in-person examination methods are impractical or pose health, safety, or logistical issues. Such conditions, which include public health crises, natural disasters, political or civil unrest, and technological disruptions, often necessitate a shift towards more flexible, inclusive, and accessible assessment methods that can be administered remotely. As we delve deeper, we explore OBEs' enduring utility beyond the pandemic context, examining their impact on student learning strategies, performance, and anxiety levels.

Teaching Strategy for Pandemic and Challenging Conditions

Since the beginning of 2020, we have witnessed significant challenges brought about by the COVID-19 pandemic. Its global impacts have rippled across all sectors and activities, including education. Social distancing measures became imperative to contain viral transmission, resulting in the abrupt suspension of in-person teaching activities. To prevent potential disruptions, adaptations were made to teaching and assessment methods [31]. This period of remote learning extended far beyond initial expectations. The same considerations apply in all similar scenarios in which traditional examination methodologies are disrupted (natural disasters, public health crises, political or civil unrest, etc.). In the following sections, we shall delve into a comprehensive discussion of studies of assessments conducted within this unique context.

Response to the Digital Era and the Pandemic Challenge

Utilization of open-book examinations in distance learning has emerged as a promising approach to adapt to the evolving landscape of education, especially in light of the COVID-19 pandemic. This assessment method aligns well with the principles of remote and online education, where students have access to vast digital resources [19,20,22]. Open-book exams acknowledge the reality that in the era of information abundance, memorization of facts and figures holds less importance compared to the ability to locate, synthesize, and apply knowledge. As such, these assessments emphasize critical thinking, problem-solving, and the practical application of information - skills that are highly relevant in the digital age and future medical practice [20,21].
Additionally, the use of open-book exams in distance education offers flexibility and adaptability. This approach enhances the learning experience, promoting active engagement with the content [23]. Moreover, open-book exams often necessitate continuous engagement with the course material, reinforcing the principles of active learning. In a remote learning environment, where self-directed learning is crucial, open-book exams support students in becoming more independent and resourceful learners [21]. This is in agreement with Bobby et al. [24], who assessed the effectiveness of "test-enhanced learning" via open-book examination as a formative assessment tool in medical education. Most of the students opined that the open-book examination enhanced their self-directed, focused learning process, and they felt that it was more beneficial than self-study in reinforcing learning concepts after regular didactic lectures.

Efficacy in a remote online setting

Despite the challenges encountered, the COVID-19 pandemic presented an excellent opportunity for medical educators to carefully explore the utilization of open-book assessments in an online environment [31]. This mode of assessment can evaluate students' ability to efficiently research and translate information, a crucial skill for future clinical practice. Furthermore, the same authors advocate for hybrid assessment strategies: an initial section without access to reference materials assesses the learning of fundamental concepts that students should possess without external aids, while a second section permits access to resources, evaluating students' ability to research, synthesize, correlate, construct arguments, and apply clinical reasoning to specific topics.
Eurboonyanun et al. [20] compared the final-year medical students' online surgery clerkship assessment scores to the traditional written examinations in the previous three rotations, using the same question bank. Students who took the online open-book examination scored higher on average in multiple-choice and essay questions but lower on short-answer questions. This result is important as it highlights the need to establish comparable pass rates and minimum passing grades for assessments with and without open-book access, procedures that should be replicated in other institutions wishing to change their assessment methodologies. Similarly, Sarkar et al. [21] conducted a study on an online open-book assessment in the otolaryngology discipline, also due to adaptations to remote teaching during the pandemic. The authors compared these results with previous traditional in-person assessments without open-book access and demonstrated similar pass rates in both methodologies.

Student Perspectives

In a study conducted by Elsalem et al. [19], the findings revealed that only one-third of the students preferred online open-book exams as an assessment modality during the COVID-19 pandemic. The authors attributed this low level of acceptance to several factors: the need for more effort/time to prepare for online open-book exams, challenges encountered during pre-examination preparations, and perceived disparities between the examination questions and the study materials provided. These findings are valuable for planning academic strategies aimed at effectively addressing the challenges associated with remote open-book assessments. Such strategies may include enhancements in distance learning methods, reorganization of assessment strategies, and review of academic curricula to ensure alignment with the evolving educational landscape.

Academic Integrity

Researchers have raised questions about the occurrence of academic dishonesty or cheating in remote open-book assessments. Elsalem et al. [19] investigated the occurrence of cheating and dishonesty during these exams and showed that 55.07% of students reported no exam dishonesty or misconduct, while 20.41% mentioned seeking help from friends and 24.52% used other unauthorized sources of information. Notably, Sarkar et al. [21] obtained similar results, with 72.2% of students reporting that they did not consult their friends during the examination and answered independently. Monaghan [22] argues that strategies such as randomizing the order of questions for each student safeguard against cheating or collusion by rendering communication between students ineffective. This suggests that dishonesty and cheating in remote open-book assessments are not as frequent as might be feared, and that methods exist to discourage such behaviour. However, the current literature lacks studies comparing the frequency of these dishonest actions between in-person and remote assessment methods, as well as between open-book and closed-book assessments, to determine whether permitting reference materials or open-book access inhibits or reduces the occurrence of cheating.

Tool of Learning and Educational Impact in Medical Education

Assessment stands as a fundamental facet of any educational process, a crucial element that demands the serious consideration of all stakeholders, particularly in the context of medical education [32]. Therefore, in this section, we examine studies that have explored the integration of open-book assessment as an educational tool within medical education.

Educational Transformations

The shifts in teaching and assessment methods during the COVID-19 pandemic provided the opportunity to implement, test, and better understand open-book assessments in medical curricula, whether conducted remotely or in traditional settings [21,22,31]. While most of these educational transformations occurred out of urgency, many will probably remain, in a more refined form, as preferred methods of teaching and assessment in the future [22]. Furthermore, Dave et al. [25] assert that the window of opportunity afforded during this period should be exploited to advance our comprehension of holistic student assessment.
It is noteworthy that even before the COVID-19 pandemic, Bobby et al. [24] pointed out the need for such changes, advocating for an era of open books. Their argument posits that this change in assessment philosophy could benefit students by exposing them to deeper and more enjoyable learning methods. This perspective resonates with Ibrahim et al. [26], who emphasise the necessity of adopting more innovative assessment methods such as open-book assessment, self-assessment, and peer assessment.
A notable challenge in the adoption of open-book assessments, highlighted by a consensus among researchers, is the necessity of persuading medical educators who may exhibit resistance to change and a preference for established traditional assessment systems [21]. Similarly, other authors concur that the inherent challenge resides in educators' reluctance to embrace change and their consequent tendency to perpetuate conventional assessment methodologies [22]. Overcoming these challenges requires a paradigm shift in assessment philosophy, which can lead students to engage with knowledge more deeply and thoroughly and open the way for innovative teaching and assessment methods.

Depth and Resource Accessibility

Regarding the depth of the topics covered in assessments with and without access to reference materials, it was observed that written clinical exams are suitable for open-book assessment formats. This is because the questions require a differentiated synthesis of information from the provided clinical scenarios, and therefore, the answers cannot be simply looked up on the internet [11].
Consistent with this, Davies et al. [28] predicted that questions requiring students to show understanding of a concept or apply knowledge to new information would be less affected by access to open-book resources. This is because the information required to answer such questions is more complex and may be more difficult to find online, which may be particularly limiting in a time-pressured examination.

Test-Enhanced Learning

It is noteworthy that one of the advantages highlighted in open-book assessments is that they not only discourage students from temporarily memorizing superficial information for regurgitation during evaluations but, crucially, closely mirror real clinical practice where such information is readily available from resources such as evidence-based digital medical libraries [11,21]. It becomes evident that utilizing this type of assessment also confers the advantage of exposing students to situations more closely aligned with the professional practice they will encounter in the future.
Bobby et al. [24] discuss that open-book examinations present a valuable method for promoting higher-order cognitive competencies. This stands in contrast to traditional closed-book exams, which rely heavily on candidates' ability to memorise. Within the framework of open-book exams, questions can be framed to align with the higher levels of Bloom's taxonomy. This format empowers educators to appraise students' proficiency in critical thinking, knowledge analysis, synthesis, and evaluation, characteristics of vital importance in the medical field. As a research gap, Eurboonyanun et al. [20] point out the need for further studies assessing the long-term effects of online open-book exams on knowledge retention and application.

Comparative Efficacy and Student Performance

An important initial point to note is that Erlich [29] and Durning et al. [11] compared open-book and closed-book examinations to determine whether student performance differs between the two modalities, and both found similar results. Additionally, Durning et al. [11] demonstrated that students' performance in open-book examinations can be enhanced by practical preparatory tests and instruction in this assessment modality, as students often have limited experience with this type of evaluation. Equally important, Erlich [29] argued that the majority of students who perform below average in open-book examinations also have low scores in clinical assessments by preceptors (family physicians hosting students in their clinical practices), specifically in the domain of information mastery. This suggests that neither assessment modality is superior to the other, that either can be used depending on the assessment's objectives, and that both are capable of identifying students with low performance.

Operational Challenges and Future Directions

Exam Preparation and Study Strategies

Regarding the influence of the assessment on preparation and study methods, researchers hold different opinions. Durning et al. [11] showed that students do not change their study tactics for open-book examinations. Similarly, Davies et al. [28] found no difference in preparation time or study tactics for open-book examinations.
In contrast, Sarkar et al. [21] evaluated student feedback after an online open-book examination and reported that students spent more time understanding the subject rather than just memorizing it. The students also realized that they would not be able to answer the examination questions if the subjects had not been read and studied beforehand. This may indicate that open-book examinations do not hinder the method of study and preparation, or, even better, that students study more deeply, focusing less on memorizing concepts and more on higher functions such as correlation, argumentation, and synthesis. This view is shared by Erlich [29], who emphasizes that in an era of internet-based knowledge evolution, medical professionals and medical students must be competent in quickly accessing, synthesizing, and applying continually updated information for decision-making.

Time Efficiency

In a study conducted by Durning et al. [11], students took 10% to 60% more time to complete open-book examinations compared with similar closed-book assessments. These findings are in line with the results of studies carried out in the United States of America: Brossman et al. [27] reported the need for 40% more time to complete open-book examinations. The studies that evaluated completion time thus agree in showing an increase in the time needed for open-book examinations. This finding has direct implications for the implementation of open-book assessments, as the additional time required for completion should be taken into account to ensure that it does not negatively influence the assessment outcome [27].
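As a worked illustration (the two-hour baseline is assumed for the example, not drawn from the studies): for a nominal 120-minute closed-book paper, the 10% to 60% estimate of Durning et al. [11] implies roughly 120 × 1.10 = 132 to 120 × 1.60 = 192 minutes in an open-book format, while the 40% estimate of Brossman et al. [27] implies about 120 × 1.40 = 168 minutes.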

Anxiety Levels

Other researchers assessed the level of anxiety related to open-book assessments. Sarkar et al. [21] analyzed student feedback and found a lower level of stress during open-book assessments, in agreement with Prigoff et al. [30].
Davies et al. [28] suggest that anxiety may have played a role in exam performance, although they did not measure the examinees' anxiety directly. They considered it unlikely that those sitting the open-book exam felt less anxious, given that they had no prior experience of open-book exams within the course and faced the uncertainty and disruption caused by COVID-19. Furthermore, while examinees may expect to be less anxious in an open-book exam, there is evidence that their experienced anxiety is similar.
However, Durning et al. [11] observed that although students associated open-book assessments with lower anxiety levels, only a minority of them reported lower anxiety when actually taking this type of assessment. Thus, there does not seem to be a consensus on whether students experience reduced anxiety with open-book assessments. Notably, however, none of the studies demonstrated an increase in stress or anxiety related to open-book assessments.

Discrimination Power

The discrimination power of a test item is an index indicating how well the question separates high-scoring from low-scoring examinees [31]. To assess discriminative power, Brossman et al. [27] employed Item Response Theory (IRT), which considers three characteristics of test items: their discrimination (how well they distinguish students who have the necessary knowledge from those who do not), their level of difficulty, and the likelihood of answering correctly by chance or random guessing. Using this methodology, they demonstrated that open-book assessments have greater discriminative power than similar closed-book assessments; that is, the questions in an open-book assessment have a greater capacity to differentiate high-performing candidates from low-performing ones. This appears to be linked to the depth and complexity of the questions in open-book assessments, which tend to require higher-order thinking skills. At this point, it is worth questioning whether the improved discriminative results are attributed solely to the use of external reference materials or whether they are driven by the formulation of questions demanding higher levels of clinical reasoning.
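For reference, the three item characteristics described above correspond to the parameters of the standard three-parameter logistic (3PL) IRT model (our formalization for the reader; Brossman et al. [27] do not reproduce the equation):

$$P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}},$$

where $P_i(\theta)$ is the probability that an examinee of ability $\theta$ answers item $i$ correctly, $a_i$ is the item's discrimination, $b_i$ its difficulty, and $c_i$ the pseudo-guessing parameter. Items with larger $a_i$ separate high-performing from low-performing examinees more sharply, which is the sense in which open-book items proved more discriminating.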
Additionally, it is noteworthy that Rehman et al. [23] obtained similar results: the open-book examination was highly discriminatory between high-performing and low-performing students, as the majority of the items were found to have moderate (indices ranging from 16 to 30) to high (above 30) discrimination indices. Clearly written, unambiguous test questions of medium difficulty, which improve the reliability of assessment, further support these findings.

Limitations and Future Research Direction

There are limitations to this study that need to be acknowledged. One limitation is the relatively small body of research literature available on open-book assessments in medical education, resulting in a small pool of studies included in this review. It is worth noting, however, that among these are robust and technically sound studies that allow important conclusions to be drawn on this subject and provide a foundation for future research initiatives. It is therefore anticipated that the insight and knowledge generated in this review will not only stimulate more in-depth discussion of open-book assessments but also serve as a basis for further studies.

Conclusions

There is a clear and growing need for appropriate assessment tools, particularly in the biomedical field, to keep pace with the rapid expansion and accessibility of knowledge. These tools should be dynamic, adaptable, and reflective of the evolving landscape of biomedical knowledge, and should aim to evaluate not only factual knowledge but also higher-order cognitive skills, critical thinking, and the ability to apply knowledge in practical scenarios.
This review comprehensively examined the current applications of open-book examinations and their integration into medical education. It has presented robust evidence supporting the utilization of remote online open-book assessments, demonstrating their effectiveness, reliability, and compatibility with active teaching methodologies that emphasize student engagement. These assessments enable thorough, high-quality evaluation without compromising the ability to distinguish student performance. Consequently, they emerge as highly valuable tools for assessing medical students. However, their implementation within medical curricula remains somewhat limited, although it increased during the COVID-19 pandemic, notably through remote examinations on online platforms.
We have highlighted several potential advantages of open-book assessments, as studies show that they contribute to the development of professionals with critical thinking and problem-solving skills, prepared for lifelong learning and aware of the constant evolution of medical knowledge. However, there are also various challenges associated with this assessment method, including the need to redefine cut-off points and grading criteria, the operational aspects of conducting open-book exams, and the training of both educators and students in using this assessment format.
While open-book examinations in medical education show promise for fostering critical thinking, problem-solving skills, and an appreciation for the evolving nature of medical knowledge, some challenges warrant attention. These include redefining assessment criteria, addressing concerns about academic integrity, and ensuring that such examinations are used complementarily with traditional assessment methods. Therefore, while open-book examinations are a valuable addition to medical curricula, their implementation should be carefully balanced with other forms of assessments to ensure a comprehensive evaluation of student competence.

References

  1. Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, et al. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Medical Teacher. 2021 Jan 2;43(1):58–67.
  2. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Medical Teacher. 2013;35(6):e1197–210.
  3. Myyry L, Joutsenvirta T. Open-book, open-web online examinations: Developing examination practices to support university students’ learning and self-efficacy. Active Learning in Higher Education. 2015 Jul 1;16(2):119–32.
  4. Epstein RM. Assessment in Medical Education. New England Journal of Medicine. 2007 Jan 25;356(4):387–96.
  5. GMC. Assessment in undergraduate medical education-Tomorrow’s Doctors. General Medical Council-UK; 2018.
  6. Ben-David MF. AMEE Guide No. 18: Standard setting in student assessment. Medical Teacher. 2000 Jan 1;22(2):120–30.
  7. Mahajan R, Saiyad S, Virk A, Joshi A, Singh T. Blended programmatic assessment for competency based curricula. J Postgrad Med. 2021;67(1):18–23.
  8. Schuwirth LWT, van der Vleuten CPM. How ‘Testing’ Has Become ‘Programmatic Assessment for Learning’. Health Professions Education. 2019 Sep 1;5(3):177–84.
  9. van der Vleuten C, Lindemann I, Schmidt L. Programmatic assessment: the process, rationale and evidence for modern evaluation approaches in medical education. Medical Journal of Australia. 2018 Nov;209(9):386–8.
  10. Hauer KE, Boscardin C, Brenner JM, van Schaik SM, Papp KK. Twelve tips for assessing medical knowledge with open-ended questions: Designing constructed response examinations in medical education. Medical Teacher. 2020 Aug 2;42(8):880–5.
  11. Durning SJ, Dong T, Ratcliffe T, Schuwirth L, Artino ARJ, Boulet JR, et al. Comparing Open-Book and Closed-Book Examinations: A Systematic Review. Academic Medicine. 2016 Apr;91(4):583.
  12. Sam AH, Reid MD, Amin A. High-stakes, remote-access, open-book examinations. Med Educ. 2020 Aug;54(8):767–8.
  13. Greene J, Mullally W, Ahmed Y, Khan M, Calvert P, Horgan A, et al. Maintaining a Medical Oncology Service during the Covid-19 Pandemic. Irish Medical Journal. 2020 May 3;11:77.
  14. Whittemore R. Combining Evidence in Nursing Research: Methods and Implications. Nursing Research. 2005 Feb;54(1):56.
  15. Whittemore R, Knafl K. The integrative review: updated methodology. Journal of Advanced Nursing. 2005;52(5):546–53.
  16. Xiao Y, Watson M. Guidance on Conducting a Systematic Literature Review. Journal of Planning Education and Research. 2019 Mar;39(1):93–112.
  17. Halcomb E, Stephens M, Bryce J, Foley E, Ashley C. Nursing competency standards in primary health care: an integrative review. Journal of Clinical Nursing. 2016;25(9–10):1193–205.
  18. CASP - Critical Appraisal Skills Programme [Internet]. [cited 2023 Oct 6]. CASP Checklists - Critical Appraisal Skills Programme. Available from: https://casp-uk.net/casp-tools-checklists/.
  19. Elsalem L, Al-Azzam N, Jum’ah AA, Obeidat N. Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery. 2021 Feb 1;62:326–33.
  20. Eurboonyanun C, Wittayapairoch J, Aphinives P, Petrusa E, Gee DW, Phitayakorn R. Adaptation to Open-Book Online Examination During the COVID-19 Pandemic. J Surg Educ. 2021;78(3):737–9.
  21. Sarkar S, Mishra P, Nayak A. Online open-book examination of undergraduate medical students - a pilot study of a novel assessment method used during the coronavirus disease 2019 pandemic. J Laryngol Otol. 2021 Apr;135(4):288–92.
  22. Monaghan AM. Medical Teaching and Assessment in the Era of COVID-19. J Med Educ Curric Dev. 2020 Oct 16;7:2382120520965255.
  23. Rehman J, Ali R, Afzal A, Shakil S, Sultan AS, Idrees R, et al. Assessment during Covid-19: quality assurance of an online open book formative examination for undergraduate medical students. BMC Medical Education. 2022 Nov 15;22(1):792.
  24. Bobby Z, Meiyappan K. “Test-enhanced” focused self-directed learning after the teaching modules in biochemistry. Biochemistry and Molecular Biology Education. 2018;46(5):472–7.
  25. Dave M, Patel K, Patel N. A systematic review to compare open and closed book examinations in medicine and dentistry. Faculty Dental Journal [Internet]. 2021 Oct 1 [cited 2023 Oct 10]; Available from: https://publishing.rcseng.ac.uk/doi/10.1308/rcsfdj.2021.41.
  26. Ibrahim NK, Al-Sharabi BM, Al-Asiri RA, Alotaibi NA, Al-Husaini WI, Al-Khajah HA, et al. Perceptions of clinical years’ medical students and interns towards assessment methods used in King Abdulaziz University, Jeddah. Pak J Med Sci. 2015;31(4):757–62.
  27. Brossman BG, Samonte K, Herrschaft B, Lipner RS. A Comparison of Open-Book and Closed-Book Formats for Medical Certification Exams: A Controlled Study. American Educational Research Association; 2017 Apr 28; San Antonio, TX.
  28. Davies DJ, McLean PF, Kemp PR, Liddle AD, Morrell MJ, Halse O, et al. Assessment of factual recall and higher-order cognitive domains in an open-book medical school examination. Adv Health Sci Educ Theory Pract. 2022 Mar;27(1):147–65.
  29. Erlich D. Because Life Is Open Book: An Open Internet Family Medicine Clerkship Exam. PRiMER. 2017 Jul 20;1:7.
  30. Prigoff J, Hunter M, Nowygrod R. Medical Student Assessment in the Time of COVID-19. J Surg Educ. 2021;78(2):370–4.
  31. Zagury-Orly I, Durning SJ. Assessing open-book examination in medical education: The time is now. Med Teach. 2021 Aug;43(8):972–3.
  32. Taha MH, Ahmed Y, El Hassan YAM, Ali NA, Wadi M. Internal Medicine Residents’ perceptions of learning environment in postgraduate training in Sudan. Future of Medical Education Journal. 2019 Dec 1;9(4):3–9.
Figure 1. Literature search and selection flow.
Table 1. Eligibility criteria according to the PICOS framework.
Parameter | Description
Population | Health care professionals from any medical education context, without restriction to specific levels of education or specialties. Studies that focused exclusively on students in nursing, dentistry, or other specialized healthcare fields were excluded.
Intervention | Studies that evaluated open-book examinations or assessments as the primary educational or evaluative method.
Comparison | Comparisons between open-book and closed-book examinations, or between different implementations of open-book examinations. Studies with no comparison group or comparator were also eligible.
Outcome | Measures related to the effectiveness, student performance, perceptions, challenges, or advantages of open-book examinations.
Study design | Experimental and non-experimental studies published in English between 2013 and 2023 in peer-reviewed journals, including qualitative, quantitative, and mixed-methods studies. Commentary articles, letters to the editor, editorials, theses, dissertations, reports, book chapters, non-peer-reviewed publications, and publications not in English were excluded.
PICOS: population, intervention, comparison, outcome, and study design.
Table 2. Descriptive list of the 13 studies included in the review.
Study | Authors | Design | Focus of study
“Comparing Open-Book and Closed-Book Examinations: A Systematic Review” | Durning et al. 2016 [11] | Qualitative | Analyse and synthesize existing research on the utility of OBEs and CBEs
“Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences” | Elsalem et al. 2021 [19] | Quantitative | Evaluate the experiences and preferences of students regarding remote E-exams during the COVID-19 pandemic
“Adaptation to Open-Book Online Examination During the COVID-19 Pandemic” | Eurboonyanun et al. 2021 [20] | Quantitative | Compare the performance of students in an online OBE to traditional written examinations
“Online open-book examination of undergraduate medical students: a pilot study of a novel assessment method used during the coronavirus disease 2019 pandemic” | Sarkar et al. 2021 [21] | Quantitative | Check the feasibility and acceptability of an online OBE for medical students
“Medical Teaching and Assessment in the Era of COVID-19” | Monaghan 2020 [22] | Quantitative | Discuss adaptations to medical teaching and assessment during the COVID-19 pandemic
“Assessment during Covid-19: quality assurance of an online open book formative examination for undergraduate medical students” | Rehman et al. 2022 [23] | Quantitative | Evaluate the quality of an online OBE administered to first-year medical students during the COVID-19 pandemic
“‘Test-enhanced’ focused self-directed learning after the teaching modules in biochemistry” | Bobby et al. 2018 [24] | Mixed methods | Evaluate the effectiveness of OBEs and self-study in promoting learning among medical students
“A systematic review to compare open and closed book examinations in medicine and dentistry” | Dave et al. 2021 [25] | Qualitative | Summarize and critically evaluate the existing literature regarding OBEs in medicine
“Perceptions of clinical years’ medical students and interns towards assessment methods used in King Abdulaziz University, Jeddah” | Ibrahim et al. 2015 [26] | Quantitative | Assess students’ perceptions of different assessment methods
“A Comparison of Open-Book and Closed-Book Formats for Medical Certification Exams: A Controlled Study” | Brossman et al. 2017 [27] | Quantitative | Assess the viability of open-book formats for medical certification exams, with a focus on logistical and psychometric aspects
“Assessment of factual recall and higher-order cognitive domains in an open-book medical school examination” | Davies et al. 2022 [28] | Quantitative | Quantify the effect of open-book resources on student performance in different cognitive domains based on Bloom’s taxonomy
“Because Life Is Open Book: An Open Internet Family Medicine Clerkship Exam” | Erlich 2017 [29] | Quantitative | Assess medical students’ information mastery competency
“Medical Student Assessment in the Time of COVID-19” | Prigoff et al. 2021 [30] | Quantitative comparative observational | Examine the impact of changes made in response to the pandemic, including the use of OBEs
OBE = open-book examination; CBE = closed-book examination.
Table 3. Summary of primary categories in open-book examination studies.
Category | Description | Key aspects
Teaching Strategy for Pandemic and Challenging Conditions | Examines the role and effectiveness of OBEs as a teaching and assessment method during the COVID-19 pandemic and similar challenging conditions | Response to digital learning and the pandemic; efficacy of OBEs in remote/online settings; student perspectives on OBEs; academic integrity
Tool of Learning & Educational Impact | Explores the utility of OBEs as an ongoing educational tool within medical education, beyond the context of the pandemic and challenging conditions | Educational transformations; depth and accessibility of resources; test-enhanced learning benefits; comparative efficacy and student performance
Operational Challenges & Future Directions | Discusses the implications of OBEs for exam preparation, duration, and student anxiety, including the balance between comprehensive assessment and time constraints | Impact on exam preparation and study strategies; time efficiency considerations; influence on students’ anxiety levels; discrimination power of OBEs