Higher education teachers' and students' perceptions of open-book and proctored examinations in the COVID-19 pandemic

Authors

DOI:

https://doi.org/10.5944/educxx1.33514

Keywords:

assessment, higher education, open-book examination, proctored exams, authentic assessment


Abstract

In the early days of the COVID-19 pandemic, higher education was forced to review its assessment processes: competency achievement and academic honesty had to be ensured in online assessments. In the Master's in Educational Technology and Digital Competences at a Spanish university, an open-book examination model was implemented in response to the new situation, following the characteristics of authentic assessment (adapted to students, intellectually challenging, related to practice, coherent with the teaching methodology, and resistant to plagiarism). We set out to analyze the relevance of this change in assessment. The main objective is to analyze the differences between traditional face-to-face exams from before the pandemic and open-book exams with and without proctoring, according to the perceptions of teachers and students. The research is empirical in nature, with a quantitative approach, and is based on the responses of 66 teachers and 301 students to a questionnaire with sufficient validity (χ²/df: 2.453, RMSEA: .069, CFI: .99, TLI: .99) and an omega reliability coefficient of .882. Comparisons were made between model A (traditional face-to-face examination), model B (open-book examination with proctoring), and model C (open-book examination without proctoring). The results show that, for teachers and students, open-book exams with or without proctoring presented no significant differences and are more in line with authentic assessment than face-to-face exams. It is concluded that open-book exams, with or without proctoring, are suitable for authentic online assessment in higher education. It is recommended to contrast these results in other online university courses and to promote authentic assessment in higher education institutions.



Published

2023-01-02

Section

Estudios