Development of an assessment questionnaire for on-line teaching competency
DOI: https://doi.org/10.5944/ried.23.2.27084
Keywords: distance education, formative evaluation, teacher assessment, feedback, quality of teaching
Abstract
Evaluating teaching in order to improve the quality of education requires recognizing its complexity and having valid assessment instruments that lead to improvement. The objective of this article is to report the research process through which a formative assessment questionnaire for on-line teaching, based on students' opinions, was developed. The participants were six experts in the design and implementation of on-line courses and a total of 2,821 university students. The method consisted of two stages: 1) development of the instrument, which involved reviewing the theoretical background on the elements of on-line teaching competencies that students can evaluate, building the questionnaire specification matrix, validating the matrix through expert judgment, and running a pilot study; 2) provision of reliability and validity evidence, which included administering the questionnaire, computing descriptive statistics, testing unidimensionality with the Rasch model, estimating score reliability, and conducting exploratory and confirmatory factor analyses. It was shown that the 28 items measure the same construct and that two factors were confirmed: planning of the teaching-learning process, and conduction and evaluation of the teaching-learning experience; with a total variance of 0.01%, an ordinal alpha of 0.998, and fit indices RMSEA = 0.08, SRMR = 0.03, CFI = 0.91. It was concluded that the results corroborate the instrument's theoretical foundation and that it has acceptable psychometric characteristics, so its application is recommended.
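The psychometric workflow summarized above (an internal-consistency estimate such as ordinal alpha plus a two-factor exploratory factor analysis) can be illustrated with a minimal sketch. This is not the authors' code: the data file name, the column layout, and the use of Python's factor_analyzer package are assumptions made for illustration only.

```python
# Hypothetical sketch of the kind of analyses the abstract reports
# (coefficient alpha and a two-factor EFA). NOT the authors' code; the file
# name, column layout, and package choice are assumptions.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo


def alpha_from_corr(corr: np.ndarray) -> float:
    """Standardized coefficient alpha computed from a correlation matrix.
    Applied to a polychoric correlation matrix, this yields ordinal alpha."""
    k = corr.shape[0]
    r_bar = corr[~np.eye(k, dtype=bool)].mean()  # mean inter-item correlation
    return (k * r_bar) / (1 + (k - 1) * r_bar)


# Hypothetical data file: one row per student, one Likert-type column per item.
responses = pd.read_csv("questionnaire_responses.csv")

# Pearson correlations are used here as a stand-in; ordinal alpha proper is
# computed from a polychoric correlation matrix of the ordinal item scores.
corr = responses.corr().to_numpy()
print("coefficient alpha:", round(alpha_from_corr(corr), 3))

# Sampling adequacy, then a two-factor EFA with an oblique rotation,
# mirroring the two correlated factors described in the abstract.
_, kmo_total = calculate_kmo(responses)
print("overall KMO:", round(kmo_total, 3))

efa = FactorAnalyzer(n_factors=2, rotation="oblimin")
efa.fit(responses)
print(pd.DataFrame(efa.loadings_, index=responses.columns,
                   columns=["Factor 1", "Factor 2"]))
print("variance explained (per factor, proportion, cumulative):")
print(efa.get_factor_variance())
```

A confirmatory model with the fit indices cited in the abstract (RMSEA, SRMR, CFI) would follow the same pattern in a structural equation modeling tool; the sketch stops at the exploratory step to avoid assuming the authors' model specification.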

License
Copyright (c) 2020 RIED. Revista Iberoamericana de Educación a Distancia

This work is licensed under a Creative Commons Attribution 4.0 International License.
The articles that are published in this journal are subject to the following terms:
1. The authors grant the exploitation rights of the work accepted for publication to RIED, guarantee the journal the right to be the first to publish the research undertaken, and permit the journal to distribute the published work under the license indicated in point 2.
2. The articles are published in the electronic edition of the journal under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. You can copy and redistribute the material in any medium or format, adapt, remix, transform, and build upon the material for any purpose, even commercially. You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
3. Conditions for self-archiving. Authors are encouraged to disseminate electronically the OnlineFirst version (the version assessed and accepted for publication) of their articles before publication, always with reference to its publication by RIED, favoring its earlier circulation and dissemination and, with it, a possible increase in its citations and reach within the academic community.