Special Issue

Evaluating Learning Transfer from MOOCs to Workplaces: A Case Study from Teacher Education and Launching Innovation in Schools

Evaluando la Transferencia del Aprendizaje de MOOCs al Centro de Trabajo: Un Estudio de Caso en Educación para el Profesorado y Lanzando Innovación en Colegios

Alyssa Napier 1
Massachusetts Institute of Technology, United States
Elizabeth Huttner-Loan 2
IBM, United States
Justin Reich 3
Massachusetts Institute of Technology, United States


RIED. Revista Iberoamericana de Educación a Distancia, vol. 23, núm. 2, 2020

Asociación Iberoamericana de Educación Superior a Distancia

“The texts published in this journal are subject to a Creative Commons Attribution-NonCommercial 4.0 International license. You may copy, distribute, and publicly communicate them for non-commercial purposes, provided that you acknowledge the credits of the work (author, journal name, publishing institutions) in the manner specified in the journal.”

Received: 13 January 2020

Approved: 04 February 2020

How to reference this article: Napier, A., Huttner-Loan, E., & Reich, J. (2020). Evaluating Learning Transfer from MOOCs to Workplaces: A Case Study from Teacher Education and Launching Innovation in Schools. RIED. Revista Iberoamericana de Educación a Distancia, 23(2), pp. 45-64. doi: http://dx.doi.org/10.5944/ried.23.2.26377

Resumen: Después de dos iteraciones del Curso En línea Masivo y Abierto (CEMA) para líderes en colegios, Lanzando Innovación en Colegios, hemos desarrollado y probado una serie de elementos de diseño para transferir el aprendizaje en línea a un contexto presencial. Un aprendizaje efectivo profesional necesita estar embebido dentro de la propia experiencia laboral: los estudiantes deberían emplear las nuevas habilidades y conocimientos adquiridos en su trabajo como parte de la experiencia de aprendizaje. Este MOOC tenía como objetivo invitar a sus participantes a planificar y realizar esfuerzos que significaran un cambio en su práctica docente, y consiguió que un conjunto de los participantes más motivados fuera capaz de realizar esto durante el curso. Una serie de evaluaciones fomentaron que los estudiantes realizaran dichas acciones, conjuntamente con llamadas a la acción por parte de los instructores y ejemplos provistos como parte de los elementos del curso. Nuestros resultados muestran que los participantes lideraron iniciativas de cambio, mantuvieron reuniones con las partes interesadas, recolectaron nuevos datos sobre sus contextos, y compartieron materiales de los cursos de forma colaborativa. La recolección de datos sobre el aprendizaje de los participantes y su comportamiento fuera del entorno de CEMA es esencial para los investigadores y diseñadores que buscan crear entornos de aprendizaje en línea que sean efectivos para el aprendizaje profesional.

Palabras clave: analítica de aprendizaje, CEMAs, aprendizaje en internet, aprendizaje profesional.

Abstract: Over two iterations of a Massive Open Online Course (MOOC) for school leaders, Launching Innovation in Schools, we developed and tested design elements to support the transfer of online learning into offline action. Effective professional learning is job-embedded: learners should employ new skills and knowledge at work as part of their learning experience. This MOOC aimed to get participants to plan and actually launch new change efforts, and a subset of our most engaged participants were able and willing to do so during the course. Required assessments spurred student actions, along with instructor calls to action and the modeling and exemplars provided by course elements. We found that participants led change initiatives, held stakeholder meetings, collected new data about their contexts, and shared and used course materials collaboratively. Collecting data about participant learning and behavior outside the MOOC environment is essential for researchers and designers looking to create effective online environments for professional learning.

Keywords: learning analytics, MOOCs, online learning, professional learning.

Developing massive open online courses (MOOCs) that benefit public service professionals may be among the best ways to leverage MOOCs for societal good. Research about performance and attrition in MOOCs (Ho et al., 2015) suggests that most learners find independent, self-regulated learning difficult; the learners who do succeed in MOOCs typically have already succeeded in the formal educational system, as evidenced by earning a college or graduate degree. Many public service professionals, including teachers and public health professionals, operate in sectors with effective post-secondary professional education but little continuous professional learning. Given that a substantial portion of MOOC learners identify as educators, MOOCs and other online and blended learning opportunities can play an important role in expanding opportunities for teacher learning around the world (Ho et al., 2015; Seaton et al., 2014).

The central challenge of MOOCs for professional development is that most professionals do work that is collaborative, synchronous, and rooted in particular places offline, while MOOCs are independent, asynchronous, and online. For MOOCs to support effective professional learning, instructional designers need to develop, test, and iteratively refine scaffolds within these courses so that people learning alone and online can develop proficiency in skills that are deployed collaboratively and offline. The focus of our current research is the design and evaluation of course elements that help participants bridge learning experiences between MOOCs and authentic professional settings.

In this paper, we report on design research (Sandoval & Bell, 2004) in Launching Innovation in Schools, a MOOC on change leadership for educators. Over two iterations of this MOOC, we developed and tested three pairs of design elements: “Learning Circles” with accompanying Facilitator’s Guide, action-oriented assignments with “Call to Action” videos, and theory-linked activities with “Take-Out Packages.” The primary goal of the course was to have participants work collaboratively in their school environments to launch and refine change initiatives that would benefit their students.

This goal has important consequences for assessing our students' learning and growth. We are not particularly interested in whether participants learned specific declarative knowledge; our courses have no tests or quizzes. Nor are we primarily concerned with how much of our course learners complete or whether they earn certificates. In one memorable exit survey response, a participant explained that their team started an initiative to adopt a later start time at their high school (an evidence-based approach to improving learning outcomes), and that they stopped after the third week to focus on their change initiative rather than on our course. For us, this is an excellent outcome; they used the course as needed and moved on once they had enough support and inspiration to focus on their change initiative. Since our goal is to support the transfer of learning to participants' work practices, in our research we assess how participants have taken action in their local schools and contexts as a result of the learning experiences in our MOOC.

In this paper, we commence by highlighting existing research on effective online learning in the professions and for educators, in particular. We then describe the design elements that we hoped would support participants in transferring learning from the MOOC to their school context. We present our methods for collecting data about what types of actions our participants took in their local context and how their actions were supported or inspired by the design elements. We conclude by discussing how we plan to incorporate our findings into future MOOCs.

BACKGROUND

Research on professional development for educators—and on online learning for professionals more generally—has identified several key elements of effective learning experiences. Effective professional development (PD) is extended over time, relevant to the specific work of educators, and “job-embedded,” which means that new learning can be readily put to use in a participant's current work setting (Darling-Hammond et al., 2007; Hunzicker, 2010). Effective online professional learning “supports active rather than passive participation” (US Office of Educational Technology, 2014), where students learn new principles and practices and then go on to rehearse and enact them. These findings from education cohere with findings from PD in other professions. Milligan and Littlejohn (2014) suggest that, to have an impact on professional learning, MOOCs should be “tightly integrated with work practice.” The best professional learning experiences are designed such that participants begin making changes to their practice in workplace settings during the learning experience itself.

Several recent MOOCs for educators have put these principles into practice. The Creative Computing Online Workshop (CCOW), a large-scale online learning experience for teachers, included constructivist-oriented activities related to using the programming language Scratch, as well as opportunities for discussion and peer feedback in the course's online forums (Brennan, Blum-Smith, & Yurkofsky, 2018). The final design-based project in CCOW invited learners to engage in an iterative and reflective exploration involving their personal practice. The Friday Institute has run a series of MOOCs for educators under the “MOOC-Ed” brand, and these courses closely align the online experience with educator practice through several mechanisms, including support for blended learning, expert panel videos, and course projects with peer feedback. Evaluation of MOOC-Ed courses found that learners valued elements that provided tools, information, and frameworks that were directly applicable to their practice. The top three participant responses for how they applied course content to their professional practice were “1) integrating new tools and strategies, 2) implementing course projects, and 3) using course content for instructional coaching or PD” (Kleiman, Kellogg, & Booth, 2015). In our research, we extend these findings and provide additional evidence about the efficacy of specific practices.

If effective MOOCs for educators and other professionals are tightly aligned with work practice, then it becomes essential for MOOC researchers to study how MOOCs affect workplace practice. In measuring the impact of our course on learners, we align ourselves with other recent efforts in the MOOC literature to collect data about learner experiences beyond the courseware to better understand learning transfer. To evaluate the impact of a functional programming MOOC, Chen, Davis, Hauff, and Houben (2016) examined GitHub data to find evidence of MOOC participants (identified by matching usernames across platforms) deploying programming skills from the MOOC in their projects. To evaluate the impact of a course on learning analytics, Wang, Baker, and Paquette (2017) assessed how MOOC participants joined scholarly societies and submitted papers in the field. National education platforms have started using MOOCs to supplement college STEM instruction (Chirikov et al., in press), and students who complete MOOCs report benefits ranging from earning credit toward a degree to enhanced skills in a current job or finding a new job (Littenberg-Tobias & Reich, 2018; Zhenghao et al., 2015). While our measures are more qualitative and self-reported than these approaches, we join these researchers in evaluating how instructional design features within our courses affect the behavior of learners outside the MOOC.

In developing and researching Launching Innovation in Schools, we sought to create a learning experience where participants take job-embedded actions as part of the course. We developed a set of course design elements intended to support the transfer of learning from our asynchronous, online, independent learning environment to the collaborative, synchronous, in-person working environments of educators. We collected data through assignments, surveys, and self-check questions to better understand the following research questions: 1) What types of actions did MOOC learners report taking within their own schools and settings, and 2) which course design elements seemed to inspire or support learner actions?

RESEARCH DESIGN

In this section, we provide a brief overview of Launching Innovation in Schools. We then describe the design elements intended to scaffold the transfer of learning from the online course to the offline context and introduce our methods for investigating participants' behavior outside the course. All research was reviewed by an institutional review board, and learners on edX consent to research participation when registering for the site.

Overview of Launching Innovation in Schools

Targeted toward K-12 educators, Launching Innovation in Schools is a 7-week, 6-unit course that ran twice through MITx on edX, in January and September of 2017 (we refer to each iteration as a “course run”). The instructors developed a framework for change leadership in schools called the Cycle of Launching Innovation that highlighted four key tasks of leaders: 1) bringing people together around ideas they care about, 2) refining a vision and getting to work, 3) working together through ups and downs, and 4) measuring progress and adjusting. (Course materials are publicly accessible through edx.org.) The course defined school leadership broadly, including anyone in a school system, regardless of formal position, who worked with colleagues to bring about positive changes in student learning. As might be expected, most participants who responded to the entrance survey indicated that they currently or formerly identified as an instructor/teacher: 94% in the January run and 88% in the September run. From our forum interactions, we suspect that the remaining participants were community members, career changers, pre-service teachers, and others interested in our approach to leadership.

Each unit in the course includes four kinds of learning experiences: 1) expert presentations about change leadership, 2) “Voices in Practice” videos that show change leadership in real settings, 3) activities and assignments that get learners doing the work of change leadership, and 4) links to articles and other resources relevant to course themes. Over the arc of the course, the assignments ask learners to define a problem of practice (a challenge in the learning environment that impedes student learning), identify assets and resources in their context, develop an action plan toward addressing their problem of practice and, then, begin executing on that plan and identifying the results. The course was designed to take approximately two hours per week, but 29% of exit survey respondents from the first run of the course reported putting in 6-10 hours of work. Participants needed to report completion of 60% of assignments and activities to earn a certificate. Participants received credit for assignments and activities by filling out Completion Checklists, a self-check system. At the end of each unit, participants selected “yes” or “no” to confirm completion of each assignment and activity. This honor-based system was not verified by course staff. Assignments typically required doing some reflection and writing, posting in the forums and, then, providing peer feedback to other responses. As with many MOOCs, Launching Innovation in Schools had many registrants but far fewer learners who persisted throughout the entire course (Table 1). We expected this attrition among busy professionals, and we intentionally designed our course to provide benefit to those who only participated briefly.
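
As a concrete illustration of the honor-based self-check mechanism described above, the short sketch below shows how Completion Checklist responses could be aggregated against the 60% certificate threshold. It is a minimal, hypothetical example; the item names and functions are ours for illustration, not the course's actual edX grading code.

```python
# Minimal sketch (hypothetical, not the actual edX grading logic) of how
# honor-based Completion Checklist responses map onto the 60% certificate
# threshold described in the text.

CERTIFICATE_THRESHOLD = 0.60  # learners must report completing 60% of items


def completion_rate(checklist: dict) -> float:
    """Fraction of assignments/activities the learner self-reported as done."""
    if not checklist:
        return 0.0
    completed = sum(1 for done in checklist.values() if done)
    return completed / len(checklist)


def earns_certificate(checklist: dict) -> bool:
    """Honor-based: 'yes' answers are taken at face value, not verified by staff."""
    return completion_rate(checklist) >= CERTIFICATE_THRESHOLD


# Example: a learner who self-reports 4 of 6 items complete (about 67%) qualifies.
example = {
    "unit1_problem_of_practice": True,
    "unit1_interview_shadow_student": False,  # optional activity, skipped
    "unit2_asset_map": True,
    "unit3_initial_action_plan": True,
    "unit4_assessment_plan": False,
    "unit6_final_deliverable": True,
}
print(earns_certificate(example))  # True
```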

Design Elements for Supporting Participant Action in Local Contexts

In the development of the course, we paid particular attention to design elements that could help learners take the ideas they were learning in the course and implement them in their own communities during the run of the course. We designed or adapted three pairs of course elements to encourage and support learners in doing so.

Learning Circles and Learning Circle Facilitator's Guide

Before the course launched, we encouraged registrants to invite colleagues in their school or organization to join them in taking the MOOC as part of a Learning Circle. A Learning Circle is a facilitated group of registered learners who meet in person during an online course. By working on the course with colleagues, learners are able to ground course content in their specific context. Learners were not required to join a Learning Circle, but instructors and course elements regularly suggested that the work of change leadership is more effective with colleagues. Our implementation of Learning Circles was based on work by Peer2Peer University (P2PU), an organization that partners with libraries to support library staff in organizing and facilitating MOOC Learning Circles, even when the library staff has no particular domain expertise. Working with colleagues provides a structure for support and accountability, and a Learning Circle provides a natural context for collaborative efforts to improve schools.

Table 1. Overview of learner participation in the Launching Innovation in Schools MOOC
 | January (First Run) | September (Second Run)
Participants enrolled as of the course end date | 7352 | 3419
Participants who logged into the forums at least once | 1680 | 721
Participants who earned certificates | 125 | 79
Participants who submitted a final project | 81 | 65

Based on P2PU’s Facilitator’s Handbook (2016), we created a Facilitator's Guide that provides resources and suggestions for organizing and managing a Learning Circle. The Facilitator’s Guide was designed so that participants with minimal domain knowledge in change leadership would feel comfortable organizing a regular discussion with peers. The guide included sample emails to recruit colleagues, sample meeting agendas for each unit of the course, video discussion questions, and suggestions for making activities and assignments more collaborative.

Action-Oriented Assignments with Calls to Action

We designed course assignments to support learners in engaging with their community, planning steps toward beginning change initiatives, and evaluating the impact of those steps. Learners submit their work to our course discussion forums, where their classmates provide feedback. In Unit 1, learners define a problem of practice, and the assignments for the rest of the course are tools that help learners tackle that problem. Other assignments specifically encourage participants to begin a change leadership initiative: the main assignment that spans Units 3, 4, and 5 is an action plan that asks participants to begin planning an initiative to address their problem of practice. The assignment prompts typically only require participants to write and reflect, not to do anything in their own setting. We were concerned that if assignments strictly required taking local actions, some participants might feel like the course was impossible for them to complete.

Each assignment is accompanied by a Call to Action video, in which the lead instructor suggests ways for participants to go beyond coursework and bring the course into their practice. A Call to Action might ask participants to share their work with others in their context to help them deepen their understanding. Given the voluntary nature of the course, we hypothesized that video appeals from the lead instructor would be more powerful in promoting participant action than a written prompt in the assignment text. The Call to Action video accompanying each unit encourages participants to actually get started with the change initiative, in addition to submitting the forum posts and peer feedback that the assignments require.

Theory-Linked Activities and Take-Out Packages

Throughout the course, participants learn ideas and frameworks about change leadership that can be abstract, such as refining a common vision for improvement or facilitating trusting and candid conversations. Whenever possible, these theories are paired with specific leadership activities that leaders can use as part of their practice. When we discuss the importance of reflecting on collaborative conversation, we engage participants in an activity called the Left-Hand Column Case, which is a specific protocol for debugging tough conversations. Through these activities, we try to smooth the pathway from understanding abstract ideas about leadership towards implementing specific practices based on those ideas.

Most of these activities were originally developed by the instructors for use in face-to-face school meetings or PD workshops, and then adapted for the online context of MOOCs. To help participants go from engaging in online activities to leading those practices in-person, we provide what we call Take-Out Packages, documents with instructions for facilitating and debriefing four of the specific activities from our online course in face-to-face contexts. These Take-Out Packages are provided within the courseware as well as within the Facilitator’s Guide.

For example, when we describe the importance of soliciting stakeholder feedback on a vision, we have participants engage in an activity called Four Corners, in which they describe the strengths, values, and ongoing initiatives within their schools and then connect those ideas to their proposed change initiative. In the Take-Out Package, we provide a facilitator's script for running an entirely face-to-face version of the Four Corners activity with colleagues who are not expected to have taken the MOOC, and then ideas for debriefing the activity. Within the course, videos of expert presentations illustrate theoretical principles; course activities have participants engage with a specific practice that enacts that principle; and Take-Out Packages provide a mechanism for participants to engage people in their local community in that new practice.

Data Collection and Analysis

The ideal way to measure the impact of our course would be to collect data about what kinds of leadership practices participants engaged in before the course, what new practices they engaged in during and after the course, and how course design elements support and inspire participants to take action. This kind of data collection is complex and difficult, but essential for MOOC research to make meaningful contributions to pedagogical and instructional design research. In this study, we took initial steps in collecting data about what actions participants took in their home environments as a result of the course. Most of this data is qualitative and self-reported, so we are cautious about using the data to estimate distributions of activity or to generalize beyond the case of this particular school leadership MOOC. We are primarily interested in mapping the possibility space of participant actions, hypothesizing connections between actions and design elements, and developing more robust strategies of collecting data that can help shed light on these connections.

To address our first research question about the types of actions that participants took in their local contexts, we looked at selected activity and assignment responses and other relevant threads in the discussion forums, replies to the Call to Action open-response questions, and post-course surveys. Across these sources, we looked for descriptions of actions that participants took in their local context. To address our second research question about which design elements inspired and supported these local actions, we used the same sources and looked for descriptions of which course resources participants identified as useful. In addition, we examined the correspondence between the instructions and suggestions embedded in course elements and the actions learners reported taking outside the course. Next, we detail our sources of data and some of their limitations.

Discussion Forums

Forums are a central component of Launching Innovation in Schools: learners submit assignment and activity work, provide peer feedback, respond to video discussion questions and readings, and form groups in the course's discussion forums. Instead of the edX discussion forums, we used an external forum platform, Discourse. Although there are interactive elements on the edX platform, most learner work in the course happens in the forums.

We targeted our data collection toward the forum threads where learners were most likely to report actions taken. Across both courses, we looked at a total of 2769 posts from participants. We examined 25 forum threads associated with video discussion, 15 threads with responses to activities with and without analogous Take-Out Packages, and 4 threads of assignment submissions for the Unit 3 assignment “Initial Action Plan, Part 2: Concrete Steps” and the Unit 6 assignment “Final Deliverable.” This included the responses associated with the Unit 1 activity “Interview/Shadow a Student,” which was the only optional activity that specifically asked learners to take an action beyond the course. We also examined participant submissions associated with two required assignments. We chose to review responses to the Unit 3 assignment (“Initial Action Plan, Part 2: Concrete Steps”) because it required learners to make a plan for taking action in the near future and encouraged learners to then take those steps. We expected learners to report taking actions that the assignment asked them to plan. We also analyzed responses to the Unit 6 assignment (“Final Deliverable”). In the first run of the course, we asked learners to reflect on and revise the assignment work they had done throughout the course. In the second run, we asked learners to create an artifact for sharing the work they had done in the course with others in their context; this was the only time an assignment prompt required taking an action outside the course.
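
Because the course forums live on Discourse, targeted threads like the ones listed above can be pulled programmatically before qualitative coding. The sketch below illustrates that kind of collection step only; it assumes a public Discourse instance with a placeholder base URL and topic IDs, and it is not the pipeline we actually used.

```python
# Illustrative sketch of pulling posts from selected Discourse threads for
# manual coding. BASE_URL and TOPIC_IDS are placeholders, not real endpoints.
import requests

BASE_URL = "https://forum.example.org"   # placeholder Discourse instance
TOPIC_IDS = [101, 102, 103]              # placeholder IDs of targeted threads


def fetch_topic_posts(topic_id: int) -> list:
    """Return post dicts for one topic (first page only; long threads would
    need paging via /t/{id}/posts.json with explicit post_ids)."""
    resp = requests.get(f"{BASE_URL}/t/{topic_id}.json", timeout=30)
    resp.raise_for_status()
    return resp.json()["post_stream"]["posts"]


all_posts = []
for tid in TOPIC_IDS:
    for post in fetch_topic_posts(tid):
        all_posts.append({
            "topic_id": tid,
            "username": post["username"],
            "created_at": post["created_at"],
            "text": post["cooked"],  # HTML body; strip tags before coding
        })

print(f"Collected {len(all_posts)} posts for manual coding")
```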

Surveys

In Unit 6, the last unit of the course, we invited learners to take a post-course survey, which received 226 responses in the first run and 81 responses in the second. We coded responses to the questions, “How will this course or its materials impact you in the future?” and “What were your favorite aspects of this course?” Two months after the first run of the course ended, we sent out another survey that asked participants about the impact the course had on their practice and about their experience using Learning Circles; we received 80 responses. We coded responses to three prompts: 1) “Briefly tell us about the impact, both personal and on your community, from actions taken so far and your aspirations for the future,” 2) “Describe how participation in a Learning Circle shaped your experience in the Launching Innovation in Schools course,” and 3) “Describe how participation in a learning circle shaped the change leadership efforts that you have taken on since the start of the course.”

Call to Action Open-Responses

Learners tracked the activities and assignments that they completed by filling out Completion Checklists at the end of each unit. In each run of the course, we added an ungraded Completion Checklist item, “Did you connect your work in this course with your own practice and take action?”, for which the answers were either “Yes” or “No.” In the first run of the course, we had a corresponding thread in the forums for each unit that asked participants to reflect on how they responded to each unit's Call to Action. This garnered very few responses. Therefore, in the second run of the course, we added the same question as an ungraded, open-response item in the Completion Checklist: “If you would like to tell us more about responding to this unit's Call to Action, please submit your response in the text box below. Any response you write will register as correct, but this will not count toward your graded progress.” This open-response question proved more useful in encouraging learners to report actions taken each week.

Coding Guidelines and Limitations

Out of 3340 total responses from surveys, forum posts, and Call to Action open responses, we identified 257 responses in which participants described some action that they took in their local community. We used an iterative coding process in which we, as authors, coded a subset of items, discussed our findings, and coded subsequent subsets to develop a set of action types (Charmaz, 2006). After an initial round of independent coding, we discussed and compared notes and investigated possible themes to categorize the codes, using a constant comparative approach (Glaser & Strauss, 1967). We then reexamined the data independently and met again to address issues of reliability and consistency. During these follow-up discussions, codes were examined, questioned, debated, and grouped into the themes presented here.
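
Once each response carried an agreed-upon action label, producing category counts like those reported in Table 2 reduces to a simple tally. The sketch below shows one way such a tally could be done; the records and labels are illustrative stand-ins, not our dataset.

```python
# Illustrative tally of coded responses by action type (stand-in data only).
from collections import Counter

# Each coded response records its source and the action-type label the coders
# agreed on; responses describing no local action receive no label.
coded_responses = [
    {"source": "post_course_survey", "action": "sharing course content"},
    {"source": "call_to_action_checklist", "action": "meeting to launch change"},
    {"source": "forum_unit3_assignment", "action": "initiating an experiment in practice"},
    {"source": "forum_unit1_interview", "action": "collecting data"},
    {"source": "post_course_survey", "action": "sharing course content"},
]

counts = Counter(r["action"] for r in coded_responses)
for action, n in counts.most_common():
    print(f"{action}: {n}")
```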

We counted actions when learners wrote in the past or present tense. We did not count when learners shared what they planned to do in the future since we cannot know if these actions took place. For instance, one participant wrote, “Very soon, on the 20th, 21st and 22nd March, we will have some discussions forums with school leaders in order to assess the implementation of a project in Portuguese schools.” We excluded this response.

Moreover, our coding rules required that actions be described as having been started as a result of the course, rather than as ongoing initiatives that predated the course. A few learners described their actions in ways that did not allow us to determine whether or not they resulted from the course. The presence of these types of descriptions in the data reminded us of a common challenge in the evaluation of MOOCs: measuring learners at a baseline. In most MOOCs, the evaluation challenge is figuring out what participants knew before the course in order to identify what they learned in the course; in our context, the challenge is figuring out what work participants were doing before the course, and what new activities or substantial changes were triggered by the course.

This self-reported data has important limits: It is likely that there were learners who took action but did not share what they did. It is also possible that some self-reports overstate what learners actually did, especially in cases where we nominally require some kind of action. Triangulation of findings across multiple sources of data helps bolster the convergent validity of key themes. We view this initial data as providing useful guidance towards better targeted solicitation of self-report data as well as future approaches to observing MOOC learners in their local context.

FINDINGS

What Types of Actions Did MOOC Learners Report Taking Within Their Own Settings?

We identified six types of actions that participants took in their local contexts. In Table 2, we report the types of actions, an example of each action type, and the number of responses coded for each type. We are cautious about drawing inferences from the frequency distributions of each action type. It is very likely that there are learners who took actions that they did not report, that some reports might be exaggerated or even fabricated, and that different researchers might have developed different coding schemes or counts. It is better to think of this table as mapping a possibility space for how learners acted, rather than as an accurate measure of the distribution of their activities. We observe enough differences in the reported frequency of activity to identify four more commonly reported actions: initiating an experiment in practice, sharing course content, meeting to launch change, and collecting data; and two less commonly reported actions: using course content and doing course assignments with others. Our intuition is that the most commonly reported actions actually happened more frequently than the least commonly reported actions. It is possible, however, that these frequencies are sensitive to how we asked participants to report behaviors in ways that bias their responses. Below, we describe the six types of actions.

Initiating an Experiment in Practice

One of the signature goals of the course was to get participants not just to plan change, but to actually get started with new initiatives during the course. Participants reported that they did indeed commence a range of new experiments, including providing scholarships for students to take edX courses and earn verified certificates, improving teacher collaboration across grade levels and subjects, and developing a program to help students communicate further about bullying. Conducting these kinds of interventions is challenging, requiring participants to overcome anxiety associated with change and to find time to develop and implement new ideas. One of the most important findings from our work is that a subset of learners in a MOOC for professionals will indeed begin to implement change in local environments.

Meeting to Launch Change

Change initiatives in schools often require collaboration and coordination, and learners reported that the course inspired them to schedule or host meetings to start change initiatives. Learners met with colleagues, supervisors, heads of schools, superintendents, leadership teams, and students, as well as in professional development meetings and during workshops. In some respects, this represents a less risky, less time-consuming start of a change initiative when compared with actually launching an experiment. As instructors, we often suggested that learners solicit ideas and feedback from colleagues, and we are glad to see learners taking this step. However, we would be concerned if the course inspired many additional meetings that did not subsequently lead to change initiatives. As one learner wrote after a meeting with her university’s administration, “We still have a lot to do, but it is interesting on how actually the ideas exposed on this course works very well on the real world. So, thanks!"

Sharing Course Content

Resources and practices from Launching Innovation in Schools were designed not just to teach people how to engage in change initiatives, but also to support the change process. The resources and practices are meant to be portable, and participants reported sharing course content, bringing videos, readings, coursework, or other resources from the course to others in their offline environment to foster further discussion. One learner explained, “I'm already working with a few teachers who are excited about redefining their teaching and collaborating on ways we can improve student learning. I will have to focus on what I can control, but I have collected all of the course materials, and have already started using them in my teaching, as well as sharing them with my colleagues.”

Collecting Data

In several sections of the course, we encourage students to collect data to provide new insights into their context; for example, all of Unit 4 is devoted to Measuring Progress and Adjusting. The most direct suggestion in the course is made in Unit 1, where we encourage students to interview or shadow a student. The bulk of responses reporting data collection come from the thread corresponding to this activity. One learner describes taking this optional activity a step further: “... for this course, I had to interview some students in my school. I did it with 35 students aged 15-18 who answered (anonymously) all the questions of the interview. I gathered their answers and now I can build my entire strategy on their wishes and desires.” Aside from responses to this specific prompt for this one activity, there were only a few additional responses coded within this category in the data from the rest of the course. These responses described interviewing students, parents, and peers, holding focus groups, and sending out surveys.

Doing Course Assignments with Others

One of the less commonly reported actions was doing course assignments with others, typically within the context of Learning Circles. The Facilitator’s Guide offers a variety of suggestions for how colleagues can productively spend time together. One learner shared the benefits of working with a colleague on the course: “Since there are two of us taking this class, we have had the opportunity for reflection at each step of the way.… We have used each other’s feedback to deepen our understanding of and commitment to working together to help solve our problem of practice. We are planning on using this document next week at the cultural competency training that we planned.”

Using or Facilitating Course Content

This category included learners who reported using course content individually in their practice or facilitating an activity from the course with others. As noted above, participants in the course engage in online versions of activities; for four of these, we provide Take-Out Packages that function as scripts for facilitating the activity in person with a group of colleagues. Facilitating these activities requires significant commitment from participants, including planning the activity, scheduling time with colleagues, and then allocating one or more hours to facilitate the activity and reflect on it. Participants also described using these activities individually in their own practice, such as using the Left-Hand Column Case to work through a difficult conversation.

Table 2. Typology of participant self-reported actions, an example of self-report for each action type, and the number of responses coded in each category
Action Taken | Example Quote | Number of Responses
More commonly reported actions
Initiating an experiment in practice | “Since part of my role is to design curriculum for ed-tech learning events, I was able to use principles from this MOOC to inspire me to design two learning events that incorporate reflection and online learning using Lynda.com & Seesaw.” | 56
Sharing course content | “Even though my whole school is not taking this course, I discussed what I was doing with a couple grade-level teams. I told them what I was hoping to get out of the class and asked them if they would help me.” | 73
Meeting to launch change | "I have been talking with other members of staff and students about the developments that we want to make in DT (Design Technology), which is been received very positively and is building an open dialogue between teachers." | 55
Collecting data | "I began my research and to talk to different key teachers in the Middle School and High School to see their perspective about the project. In fact some have had some initiatives this year on interdisciplinary project integration and were willing to participate. All of them requested more time to plan with peers." | 49
Less commonly reported actions
Using course content | “I have used the Left-Hand Column to unpack a conversation with a student who had been sent to me for disciplinary reasons. This tool helped to sort out the issues.” | 13
Doing course assignments with others | “Our team created this visual to help identify and explain our goals for our SHIFT innovation teacher team.” | 11

The six action types give some sense of the range of actions that participants took in their local community. They provide evidence that the learning in Launching Innovation in Schools, at least for a subset of the most engaged participants, was indeed embedded within participants’ professional practice. From this, we have evidence that online professional learning can inspire and support learners in taking action in their local contexts, even when those actions are time-consuming and challenging. In the next section, we attempt to characterize what parts of the course supported these actions.

Which Design Elements Inspired and Supported Learner Actions?

Our first strategy for identifying which design elements supported learner actions was to analyze survey responses, forum posts, and replies to Call to Action open-response questions for learner-reported evidence of course use. Our expectation was that, when participants wrote about what actions they took, they would naturally make connections back to design elements. Unfortunately, this happened very infrequently. Only in 37 responses did participants self-report both an action they took in their local context and the resources that supported them. These data were too sparse to support any defensible hypotheses, and future data gathering efforts will need to elicit this information more directly.

We, then, looked for similarities and connections between the suggested actions embedded in assignments and Call to Action videos, and the actions participants reported taking. In Table 3, we list each Call to Action video, the actions suggested by the Call to Action video (using the classifications developed in the section above), and then the actions most reported in the related Call to Action open-response question. Throughout the videos, we suggested four strategies: sharing course content, meeting to launch change, initiating an experiment in practice, and collecting data. The responses suggest that, generally speaking, when participants self-report the kinds of actions taken, they are aligned with the suggestions in the Call to Action videos. We observe that data collection appears to be one area where our suggestions were weakly taken up. We are unsure if these calls were less clear or compelling, if we did not provide enough support for collecting data, or if this type of activity is more onerous for teachers and leaders.

To summarize, our course appears to have three kinds of mechanisms to encourage participants to take action in their local context. It seems obvious in retrospect, but the most powerful approach appears to be including some type of action in the text of a required assignment. In the design of the course, we were reluctant to do this, since we did not want to make the requirements of the course so onerous as to make it impractical for participants, but it appears to be the most effective. Moving questions within an “accountability framework,” even when they are not strictly required, can improve responses. We had a low response rate to a discussion forum prompt that asked people to report on how they responded to our Call to Action videos. However, when we gave a similar optional prompt in the Completion Checklists for each unit, we received substantially more responses with more detail. Suggestions that are embedded directly in assignments and accountability mechanisms produced the most activity and the best descriptions of that activity.

The next most powerful mechanism appears to be direct appeals or suggestions from the instructor, in our case, through Call to Action videos. We hypothesized that these direct video appeals would be the most compelling way to inspire participant actions, especially given the weak accountability mechanisms in the course, but our estimation is that they are less effective than simply including tasks in the assignment. Finally, participants can be inspired and supported to take action by tools, models, and exemplars throughout the course. While these can support participants in making high-commitment actions like facilitating complex exercises, they appear to inspire fewer actions than the more direct appeals.

Table 3. Call to Action video titles, actions suggested by Call to Action videos, and most common actions reported in Call to Action open-response questions
Call to Action Video | Actions Suggested by Call to Action Video | Most Common Actions Reported in Call to Action Open-Response Question
Assignment 1.1 Powerful Learning Environment | Sharing course content | Sharing course content; Meeting to launch change
Assignment 1.2 Problem of Practice | Collecting data, meeting to launch change |
Assignment 2 Asset Map | Meeting to launch change | Meeting to launch change; Sharing course content; Initiating an experiment in practice
Initial Action Plan (IAP) Part 1: What Does Awesome Look Like | Meeting to launch change, sharing course content | Meeting to launch change; Sharing course content
IAP Part 2: Concrete Steps for the Action Plan | Meeting to launch change, initiating an experiment in practice | Meeting to launch change
Initial Action Plan, Continued | Using course content | Using course content; Meeting to launch change
IAP Part 3: Assessment Plan | Collecting data; Initiating an experiment in practice | Initiating an experiment in practice; Meeting to launch change; Collecting data
Final Assignment | Sharing course content | Sharing course content; Meeting to launch change; Initiating an experiment in practice

The strongest correlations between course design and self-reported actions involved the one activity and the one assignment where we directly indicated that participants should take some kind of action. As noted above, the optional Unit 1 activity “Interview/Shadow a Student” asked learners to engage in a specific data collection activity. However, this activity was optional and not for credit, so it was surprising that a number of learners chose to do it. This activity is one of the first in the course, so participation was less likely to be affected by high attrition rates than assignments and activities later on. In the second run of the course, we rewrote the final assignment in Unit 6 to require students to share their work with a colleague for feedback and discussion. Of the 65 learners who submitted final assignments in the forums, 38 (58%) reported sharing a draft of their artifact with a colleague. This is the only point in the course where taking action is required (but not verified) for credit, and it was among the most successful at encouraging participants to take a specific action.

DISCUSSION AND FUTURE WORK

Effective professional learning in education and other professions is “job-embedded”; it allows participants to deploy new ideas and practices in their workplaces. For professional online learning to be effective, it needs to support this meaningful transfer from courseware into the workplace. If MOOC researchers hope to improve the efficacy of online professional learning opportunities, then it will be essential to study not just what happens in MOOCs, but how MOOCs support professionals in improving their practice. Previous studies have identified that MOOC learners apply new credentials in job searches or university applications (Littenberg-Tobias & Reich, 2018; Zhenghao et al., 2015), apply new programming skills in open source projects (Chen, Davis, Hauff, & Houben, 2016), and apply new analytics skills in scholarly pursuits (Wang, Baker, & Paquette, 2017). We add to this growing literature about real-world transfer of learning from MOOCs by providing preliminary evidence about how educators apply learning from MOOCs in their local school settings.

Through analysis of learner responses, we found evidence that a subset of the most engaged participants in Launching Innovation in Schools took a variety of actions in their local contexts. The course focused on getting participants not just to plan, but to actually commence new change initiatives, and our most important finding is that some participants are indeed able to do so during the course. We also found that participants shared course materials, initiated planning meetings, collected data, facilitated course activities, and did coursework with others.

Our hypotheses about which design elements supported these new actions are more speculative. The evidence suggests that the most effective way to get learners to engage in real-world actions is to embed those suggestions directly into activity and assignment prompts and accountability mechanisms, like self-check questions. Even when the accountability mechanisms are, in fact, quite weak, these seem to inspire learner actions. Suggestions from instructors, embedded in videos or other design elements, also appear to inspire learners to take action. Finally, design elements that model or support practices directly can also encourage participants to engage in those practices in their own contexts.

As course instructors, we find sufficient evidence of the efficacy of our design elements (Learning Circles with a Facilitator's Guide, action-oriented assignments with Call to Action videos, and theory-linked activities with Take-Out Packages) that we intend to continue using these elements in subsequent courses. We plan to shift more of our calls to action directly into coursework prompts and accountability mechanisms to encourage more activity, while watching carefully to see whether these higher expectations increase attrition. We also plan to continue finding ways to make our course materials more shareable by using Creative Commons licenses and web-accessible readings and resources, and by ensuring that our videos are easily accessible online.

We are planning several strategies to improve our data collection practices in subsequent courses. From these two runs of Launching Innovation in Schools, we gathered important insights into how our courses influence participants' professional practices, but we still have low response rates to all of our prompts. First, we are considering being more explicit with participants about our research goals by announcing early in the course that reporting changes in work practice is an important element of the course. We have some concerns about social desirability biasing participant responses, but those concerns are balanced by our desire to help participants better understand our objective. We also plan to be more direct in asking participants to connect their new practices with the design elements that were most useful in helping scaffold changes in practice. We hope to improve survey response rates through reminders, and we are interested in experimenting with soliciting data through social media channels such as Facebook and Twitter. It is hard to predict which data collection efforts will elicit participant responses, which underscores that multiple, varied methods, deployed throughout the run of a course, are necessary. We also need better baseline data about participant practices, so we can better understand what changes during and after the course. We plan to include additional items in pre-course surveys and early assignments to collect more baseline data about participant practices. We expect that these suggestions may be useful to other MOOC researchers and course designers studying professional learning.

Future research will also need to go beyond self-report data of changes in practice to methods that allow for the direct observation of change. This data collection will be very challenging given the wide geographic distribution and diversity of our learners. It may be possible to partner with school districts that encourage employees to take MOOCs, so that researchers can make observations of practice before and after the course, or to use data such as annual reviews that can shed light on how the course might affect participant behavior. Only by deeply understanding how MOOCs change learner behavior in the real world will MOOC researchers be able to provide faculty and instructional designers with guidance for designing effective environments for online professional learning.

REFERENCES

Brennan, K., Blum-Smith, S., & Yurkofsky, M. M. (2018). From Checklists to Heuristics: Designing MOOCs to Support Teacher Learning. Teachers College Record, 120(9).

Chen, G., Davis, D., Hauff, C., & Houben, G. J. (2016, April). Learning transfer: Does it take place in MOOCs? An investigation into the uptake of functional programming in practice. In Proceedings of the Third (2016) ACM Conference on Learning @ Scale (pp. 409-418). ACM.

Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. Sage.

Chirikov, I., Semenova, T., Maloshonok, N., Bettinger, E., & Kizilcec, R. F. (in press). Online Education Platforms Scale College STEM Instruction with Equivalent Learning Outcomes at Lower Cost. Science Advances.

Darling-Hammond, L., LaPointe, M., Meyerson, D., Orr, M. T., & Cohen, C. (2007). Preparing School Leaders for a Changing World: Lessons from Exemplary Leadership Development Programs. School Leadership Study. Final Report. Stanford Educational Leadership Institute.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory. Chicago, IL: Aldine.

Ho, A., Chuang, I., Reich, J., Coleman, C., Whitehill, J., Northcutt, C., ... & Petersen, R. (2015). HarvardX and MITx: Two years of open online courses fall 2012-summer 2014. Available at SSRN 2586847.

Hunzicker, J. (2010). Characteristics of Effective Professional Development: A Checklist. Retrieved from https://files.eric.ed.gov/fulltext/ED510366.pdf

Kleiman, G., Kellogg, S., & Booth, S. (2015). MOOC-Ed evaluation final report. Retrieved January 16, 2018, from https://fi-courses.s3.amazonaws.com/place/research-reports/hewlett-evaluation-final.pdf

Littenberg-Tobias, J., & Reich, J. (2018). Evaluating Access, Quality, and Inverted Admissions in MOOC-Based Blended Degree Pathways: A Study of the MIT Supply Chain Management MicroMasters. Retrieved from SocArXiv: https://osf.io/preprints/socarxiv/8nbsz

Milligan, C., & Littlejohn, A. (2014). Supporting professional learning in a massive open online course. The International Review of Research in Open and Distributed Learning, 15(5).

P2PU. (2016). Learning Circles Facilitator Handbook. Retrieved from https://www.p2pu.org/assets/uploads/learning_circle_downloads/facilitator_handbook.pdf

Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199-201.

Seaton, D., Coleman, C., Daries, J., & Chuang, I. (2014). Teacher Enrollment in MITx MOOCs: Are We Educating Educators? Available at SSRN 2515385.

US Office of Educational Technology. (2014). Online Professional Learning Quality Checklist. Retrieved from https://tech.ed.gov/wp-content/uploads/2014/11/Section-5-Online-Professional-Learning-Quality-Checklist-FINAL.pdf

Wang, Y., Baker, R., & Paquette, L. (2017, January). Behavioral predictors of MOOC post-course development. In Proceedings of the Workshop on Integrated Learning Analytics of MOOC Post-Course Development.

Zhenghao, C., Alcorn, B., Christensen, G., Eriksson, N., Koller, D., & Emanuel, E. (2015). Who’s benefiting from MOOCs, and why. Harvard Business Review, 25, 2-8.

Author Notes

1 Alyssa Napier is a doctoral candidate at the Harvard Graduate School of Education.
E-mail: amnapier@mit.edu
2 Elizabeth Huttner-Loan is an instructional designer at IBM.
E-mail: huttner@mit.edu
3 Justin Reich is an Assistant Professor of Comparative Media Studies at MIT, and the director of the MIT Teaching Systems Lab, which aspires to design, implement, and research the future of teacher learning.
E-mail: jreich@mit.edu
