Specialist area 2. MOOC design

My current role has allowed me to explore MOOC design as a new specialist area.

Within two years I have designed, with subject experts, 15 new open online courses. I manage the delivery of up to 9 concurrent courses, working with both internal staff mentors and external consultants for authoring and mentoring. Through this process I am exploring the nature of MOOC pedagogy, shifting my practice on a course-by-course basis in response to changes to the MOOC platform, the marketing constraints of GDPR, and identification of what works best for our particular audience of practising teachers in schools and colleges. The impact of my work is evidenced through learner retention and end-of-course feedback. Because I work outside an academic institution, in a role that is both operational and strategic, my dissemination activities are limited to selected conferences, research seminars and collaborations with other professional development MOOC providers. A national or international impact on other learning technologists is therefore not within the scope of my role; the main impact is instead on enabling thousands of learners to change their professional practice.

Making activity-based learning understandable

To support the course development team's awareness of online learning design, I needed an accessible model of activity-based learning.

Content and resources, educators, other learners and the learner themselves each have a two-way relationship with activity, and this activity leads towards a specific learning outcome.

Drawing upon Anderson (2003), who explored ‘meaningful interactions’ between tutors, students and content in distance learning, and on activity theory, which involves an understanding of the rules, roles, behaviours and actions that lead to learning outcomes (Engeström, 2001), I presented the way that actors and content interact through activity to achieve specific learning outcomes. Without an understanding of learning activity, and of the roles played by educators, other learners and the learner's own contributions, online courses risk being designed as little more than non-interactive textbooks. The concept of an online activity helped my colleagues break their course designs down into manageable chunks, whilst ensuring the professional development (learning) outcomes were being met. This equipped them with a basic grasp of online learning for creating courses with the ABC/Viewpoints learning design workshops to which FutureLearn aligns its design process.

  • Introduction to MOOC design internal presentation (starts off course development process) [Redacted for public portfolio]

Comments and learning contributions

Within FutureLearn, commenting is available on every course page, and on some courses this can lead to generic statements which do not reflect the course content (typically statements such as “great info” or a repetition of the key point of the page). I have attempted to address this issue by including discussion prompts where discussion should occur and removing unfocused questions. Anecdotally, mentors have said that learner contributions tend to be richer and more detailed as a result of this change. This is partly due to better targeting of learners, so that we have more teachers than non-teachers on courses, and partly due to clearer direction on what is to be achieved through discussion tasks.

A simple addition I made to courses was to place a verb (such as ‘comment’, ‘discuss’ or ‘create’), or simply ‘task’, as a heading above each activity or discussion prompt. This separated the activity instruction clearly from the course content and prompted action from the learner. I also explored the academic literature on learner comments; I found Laurillard’s (2012) “types of intervention… question… explanation… conjecture… comment… critique” particularly useful and have now included them in the opening step of all courses.
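
For example, a discussion step might end with a prompt formatted like this (an illustrative prompt, not taken from a live course):

  Discuss
  How do you currently check understanding at the end of a lesson? Share one technique that works well in your classroom and respond to a colleague’s approach.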

Learner support goes hand in hand with support for course facilitators, and I have been actively supporting colleagues to improve how they respond to and stimulate discussion online. Every facilitator receives a summary guide when they join and, throughout the course, facilitators log where discussions are going well and where help is needed.

  • MOOC facilitation guide (internal document) [Redacted for public portfolio]

Beyond comments

Formerly, when I worked within the HE context, it was possible to design learning activities that required complex use of online tools, for example supporting undergraduates preparing for seminars with a wiki task. Such tasks are not feasible within the MOOCs I develop now, as the step-by-step structure favours shorter tasks that build up to a larger outcome; layered activities requiring multiple steps and technical processes are unlikely to be completed by learners. The physical space on a single step also limits the amount of technical guidance that can be provided, so it is important to reduce the requirements of an activity to what is essential for a learning objective. One example is the use of Padlet for learner contributions. In rewriting a task for an existing course, I applied my approach of separating the task from the technical detail, reduced the optional aspects and simplified the language. The task became solely about posting to Padlet and no longer required learners to juggle between Padlet and the step comments.

Reflection

In my opening statement presenting work on MOOC engagement and learner needs at #OER18, I said that designing for MOOCs caused me to rethink what I thought I knew about distance learning. In particular, I have changed my ideas about induction processes for online learning. MOOCs do not afford incremental induction models; instead they have to cater for learners who may be looking for just-in-time or low-commitment learning (DeBoer et al., 2014). Professional learners, who are likely graduates or experienced in self-directed learning, are also less likely to be motivated by certificates and more likely to be motivated by addressing self-identified learning gaps (Watted and Barak, 2018). The idea of building a coherent community of learners is disrupted by the way learners may join sporadically, sometimes months apart. This introduces challenges in the design of both discussion-based and collaborative activities.

I remain cautious that expectations of a social learning experience may not be met, so I continue to explore more appropriate models for balancing the needs of different groups of learners in an open course. For upcoming course runs I have sought the advice of external consultants on how the first steps of a course could remove induction material altogether and instead embed administrative tasks throughout the first week.

I would also like to highlight my blog post on videos to support learning in MOOCs, in which I look at the literature around video learning and evaluate examples from open courses.

Impact

The learning design approaches I have adopted and adapted for MOOC learners (as outlined in my #OER18 paper and FLAN presentation linked above) include:

  • Shorter courses for professional development audiences targeting specific learning needs aligned to subject discipline.
  • Short sequences of activities that allow achievable progression through a course.
  • Scaffolded learner-learner and learner-educator interaction that is meaningful and adds value.
  • Opportunities for cohort-specific educator input to build relationships and give latecomers a means to catch up.
  • Additional email contact to provide prompts for key learning activities.

As the FLAN presentation slides show [Link redacted for public portfolio], our course retention is, on average, above 20% (notably higher than the 12.6% average completion cited by Jordan, 2015). More significant, though, is my finding that retention of learners who enrol in advance is almost double that of those who enrol after a course has started. Similarly, commenting (social learning) is more likely among those who enrol before the course start date. As a result, I have developed strategies to encourage planned participation in courses through an annual CPD schedule, identifying group CPD needs and developing pathways through the courses based on career needs.
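
To illustrate how this comparison can be made, the following is a minimal sketch in Python/pandas. It assumes a hypothetical enrolment export with enrolled_at, course_start and fully_participated columns; the file and column names are illustrative, not FutureLearn’s actual export schema:

```python
import pandas as pd

# Hypothetical export: one row per learner, with enrolment timestamp,
# course start date, and whether they fully participated (retention measure)
learners = pd.read_csv("enrolments.csv", parse_dates=["enrolled_at", "course_start"])

# Flag learners who enrolled before the course start date
learners["enrolled_in_advance"] = learners["enrolled_at"] < learners["course_start"]

# Retention rate per group: the proportion who fully participated
retention = learners.groupby("enrolled_in_advance")["fully_participated"].mean()
print(retention)
```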

Measuring impact is possible using pre-course and post-course surveys. Whilst the pool of participants with matching data is small, aggregating the data shows clear trends. In the examples of impact below, practice and understanding have both developed. These reflect the course content and design (thus the impact is a product of collaboration between the educators and myself).
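
As a sketch of this matching and aggregation (assuming hypothetical survey exports that share a learner_id column, with illustrative metric names rather than the real survey fields):

```python
import pandas as pd

pre = pd.read_csv("pre_course_survey.csv")
post = pd.read_csv("post_course_survey.csv")

# Only learners who completed both surveys can be matched
matched = pre.merge(post, on="learner_id", suffixes=("_pre", "_post"))

# Compare mean scores per classroom-practice metric before and after the course
for metric in ["metric_1", "metric_2", "metric_3", "metric_4"]:
    change = matched[f"{metric}_post"].mean() - matched[f"{metric}_pre"].mean()
    print(f"{metric}: mean change {change:+.2f}")
```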

Two charts compare pre-course and post-course data for four metrics representing classroom practice; the post-course data shows a notable increase across all four metrics.

Also of note is the positive feedback on the relevance of courses and organisation, which is a combination of the platform, marketing approach and design.

This data shows the end-of-course feedback for a primary teaching course: over 95% say their understanding has improved, over 80% changed their practice, and over 90% agree the course was well organised, planned and relevant to their needs. The end-of-course feedback for three science teaching courses shows the same pattern: over 95% report improved understanding, over 80% changed practice, and over 90% agree the courses were well organised, planned and relevant to their needs.

My data analysis has also justified high-cost interventions, such as mentoring and the Q&A recordings with expert insight. These have had particular impact on learners as the following learner feedback shows:

“I was impressed that the Mentors and Educators managed to reply to so many comments. This is the first time I have engaged in a course of this type and the responses really increased my confidence and encouraged me to extend my thinking and engage with learning beyond the basic course hours.” – Science of Learning

“I have completed several FutureLearn courses. This was the best by far: it packed lots in but was succinct; it exemplified the EBC model; Paul is an inspirational speaker; the mentors provided valuable support; the resources and links were brilliant for my ongoing practice and I really enjoyed and learnt from the student forum.” – Science of Learning

“I really enjoyed the way the course was structured, with a mix of videos showing good practice and input from the educators… questions and tasks really helped with understanding and reflecting.” – Planning for Learning

“[Mentors], the Q&A videos and comments from other teachers from around the world all helped.” – Managing Behaviour

“Love the structure – bite-sized pieces, and very useful discussion from participants. I’m new at teaching biology, feeling much more confident about planning.” – Teaching Practical Science: Biology

Across all our online courses we adopt self-assessed impact measures. This follows a model suggested by Guskey (2002), which underpins our face-to-face programme too, aligning teacher outcomes to learning outcomes. The model reflects the impact on individuals (the participant learner), their students (outcomes or behaviours) and their colleagues (as a measure of disseminating good practice).

Example course (number of respondents per self-assessed impact rating):

                                       Zero   Low   Medium   High
  Impact on self                          0     1       15     28
  Impact on students                      1    11       25      7
  Impact on colleagues/organisation       5    16       21      2
  Impact overall                          1     3       32      8

The table above shows the self-reported impact from a 5-week course on the Science of Learning. Notable is the significant impact on our learners’ practice, but also a reasonable impact on students as a result of changes participants have made to their teaching. It is encouraging that even after only 5 weeks, changes are made which have a perceptible impact. This is largely down to the way the course incorporates activity, and importantly real classroom tasks, to link concepts from the course back into practice while the course is running. Further impact on colleagues could be achieved by requiring more collaborative, offline tasks for participants to undertake. As I build resources and activities for blended CPD, I anticipate greater opportunities for participants to bridge the online and offline spaces, involving more of their colleagues in their learning.
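
A minimal sketch of how such counts can be summarised, for example as the share of respondents reporting Medium or High impact per Guskey dimension (the figures are taken directly from the table above):

```python
counts = {
    "Impact on self": {"Zero": 0, "Low": 1, "Medium": 15, "High": 28},
    "Impact on students": {"Zero": 1, "Low": 11, "Medium": 25, "High": 7},
    "Impact on colleagues/organisation": {"Zero": 5, "Low": 16, "Medium": 21, "High": 2},
    "Impact overall": {"Zero": 1, "Low": 3, "Medium": 32, "High": 8},
}

for dimension, tally in counts.items():
    total = sum(tally.values())          # 44 respondents per dimension
    med_high = tally["Medium"] + tally["High"]
    print(f"{dimension}: {med_high / total:.0%} rate impact Medium or High")
```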

Ultimately, though, the impact measures of the online courses I have designed (by design I refer to the appropriate selection of a sequence of learning activities, visual appearance and combination of media, which must be done in collaboration with subject experts based on discussion of CPD and discipline needs) show that the courses are effective for those who do complete.

[Next steps referencing organisational objectives redacted for public portfolio]

References

Anderson, T. (2003). ‘Getting the Mix Right Again: An updated and theoretical rationale for interaction’, International Review of Research in Open and Distance Learning, 4(2).

DeBoer, J., Ho, A.D., Stump, G.S. and Breslow, L. (2014). ‘Reconceptualizing Educational Variables for Massive Open Online Courses’, Educational Researcher, 43(2), 74-84.

Engeström, Y. (2001). ‘Expansive Learning at Work: toward an activity theoretical reconceptualization’, Journal of Education and Work, 14(1), 133-156.

Guskey, T.R. (2002). ‘Professional development and teacher change’, Teachers and Teaching: theory and practice, 8(3/4).

Jordan, K. (2015). ‘Massive open online course completion rates revisited: assessment, length and attrition’, International Review of Research in Open and Distributed Learning, 16(3), 341-358.

Laurillard, D. (2012). Teaching as a Design Science. Abingdon: Routledge.

Watted, A. and Barak, M. (2018) ‘Motivating factors of MOOC completers: Comparing between university affiliated students and general participants’, The Internet and Higher Education, 37, 11-20.

Next: Advanced Area: Lecture capture.