Success measures in technology-enhanced learning

What does success look like for technology-enhanced learning? This was the opening substantive question from the #LTHEchat on 3 May 2017, and as an unofficial follow-up I’m cribbing here from the Twitter community and expanding on some of the ideas that were discussed.

TEL or just L?

Should success in technology-enhanced learning (TEL) be any different from success in learning generally? A distinction is still drawn between learning and teaching practice with or without technology; whether that is for good or ill is often debated, but the distinction is subtle. A clear dichotomy doesn’t exist: technology use is better described as ‘fuzzy’, particularly as we shouldn’t think only of designed-in, educator-led uses of learning technology. Students make their own choices of technology to support their ways of learning too, appropriating not just institutional platforms but social networks, apps and personal devices. Within organisations, the with/without technology distinction persists because of varying levels of digital capability, the mystique around innovation and excellence in teaching, and a desire to know ‘what really works’ with learning technology before giving it a go. This is why we still have TEL teams, learning technologists and digital champions, all aiming to support, evaluate and embed TEL practice while responding to new technologies, new pedagogies and changes in students’ learning contexts.

Success in technology-enhanced learning is therefore measured and judged by those whose experience is of non-technology-enhanced learning, or at worst of technology-diminished learning. That leads to comparisons with approaches to learning and teaching that have gone before, and leaves little room for the potential of change, innovation, risk and reward.

To find a way to better convey the value of technology-enhanced learning approaches, the question posed by Sue Watling and Patrick Lynch offered some answers. I’ve grouped responses from the #LTHEchat community into four broad categories.

  • Technology not interfering
  • Sustainability
  • Time and effort
  • Ways of learning

The first three are largely technology-specific and undeniably relevant and appropriate measures of the use of learning technology. The fourth category of responses emphasises the pedagogical opportunities of technology-enhanced learning. I would argue that you need to consider all four success measures, as without the first three you do not have a stable learning environment and without considering ways of learning, the use of technology has no value or meaning. The success measures are what we (both learning technologists and educators) should be striving for in aiming to create effective learning experiences. I round off by picking two Tweets that challenge commonly held ideas of success: disruption and failure.

Technology not interfering

“Success is when the tech becomes invisible and the learning is the prime objective.” – @profpeterbrad

Originally, I was going to suggest this as ‘transparent technology’, but that conjured images of invisible barriers that hit you when you least expect it. Whilst this experience is arguably true of many technologies (VLEs tend to fall contentedly into this bracket), transparency would not be the most appropriate analogy for success. What is aimed for is technology that facilitates a particular engagement or learning experience but a) is intuitive, b) is reliable and c) isn’t shouting from the rooftops that it is there. Would ‘unconscious use of technology’ be more appropriate?

“Students learning without noticing what the technology is doing” – @santanuvasant

Part of this success measure comes down to choosing the ‘right’ technology for a particular learning experience. There are obvious limitations due to institutional provision and student acceptance of a particular tool, but deliberately choosing a tool that requires additional login details or new accounts, is incompatible with student devices or, at a superficial level, is just plain ugly doesn’t create a conducive learning space. Consider the choice of tool in the same way as a choice of room: in a lecture theatre there are barriers to small-group working or practical tasks, and you wouldn’t hold a seminar discussion on a complex or emotive issue in a public, noisy place. Ensuring the technology is ‘transparent’ means cognitive effort goes into the learning task rather than into how to use the tool. Except, of course, if your learning outcome is that the student becomes proficient in the use of a particular tool. Even then, I would use open or professional tools students could use in their future careers, rather than those requiring special licences or institutional infrastructure to access.


Sustainability

One of the challenges of using open, online tools is sustainability. Innovation in technology-enhanced learning often comes through playing with the latest tool (innovation in the technology, rather than in the learning and teaching, perhaps). The risk is clear: online tools are frequently in ‘perpetual beta’, which means they are subject to interface changes, or could be made unavailable or have paywalls attached at any time. This risk is amplified when the intended learning experience depends on a specific tool, rather than being technology agnostic.

“Another measure of success is how long a technology endures for without becoming stagnant: can you keep doing more with it” – @kjhaxton

The constant churn of online tools, and the rapid pace at which learning technology continues to evolve, is perhaps the biggest argument for the presence of TEL teams. TEL teams should have the time and space to keep a watchful eye on the new learning experiences that new tools offer; however, their focus should not be on the technology and on supporting ‘button-clicking’ training. A far more sustainable approach to using technology to support learning and teaching is to develop organisational capacity in digital capabilities and digital pedagogies. TEL teams play a role in showing what is possible, usually at that moment in time, while enabling educators to think outside the box and remain conscious of the risks involved. Digital capability, in the sense of an individual’s ability to make their own decision about the appropriateness of a technology, and their adaptability should that technology change, really needs to be at the fore of any educator’s development.

“Sustainable adoption beyond early enthusiasts, and delivering measurable improvement” – @johncoup

Time and effort

Developing digital capability as a teaching professional does take time and effort. It is a skill set and an approach to working that should be built before the time and effort involved in using learning technologies is even explored.

If you are designing an online or deeply integrated blended learning course, you quickly become aware of the large amount of time it takes to plan, design, author, quality assure, deliver and evaluate a single course. The reason is that, particularly for online distance learning, the educator is not present at the same time as the learner. There is no immediacy in interactions: the learner cannot clarify instructions, and the educator cannot monitor engagement or progress synchronously as the student learns. Interactions and engagement with peers, content and educator therefore have to be carefully designed through activity and instruction. The same principles apply to blended activity design in technology-enhanced modules or programmes. Online tasks must be clear and achievable for learning independent of lecturer presence, and that does require time and effort to understand and implement. Failing to take the time to develop TEL practice is likely to lead to more time spent unpicking problems, dealing with email queries and a rather unenjoyable experience for all.

“My measure of success is … I like the tech, it’s usable, it saves me some time and it’s useful for my students. I think.” – @cpjobling

However, for some, the use of technology does offer time and effort savings; there is the potential for it to be both effective and efficient. Where technology is used to support ‘administrative’ tasks, such as collecting in assignments and making feedback available to large cohorts, there can be a time-saving opportunity. Online quizzes are a good example: while they require time and effort to write effective questions, the automated feedback can support students by allowing them to judge their knowledge gaps independently, or provide the educator with a measure of the group’s understanding of concepts. If the use of learning technologies can also support deeper learning in more efficient ways, perhaps through application to case studies or problem-solving tasks, surely that too is a valid success measure.

“I always feel nervous about putting ‘saving me time’ as a goal of new tech so glad someone else does :)” – @kjhaxton

I would argue, though, that for a particular approach to be successful it sometimes requires more time and effort, and that is not something that should be seen as a negative. The nature of the pedagogical approach will also influence time spent; discussion boards, for example, take time to read and to construct responses to. Whilst more time is required, discussion board tasks bring learning benefits: they allow students time and space to construct an answer, perhaps offering less confident students the opportunity to contribute to the group where they may not have done face-to-face.

It is also worth considering where the time is spent: in preparation or in delivery. Again, there is a comparison with non-TEL teaching approaches; spending time preparing for and delivering a seminar may carry different loads to spending time setting up and facilitating a discussion board. If technology-enhanced learning demands that the educator takes time out to really get to grips with what it can offer, maybe organisations should start to see that as a positive, developmental activity.

Ways of learning

“Success is the tech helping staff deliver something they couldn’t do otherwise. & hopefully improving student learning!” – @sjhalpin

I’ve already touched upon how I view success in technology-enhanced learning as being about the learning design and intended learning experience. This view positions technology as an enabler, rather than a string of problems. Success is therefore determined by student outcomes, however they may be measured. I would also argue that softer outcomes, such as student satisfaction or feedback on the teaching approach, are suitable measures of how successful a TEL intervention is. By this, I am not saying that all student feedback should be positive. Statements showing that the learning process was challenging (but hopefully ultimately rewarding), and feedback that focuses on the learning rather than the technology, are signs of success.

“My measure of success is student engagement, satisfaction and deep learning using tech” – @Content_Edu

The responses from the Twitter chat in this area provide good examples of how success in technology-enhanced learning is about the learning, offering deeper learning and engagement, conversation, collaboration and communities of learners.

“I use to build communities share research resources and connect with fashion students internationally” – @fashionnatascha

“My measure of success is how much conversation/comment/interaction is generated” – @rbrtjnkns

Disruption and failure

As suggested above, technological barriers, time and effort, and sustainability all need to be considered and addressed to enable success at the pedagogic level. Yet two comments from the #LTHEchat stood out as measures of success in technology-enhanced learning that emphasise the value effective use of learning technology can bring: disruption and failure.

“‘My measure of success is disruption.’… so I prefer to defer the measure of success to our students!” – @S_J_Lancaster

Disruption in the form of challenging students’ learning, the way they normally approach learning or their misconceptions about a particular topic seems a worthwhile, positive output from any learning experience. Disruption to the educator, by contrast, is often seen as undesirable! Yet, whilst we aim to encourage adaptability and problem solving in our students, we should also think about the success of developing adaptability and problem solving in ourselves as learning technologists and educators. Strangely, I’m brought again to the somewhat obvious conclusion that digital capability may be at the heart of successful use of learning technology and of learning designs dependent on technology. Disruption here, though, is grounded in the learning process, not in difficult access or the interference of technology. Part of this disruption may be requiring students to use old tools in new ways, to learn how to interact, to be creative, and to accept that they may not always know the right way of doing something.

“My measure of success is discernible impact – either on my learners, or on my own practice through a constructive failure.” – @james_youdale

Seeing failure as a success measure accepts that innovation and new ways of learning may not always work; what is key is to learn from it. A failure can become a success if you learn from it by discussing with your students their experience, and by evaluating the use of the technology, their performance and that of the educator in an open, reflective and dispassionate way. This is perhaps not the view of those who design metrics to measure performance or success, or of those who view student satisfaction as a suitable league table determinant. Where there is no room for failure, there is also no room for success.


Thanks again to all those quoted above who have prompted me to think about success in technology-enhanced learning, and to Sue Watling and Patrick Lynch for the questions and the #LTHEchat theme. I’ve captured all the Tweets which made me think about this topic in my Storify of the chat.

Further details about this Twitter chat are on the LTHEchat blog.




