GDPR: is this the end of innovation in technology-enhanced learning?

Disclaimer: I am not a lawyer. The following article reflects my current, personal, understanding, which may change as more guidance becomes available. Working draft.

GDPR. These innocuous four letters have spawned many unanswered questions within educational institutions and other organisations. I’m lucky in that I work for an organisation that is ISO 27001 accredited, which means we adhere to information security processes as standard. Every individual knows what is expected, knows what the processes are, and has received appropriate training. It is part of our professional practice. This means GDPR is not a shock to the system, because we already understand the principles it seeks to uphold and we only have to adapt our way of working by building upon already sound practices. The same, unfortunately, cannot be said of all educational institutions.

GDPR basics

The EU General Data Protection Regulation comes into force on 25 May 2018. It expands upon the Data Protection Act (1998) in the UK and has had a massive impact on many organisations. The UK legislation that enshrines the (amended) EU GDPR is the Data Protection Act (2018). GDPR is about personal data, which is any information about a living individual: it sets out eight rights an individual has over the way their personal data is used, and six lawful bases for processing data (EU, 2016). Guidance on the application of the regulation is still being formed as more examples and use cases are considered. The main difference between GDPR and the DPA is that data breaches, and misuse of data, now carry significant and crippling financial penalties. This has led to some knee-jerk implementation by risk-averse organisations, battening down the hatches and abandoning common sense by treating the most banal, non-personal data as protected, all in the name of GDPR. Equally, though, there is a substantial risk where institutions keep on doing the same thing, believing that they can hide behind one of the six lawful bases.

In this post I look at some of the issues raised by Mark Glynn (2018), Head of the Teaching Enhancement Unit at Dublin City University, as part of an ALT webinar; my interpretation of ICO and JISC guidance; reference to the EU legislation; and how all this together may impact educational innovation and online learning design.

Data used in education

Before looking at specific practices, I just want to introduce some ideas about what data educators are able to access, and how GDPR may require stronger justification of data sharing within institutions.

Mark Glynn highlighted that two lawful bases apply most to educational processing of data: consent and performance of a contract. Consent is a pretty clear concept. It must be informed, in that the purpose is clearly defined; specific, in that each different use can be consented to individually; and, above all, opt-in, not opt-out. Just because you have asked for consent doesn’t mean you are using the best legal basis for processing data. Consent must be freely given, so asking students already on a course for consent to use their personal data in a particular way before they will be given their degree certificate isn’t, in my opinion, freely given consent, as it holds a detrimental consequence over them in lieu of consent. Indeed, the EU legislative document makes this point: consent obtained under some form of power or authority isn’t a sound basis for data processing (EU, 2016, Recital 43).

The alternative, processing under performance of a contract, is a little less well-defined. This is the basis under which Glynn suggested most educational institutions will process students’ personal data. Processing under this lawful basis has to be justified as ‘necessary’ (more on that in a bit) in order to fulfil a contractual obligation. However, when consent is actually required, as opposed to when data is processed in the delivery of a course under contract, is the grey area that educators and administrators need to navigate.

The development of contractual sharing of data will require clarity and improvements in institutional policies. However, all of this becomes a moot point if you decide not to collect or process the data in the first place. One of the core principles that stands out to me is to question whether the data actually needs to be collected, and if so, whether it needs to be shared with everyone involved in the delivery of a service or contract. Remember, processing of data under contract must be necessary for the delivery of that service. The ICO advises the following on interpreting ‘necessary’ processing to fulfil contractual obligations:

“‘Necessary’ does not mean that the processing must be essential for the purposes of performing a contract or taking relevant pre-contractual steps. However, it must be a targeted and proportionate way of achieving that purpose. This lawful basis does not apply if there are other reasonable and less intrusive ways to meet your contractual obligations or take the steps requested.

The processing must be necessary to deliver your side of the contract with this particular person. If the processing is only necessary to maintain your business model more generally, this lawful basis will not apply and you should consider another lawful basis, such as legitimate interests.” (ICO, 2018a)

The idea of different legitimate uses and processing of data starts to raise the bar in terms of security of data, and quite rightly challenges justifications over who should and shouldn’t have access. The UK legislation makes specific reference to schools (Data Protection Act 2018, Schedule 3), allowing processing of personal data by teachers, but there is no reference to higher education institutions. So, what would you say counts as necessary processing of personal data, in particular data to be accessed by educators? Grades and records of performance: yes. Names, email addresses and other contact information: yes. Age, gender, marital status: maybe not? The same would apply to system administrators: do they need access to students’ home addresses in order to ensure they are on the right courses on a VLE?

Demographic information, contact details and other personal data are often presented together in a student record; whilst anonymised for educational research, such a record isn’t always anonymised or stripped of sensitive data when used for education delivery. In some systems, different categories of personal data may have been treated the same in terms of access control, and I would seriously question whether that should still be the case. I’m not sure educators really need to know all personal information, and I’ve not known it genuinely inform the decisions made about how to teach the group (not at an individual level anyway).
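To make the access-control point concrete, here is a minimal sketch of field-level filtering of a student record by role. The record, the roles and the ‘necessary’ field sets are illustrative assumptions, not any institution’s actual schema; the point is simply that different categories of personal data need not be exposed uniformly.

```python
# A minimal sketch of role-based, field-level access to a student record.
# Record contents, roles and field names are hypothetical.

STUDENT_RECORD = {
    "name": "A. Student",
    "email": "a.student@example.ac.uk",
    "grades": {"Module 1": 68},
    "home_address": "1 Example Street",
    "date_of_birth": "2000-01-01",
    "marital_status": "single",
}

# Only the fields arguably 'necessary' for each role are exposed.
ROLE_FIELDS = {
    "educator": {"name", "email", "grades"},
    "vle_admin": {"name", "email"},
    "registry": set(STUDENT_RECORD),  # full record
}

def view_record(record, role):
    """Return only the fields the given role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(view_record(STUDENT_RECORD, "educator"))
```

An unrecognised role sees nothing at all, which is the safe default when the justification for access hasn’t been made.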

Thinking about how personal data might be used in education: demographic data may be used in educational research to summarise the participants in a study, but consent in this case is sought to adhere to research ethics, and this would also be a legitimate scientific or historical use. Personal data is regularly collected on enrolment for reporting to public funding bodies, though again it is aggregated, and I would guess it is lawfully processed as a public task (I believe a legal duty under the Higher Education and Research Act 2017). Further, anonymised data where there is no chance of identifying the individual, as is often the case in research, is exempt from GDPR:

“This Regulation does not therefore concern the processing of such anonymous information, including for statistical or research purposes.” (EU 2016, Recital 26)

If such personal data, linked to identifiable individuals, were informing teaching choices, this may be seen as profiling (the personal data wouldn’t be related to academic achievements), and it may require consent rather than being argued as a legitimate interest. Mark Glynn provided the example of a learning analytics system which used the VLE (virtual learning environment) interactions of a student, compared to previous cohorts, as a predictor of how successful they would be. Glynn states this would require consent.

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. [This] shall not apply if the decision: (a) is necessary for entering into, or performance of, a contract between the data subject and a data controller… or (c) is based on the data subject’s explicit consent.” (EU, 2016, Article 22)

Using students’ background personal data, let alone their system interactions, in this way would certainly be debatable. Because the legal justification would need to be exceptionally strong (conveyed through contract or consent), it may in fact require rigorous pedagogical research. This research would inform institutional practices and provide a clearer explanation to students as to why this type of data processing supports their education.

Contributions from students during learning

What about more complex data, such as contributions in class? Lecture recording has always been a use of learning technology that has pushed the boundaries of ideas of consent, intellectual property and personal data. As institutions move from opt-in lecture recording to opt-out, there is an increased risk of unintentional (and perhaps unmonitored) processing of personal data. If an individual student shares something in a class that is personal and (this is the important bit) personally identifiable, then immediately we’re into GDPR territory. How is that recording controlled? How can the individual student exercise their eight rights? Do we need to enlist an army of lecture recording editors? Oh no!

How we navigate between personal data (“any information relating to an identifiable person”), intellectual property and associated rights that may be transferred and exploited in some way, and moral rights is resulting in a rather complex web of ifs, buts and maybes. If a student says something in a lecture that is recorded, do they have rights under GDPR if that contribution is interpreted as data about them as an individual? Does that depend on the content of the contribution, or does the fact they have made a contribution itself act as data about their engagement with the course? What if the lecture recording is instead a student presentation: is that data about the individual, or, if we don’t identify the student, how are their moral rights addressed? If a student contributes to an online space under a Creative Commons licence, their moral right of attribution may be in conflict with storing those contributions anonymously, if anonymity is the approach used to conform to GDPR.

Now, all these examples may just be confusing the situation. However, it is precisely this imprecision in understanding of all these legal and policy contexts that leads to defensive and protectionist approaches, under which such practices appear too risky for educators to even consider.

Mark Glynn tackled lecture capture in the ALT GDPR webinar. Referring back to the concept of consent: consent cannot be assumed, and neither is it practical to obtain consent from every individual in a lecture for every recording. Therefore (and this is what I believe Glynn suggested), lecture recordings must be part of the contract an institution has with the student in the delivery of education. If a university has made the commitment to provide an opt-out lecture capture service for the benefit of student education, then setting out within the university regulations how that data will be stored and managed should be something to celebrate, rather than shy away from. Lecture capture is a perfect example of how rigorous research can support justification. There may still be consequences for changes in lecture capture use: for example (I think Mark Glynn suggested) that reuse of captures in subsequent years, which does sometimes occur due to staff absence, shouldn’t be permitted if student contributions are captured. This is because you are exposing student contributions to cohorts they are not involved in, which is probably more of a moral issue than one of data protection, as the student is unlikely to be identifiable.

What I’m not sure is clear at the moment, though, is how students should ‘buy in’ to the ‘contract’, or package of education if you will, on offer from an institution. All examples of data processing must be clearly indicated to students at the point of registering, surely? Yet students might not understand or recognise some of the use cases within higher education. We already know that their experiences of school education do not prepare them for the learning approaches they will need to adopt and develop in higher education. I’m not convinced that policies alone are enough to enable informed consent, or being an informed party to a contract, and there is certainly a role to play in explaining and justifying higher education processes to applicants and students.

Better policies

Thinking about the data subject rights of access, portability and erasure, these are all difficult to manage with lecture capture. However, policies for institutional services such as this should, by default, contain clauses that address these issues. Indeed, this ‘tightening up’ of the fundamentals that should be included in institutional operations and policies is one of the key messages from JISC guidance (Reeve, 2017). By ensuring an institutional approach, educators can get on with teaching and supporting learning without trying to devise legislation-compliant processes themselves. Higher education institutions are, however, renowned for localising implementation of policy at department or school level. Discipline differences often do require adaptation of policy to ensure both a pedagogic and professional fit. Yet there are some practices justified under local interpretation that really should stop. The University of Cambridge’s working group for GDPR produced a very useful document which links the legislation to actual practices taking place in the organisation, with actions to amend these practices based on GDPR. This is a practical document which reflects institutional policy and explains why institutional approaches are valuable (and hence is well worth a read). Awareness of GDPR principles, if not the detail, does need to be a requirement of professional practice in education. Institutional policy does not need to be unnecessarily restrictive, and it is this coarse misinterpretation of the detail of the legislation that constrains innovation.

The VLE cannot die, but innovation can

We can all expect to see even more cautionary warnings not to use ‘unsupported tools’, for fear of personal data being shared or mishandled, or of not meeting the data protection requirements of UK or EU legislation. GDPR does not make personal learning environments formed of a mix of online tools a thing of the past (or rather a thing of the future that never really took off in the mainstream). GDPR does, however, allow institutions to bang the drum for their institutional services, such as the VLE and other online tools for which some form of agreement (sometimes quite loose) is in place. Rather than enabling educators to develop their professional skill set and supporting them in making informed decisions about which tools to use, there is a risk that institutions will put a blanket ban on the use of third-party tools. Educators should be empowered to take risks with the pedagogy, without fear of reprimand, whilst still minimising risk in terms of student personal data. There are caveats to this, but it is the responsibility of institutions to allow both innovation and data protection risk management to be part of professional practice.

The arguments against using third-party tools seem to come from two places: first, that data will be shared from the institution to the third party; second, that there is an ethical obligation or duty of care to students to ensure they are not obliged to hand over personal data to organisations they do not wish to.

The first issue is clearly handled by GDPR in the form of data sharing agreements and data protection impact assessments. As such, the use of third-party tools is not prohibited by GDPR, though for some uses there are restrictions. Many of these restrictions are to do with ensuring correct handling of data and data protection at a standard that meets UK and EU legislation. My understanding here (and again I remind you I am not a lawyer and accept no liability for your application of my musings) is that it is about working out who the data controller is and who the data processor is. I would heartily encourage you to interpret this yourself in light of the Data Protection Act 2018, s.32 and s.55-65 (these sections are for law enforcement, but provide some insight into the roles of controller and processor). If you collect the data and “determine the purposes and means of the processing”, you are the controller. The controller is going to be in a position to deal with any data subject requests concerning data subject rights. If you pass data about someone else to a third party, then there is a process to follow which includes data sharing agreements, informing the data subject and having a legitimate basis for processing. Similarly, if you have contracted a company to act on your behalf to deliver a service, for example a survey platform or cloud VLE, they will be processing data for which you are the controller, under contract to deliver a service. Another example is providing email addresses from your data set to a mailing list service.

If the third party manages and controls the data, you are probably in the position of a data processor, in which case you still have obligations, and again this might be in the form of a contract or perhaps terms and conditions. This is, as far as I can tell, how an educator might use Facebook, Twitter, Padlet, YouTube, WordPress, etc. Students may be submitting personal data to these platforms, but as an educator you do not have access to all this personal data (you won’t have IP addresses, personal details or anything the student hasn’t willingly shared) and you won’t be able to manage this data (delete it, control its use or address data subject requests). As an educator you may be processing whatever personal data is shared by the student to the third-party platform (for example names, photos, contributions), under the terms of use of the service website. That, in itself, to me at least, doesn’t appear to break GDPR, as long as the service conforms to data protection regulations.

“Personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to or use of personal data and the equipment used for the processing.” (EU, 2016, Recital 39)

Whether and how a third-party service adheres to data protection may be the risk factor that is leading many to discourage third-party services without contractual obligations in the delivery of their educational service.

What also seems to be of concern amongst educational institutions are the moral implications of asking students to engage with a service over which the institution has no control of the data, or for which it cannot clearly convey how the data will be used. This is where open discussion, and a collective agreement about what the teaching group feels will best support their learning, comes in. Indeed, being conscious of your students’ perceptions about using third-party online services is important to avoid divided cohorts and false assumptions about willingness to engage with a learning activity (Wenham and Cornock, 2012).

When I delivered social media training to students back in 2012, many students were not aware of the way their data was being used by the likes of Facebook and Google, in particular how these platforms make money. However, with increased, if somewhat delayed, media attention, individuals are perhaps more digitally aware of the misuse of personal data by big online platforms than previously. GDPR compliance by third-party websites helps to alleviate some concerns, but in terms of risk management, educators may be under pressure to conform to institutional services that provide equivalent learning processes, third-party sites being deemed not to meet the criterion of ‘necessary’ for performance of a contract.

I would strongly argue that using the breadth of learning technologies and online tools available is necessary, particularly as many institutional platforms are not available to students on graduation. Students should be equipped with the skills to use and have demonstrable experience of third-party services that are authentic to ‘real-world’ employment and discipline-specific professional practice. Whilst an increasing number of universities have arrangements with Google and Microsoft for GDPR compliant cloud applications, there are many more platforms and tools that students should become familiar with. GDPR isn’t a barrier to that.

Improving professional practice

Even with my encouragement to embrace third-party tools in mind, there are some tools and practices that cannot be justified. GDPR is clear about reducing unnecessary data collection and unnecessary data retention. By reducing collection, retention and distribution, you reduce the risk of data breaches and can meet data subject requests fully, evidencing that you have adhered to institutional policy. There are some clear cases that Glynn explored, which are worth reiterating here.

Glynn made a very valid point: just because something has always been done, it doesn’t make it right. The first example is transferring student assessments onto personal devices for that regular occurrence of ‘marking at home’. Assessments will contain personal data (even if just exam numbers), and mark sheets tend to hold more personal data. If your university has Google Drive and you use Dropbox instead, you have no justification. In fact, if your institution doesn’t have any cloud-based file storage agreement and you use Google Drive, Dropbox or anything else, you have no justification. Data in these cases is being transferred to a third party, and unless that service meets GDPR standards and students are aware of the use of that service (as a minimum), you cannot just send data around how and where you like because it’s more convenient to you.

Another example looks at definitions of anonymity. Glynn raised the issue of exam results posted by student number, a pseudonymous institutional identifier. This paper-based practice of pinning results to a board, or emailing out a PDF, falls foul of GDPR because the data is not truly anonymous: the identifier is still unique to an individual student, doesn’t change between assessments and is likely to become known to peers. With poor pseudonymisation practices in play, in one swoop Glynn counteracted notions of ‘the VLE is dead’ with this example, showing how VLE assessment tools provide private access to results to just those whom the results are about. The devil is in the detail though, and whilst VLEs have lots of toolsets for various processes and learning activities, they rarely address all needs in all cases. Further, student data sits, and is often duplicated, across many institutional systems, not just the VLE. The VLE could still die; better integration with institutional records would just need creating with other service providers, with the aforementioned data protection impact assessments and data sharing agreements all in place.

As a side note, I would like to point out that the ICO guidance on pseudonymisation comes with a suggestion that such practices may be permissible if the route to re-identification is particularly complex.

“Personal data that has been pseudonymised – eg key-coded – can fall within the scope of the GDPR depending on how difficult it is to attribute the pseudonym to a particular individual.” (ICO, 2018b)

The ICO guidance document on anonymisation is certainly another worthwhile read, particularly for researchers (ICO, 2012), though I’m not sure whether it itself requires updating in light of GDPR.
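The distinction between pseudonymised and anonymous data can be illustrated with a toy key-coding sketch. The student numbers and marks below are invented, and this is an illustration of the concept rather than a recommended implementation: so long as the key table exists anywhere, the coded results remain personal data under GDPR.

```python
# A toy illustration of key-coding (pseudonymisation): real identifiers are
# replaced with random tokens, and the lookup table is held separately.
import secrets

results = {"Y1234567": 72, "Y7654321": 58}  # student number -> mark (made up)

key_table = {}  # would be stored separately, under stricter access control

def pseudonymise(data):
    """Replace each identifier with a random token, keeping the key aside."""
    coded = {}
    for student_id, mark in data.items():
        token = secrets.token_hex(4)
        key_table[token] = student_id  # the only route back to the individual
        coded[token] = mark
    return coded

published = pseudonymise(results)
# `published` can be shared more widely; without `key_table` it cannot be
# attributed to individuals, but it is NOT anonymous while the key exists.
```

A persistent student number, by contrast, is itself the key: it never changes, so anyone who learns it once can read every subsequent set of results.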

I’ve also reflected on my own practice of using learner comments from our online courses. Massive open online courses can struggle to form a community of learners. Interactions between educators and learners, and recognition of the achievement of learning, are two ways that help to develop this community. On registration, learners agree to terms and conditions which state course comments may be used under a Creative Commons licence (though I expect that will change). In order to engender a community of learners and build a connection between educator and learner, a few comments were included in the weekly course emails, named and linked back to the original course page. In light of GDPR, the practice of copying comments into emails now requires consent. This not only significantly slows the process of creating the weekly communiqué, but results in learners’ comments no longer being highlighted and circulated to the whole cohort. Whilst I have always been careful about the type of learner comment chosen, to ensure it does not contain personal data or paint the individual in a negative light, I had not considered that some people may not want to draw too much attention to their contributions to a course. Thinking about the face-to-face analogy, it would be the same as projecting onto the board whatever a student said in class, for all the rest of the group to scrutinise. Now, discussions are still highlighted in emails, but linked to instead. The content is not circulated, but individuals still get their recognition as positive contributors to the learning experience. GDPR, even in a heavy-handed implementation, has changed practice for the better.
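The revised email practice can be sketched as follows. The comment structure, names and URL format here are hypothetical assumptions for illustration; the point is that only a link (and attribution) is circulated, never the comment content itself.

```python
# A sketch of the revised practice: the weekly email links to selected
# comments rather than reproducing their text. Data and URLs are invented.

comments = [
    {"id": 101, "author": "Learner A", "text": "Great point about week 2..."},
    {"id": 205, "author": "Learner B", "text": "I found the activity..."},
]

BASE_URL = "https://example.com/course/comments"  # hypothetical

def weekly_digest(selected_ids):
    """Build the email body: attribution and a link, but no comment text."""
    lines = ["This week's highlighted discussions:"]
    for c in comments:
        if c["id"] in selected_ids:
            lines.append(f"- A contribution from {c['author']}: {BASE_URL}/{c['id']}")
    return "\n".join(lines)

print(weekly_digest([101]))
```

Because the text never leaves the course platform, the learner retains control of the contribution (they can edit or delete it at source) while still receiving recognition.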

Final thought

Implementation of GDPR should not be prohibitive, but permissive. There is certainly room for tightening up the handling of data, but this needs to be grounded in the principles of the regulation, not become a stick with which to beat non-conformity out of education. Institutions need to avoid going down the route of lists of ‘must nots’ and start equipping their staff, whether willing or not, to engage with the legislation that affects their practice. GDPR does not kill innovation in education; uninspiring management takes care of that all by itself.

Updates to this article

Quotes from sections of the Data Protection Act 2018 specifically relating to law enforcement and security services have been changed to the equivalent general EU regulations. The UK legislation places stronger emphasis on data security, but does so explicitly within the legislation covering law enforcement and security services’ use of personal data.


References

EU (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

Glynn, M. (2018). A Learning Technology professional’s guide to GDPR in the classroom, ALT Webinar, 8 May 2018.

ICO (2012). Anonymisation: managing data protection risk code of practice, UK Information Commissioner’s Office.

ICO (2018a). Lawful basis for processing, Guide to the General Data Protection Regulation (GDPR), UK Information Commissioner’s Office.

ICO (2018b). Key definitions, Guide to the General Data Protection Regulation (GDPR), UK Information Commissioner’s Office.

Reeve, D. (2017). Preparing for the General Data Protection Regulation (GDPR). JISC.

Data Protection Act (2018)

Wenham, A. and Cornock, M. (2012). Engaging students in online social spaces: experiences of using Facebook for teaching and learning. Presented at the University of York Annual Teaching and Learning Conference 2012, 16 May 2012, University of York.




