Learn more about the impact of the GDPR in the Higher Education sector in our blog.
The General Data Protection Regulation (GDPR) is a European Union (EU) regulation intended to strengthen and unify data protection for all individuals within the EU. The enforcement date is 25th May 2018, at which time organisations in non-compliance will face heavy fines.
What will that mean for Higher Education Institutions (HEIs) and other public institutions that need to deal with students' and prospects' personal data on a daily basis?
These and other questions were the focus of an event organised by FULL FABRIC, where a series of subject matter experts explained the impacts and the opportunities created by this regulation.
This is the first video in a series of six, in which Ardi Kolah, head of the GDPR transition programme at Henley Business School, explains the challenges an HEI has to address to be compliant with the GDPR.
Thanks very much. This is a maiden voyage for me, because I've spoken to a huge number of multinational companies (I was in Switzerland earlier in the week with the chief legal counsel of some of the biggest organisations in Europe), but I've never really addressed my own sector, which is the education sector, so I'm particularly delighted to be here. What I'm here to do, in the hour that I've been given (which is really generous, and thank you so much to the organisers for inviting me), is to go through our thinking in relation to the GDPR. And the first thing I'm going to say is that there's a huge amount of noise out there, isn't there? If you look at the web, there is a huge amount of ill-informed comment. If you look at LinkedIn, there are people actually getting this stuff wrong.
00:00:49 Everyone's claiming that they've got the best programme in the world, one that can turn you into a DPO or an expert in the space of four hours. It's all very exciting. The reality is very different. The majority of organisations that I've been working with and training understand that there are three things they need to get right in relation to this regulation, and this applies to everyone in this room too. We have to understand business continuity, we have to understand risk, and we have to understand technology. Simply being told what's in the regulation, as interesting as it may be, doesn't actually get you to the outcomes you need to deliver as a result of complying with it. So the thing I'm going to be focusing on is how we join up the dots in relation to business continuity, risk and technology.
00:01:37 That's point one. Point two is that there's a lot of really great opportunity in relation to the GDPR. All we ever hear about, and I'm sick of hearing about it, is 4 percent of global turnover and 20 million euros. Yes, we all are. Interestingly, it doesn't necessarily apply to us, because technically everyone in this room is actually a charity, aren't they?
00:01:58 So most major universities actually have charitable status. Are we still going to get fined? I happen to work with Elizabeth Denham, the Information Commissioner, so I do know what's going on in her mind. Yes, we can be fined, but that 4 percent and 20 million applies to commercial organisations. Let's just be clear what we're talking about here, for everyone in this room. This may be a regulation, but it's about more than compliance; it's about reputation. None of us in this room can continue to have a successful university or academic practice without reputation, and that applies completely. So we have to do things in the right way because it's the right thing to do. Hopefully that's another message that will come through from this presentation.
00:02:48 We all live in a world of uncertainty, and so do the people on that screen. They're also living in a world of uncertainty. We find ourselves in a situation where we have to ask: who is going to apply to come to our universities and colleges post-2019? Do we know? Have we got a sense of what that looks like? Have we got a sense of how we're going to continue to market the fantastic opportunity we have for people to come to our university? Well, yes, we do have the opportunity, but we've got to do that in the context of European law. And it is law now, not from the 25th of May 2018 next year. It's law now. We're in a transition period, so we need to start to think about our strategy in relation to how we get people to come to our universities and colleges and our programmes, whether that's undergraduate or postgraduate.
00:03:39 So we need to start thinking about this now, because it's law now. So what does the future hold? It's uncertain, but we do know certain things, and one of them is that if we are looking to get more students from the European Union, we need to be able to process their data in a lawful way. That's just the basics, really. So this presentation will hopefully shine some light on that. Talking of Elizabeth Denham: she's from Canada, and she was appointed by Parliament at the back end of last year. She's very, very clear in terms of what she wants to see happen. What she will see happen is that Britain continues to have the influence it always has had in relation to data protection and privacy, not just within the European Union but across the world.
00:04:26 So one of the things I do is edit the Journal of Data Protection and Privacy, which I created alongside this programme at Henley, and I get articles from all over the world, from some of your colleagues and others as well, which is really interesting: not just from the European Union but from other jurisdictions too. And what's really clear to me, looking at all of this, is how important Europe is in relation to the development of data protection and privacy, and how important Britain has been in that whole discussion. A lot of the laws and regulations we have are actually the result of work done in this country. So this whole idea of an imposition from Brussels telling us what to do is crap. A lot of the really good thinking has actually happened within the UK.
00:05:12 So where are we going to be post-2019? Well, we hope to have special observer status on the European Data Protection Board. I think Elizabeth is going about it the right way, and she's hugely respected by the European Commission. One of the things the European Commission has asked Elizabeth to do is to write the guidance. There are lots of bits of guidance flying around, but one of the guidance notes that will probably be among the most important, on sanctions and fines, is actually going to be written by the United Kingdom and then disseminated, you know, shared with all 28 member states across the European Union. That's how important we are in relation to the thinking around data protection. So we're not leaving the stage.
00:05:53 There are other things she's going to be looking at from a global perspective, like a privacy framework which could apply globally. So it's not all doom and gloom. There's definitely a huge amount of opportunity, and we can definitely lead the way in this area, if you like. So, you know, will they come here to study? One of the things we don't know is, when we do negotiate our exit from the European Union, whether those students, if they come and study at our universities, can then get work, whether they can continue to be in this country and work. Because that's been a real driver for recruitment, hasn't it? Yes. At a lot of the business schools I teach at, that's really important: you come in to do your degree.
00:06:33 Then there's the opportunity of getting some work experience. Now, we don't know whether that will continue, but it's been a really good driver for bringing people in. So "we don't know" is the answer to that question, but we hope that will be the case. One of the things I'm particularly keen on is to think about opportunities rather than threats. Everyone goes on about the threats, everyone goes on about this weight of compliance on their shoulders. And yes, we do have to comply, but we also have to think about how we make the best use of this opportunity. And there are opportunities: building deeper digital trust with students, for example. This person, if you like: you've met them, you've worked with them, you want more of these people to come. Some of the research shows that 57 percent of students will share information if they know it won't be sold off or shared with other people without their knowledge.
00:07:26 53 percent will share information if they can be guaranteed that there are protection safeguards in place. They are no different from you or me, and that's the point, really. The GDPR applies to that person on the screen, but it also applies to our own employees. We can't forget our own employees.
00:07:45 They have the same rights under the GDPR as the students who come to our universities. We always have to remember that as we go through thinking about compliance. So we're saying goodbye to the Data Protection Directive, which gave birth to the Data Protection Act. As you'll know, if you're up to date, the Queen's Speech announced a new Data Protection Act that will be in alignment with the GDPR, because it has to be, from a legal point of view. Germany has just passed a law, ahead of the 25th of May next year. It's a strange one, because it's not totally in line with the GDPR, and that can't be allowed to stand. There will probably be a legal challenge, the law will fail, and they'll have to change it. So the GDPR is bringing a level of consistency and harmonisation to how we go about processing personal data.
00:08:37 So remember, it's about personal data and special personal data. It's not about other types of data: only personal and special personal data. So we're saying goodbye to the Data Protection Directive, and we're saying hello to the GDPR. Now, interestingly, the GDPR has taken four years to cook. It started its journey in 2012 and finished last year. We now have a two-year transition period, which finishes on the 25th of May. Sounds like a marathon, doesn't it? Yeah. Actually, if you think about it, if you haven't started your compliance journey you've got something like two hundred working days left. Sounds a bit short, doesn't it? This has been the most scrutinised piece of regulation ever. There are organisations only now waking up to the idea that maybe they should start thinking about the organisational and technical changes they require.
00:09:30 Well, let's hope they can put their foot on the accelerator. Hopefully none of us is in that situation, but if we are, we need to start thinking about what we do between now and the 25th of May, and I'll give some indication of what I think is important from a prioritisation perspective.
00:09:46 So it is about a higher standard of data protection and privacy. And there's a lot going on at the moment where universities are absolutely being targeted by hackers. Just put your hands up: has anyone had a recent data breach as a result of hacking activity at their university? Anyone? No? Wow, that's impressive! Or maybe you just don't want to admit it; that's OK too. We are definitely susceptible, because we make headlines. There's nothing better than a story where a university has been hacked and its students' data is out there; all sorts of things start to happen. That's a great story, isn't it? Yes, that gets the BBC really interested. So we don't want to be in the headlines for the wrong reasons.
00:10:33 So I think we need to ask: if we do the things we need to do, does it make things less risky for us? The answer to that question is definitely yes. Security of processing is in there, but there are other things within the GDPR that we need to be cognisant of as well. It's about taking a risk-based approach. This is very different from the Data Protection Act, and from the Data Protection Directive before it, which was much more of a tick-box exercise: data protection was something where we said "oh yeah, data protection: tick". It now has to be absolutely central to our thinking. So it's a risk-based approach, it's outcomes-driven, it's not tick-box. The way in which we thought about data protection before, and how we learned about it, has to be rebooted.
00:11:19 This is evolution in many respects, but there are new things we have to do, and that means rebooting how we think about this stuff. OK, so that's pretty important. There's some really interesting research out there. I'm not going to go into great detail, but you might want to look it up on the internet: the various global data breach reports talk about cybercrime, denial of service and so on. You can see, if you like, the whole landscape of vulnerabilities, the kinds of nasty things that could happen, and these are happening on a daily basis. We can't ignore that. So why the need for change? I get asked this question a lot: weren't the old laws good enough in the first place, so they just needed to be enforced better?
00:12:06 Well, we've had 20 years of pretty ineffective data protection and privacy laws. 20 years of a patchwork quilt across the whole of the European Union, where member states did different things and we didn't actually know how data was protected in Germany versus Italy versus Spain versus the UK. Did we go jurisdiction shopping? Did we put our main establishment in Ireland because we thought they might interpret data protection laws in a much more business-friendly way? All of that is swept to one side, because it wasn't working. There were all sorts of problems, and now we have one regulation. One regulation. We have a European Data Protection Board that polices that regulation. We have supervisory authorities, previously called data protection authorities, and each has a member on the Board. So Elizabeth will sit on that European Data Protection Board.
00:12:57 There is a member from each of those supervisory authorities, so they are working together. I tend to talk about them holding hands, metaphorically, but you never know. So they are all holding hands; they're more joined up. That means a decision made in Italy could have an impact for us here in the UK, because the whole jurisprudence around this is different now. So just bear that in mind: it's a much more harmonised approach. It's not perfect, it's not a hundred percent foolproof, but it is definitely a step forward. What's also interesting, if you go back in time and look at the genesis behind this regulation, is that it's not in fact tied to privacy per se; it's in fact competition.
00:13:49 Because one of the things they wanted to do was to create a legal framework for the world's biggest single digital market: the European Union, five hundred million people. They needed a framework that would help make that market function better. One of the things they were particularly keen on in 2012, when this was first put forward by the European Commission, was to bring the cost of entry down, to make it easier for organisations to offer goods and services into the single digital market. That was to increase competition and choice, so you can see where that was going. And then the pendulum swung all the other way, away from competition to data protection and privacy, because of all the problems that we know about.
00:14:34 But there are still elements of competition, elements of choice and freedom, within the GDPR. So it's not just about data protection. It is about putting more power into the hands of you and me: our customers, our clients, our students, our supporters if we're a voluntary organisation, and our employees. It's about having more control over what happens to our personal data. Really speaking, that's a good thing, not a bad thing. So where are we on the compliance to-do list? I thought, well, maybe it's like the yellow brick road. Which one are we? Are we Dorothy? Are we the Tin Man? Perhaps the Lion, or the Scarecrow? Maybe we're a bit of all of them.
00:15:30 But we've got to think about this as a journey, and actually the 25th of May next year is just the first small step, if you like, on that journey. We're on the journey now, and if you're not on the journey now, it's probably time to think about starting it. This is not a one-hit wonder; it's not a case of "we've done it now, we can go back to doing our real job".
00:15:56 So this is going to be with us, and according to the European Data Protection Supervisor, who I had a conversation with, this regulation is going to be around for the next 20 years. I think that's probably optimistic. There may be a revision after a decade or so, if we're still working in the jobs we've got today. Maybe I won't have retired.
00:16:16 What I'm saying is this: this isn't some sort of fad, something that's here today and gone tomorrow so we can get on with the rest of our lives. This is pretty important stuff. Fundamentally, it's going to change the way we think about how we process personal data, but it's also going to have an impact on culture. I was just having a drink out there, and it was really interesting talking to some of you about culture, because you can impose rules and regulations, but I've worked in a university, and the academics who work with us as our colleagues are very independent-minded, aren't they? Yes. So how do we get across the point that we need to change our behaviour slightly? It's very difficult.
00:16:57 But you have to start from the point of reputation, not regulation. You have to start to think about "our reputation": what does that mean globally, what does it mean compared to other organisations' reputations? Because there will be people who are not in this room who will get this right, who will be, if you like, in competition with us for talent, because ultimately that's what we want: great students and great outcomes. Yes. Because that builds reputation. That's how we are successful as universities. That's how Henley is now number one in Europe for GDPR training for senior execs: because I've been on this for two and a half years, I can make sure that whoever we have on our faculty is absolutely the best, that we understand technology and risk in a way that others don't, and that we do everything in colloquial English, not legal gobbledygook, not technobabble.
00:17:54 Everything we do is in colloquial English, and it's online, so it's easy to get the education across. And, dare I say it, it can be entertaining. Well, we try. So, will we be ready in time? That's a question for you to put your hands up on. It's a question I asked in Switzerland earlier this week, and no one put their hand up. Well, it's a loaded question, really, isn't it? The point is this. The priority, and I mentioned it right at the beginning: what is the priority? The priority is to identify the very high-risk processing that all of us are involved with right now, and to mitigate it to a residual risk which doesn't cause harm or damage to data subjects. That means students, that means employees, that means anyone whose personal data we're trying to process.
00:18:47 That is what you're expected to do. If you don't do that, then don't be surprised if you do get a sanction and a fine, because doing nothing is known as an aggravating factor. If you're taking the view that you're mitigating, the needle goes down; if you're not doing anything, or you're doing things which are a bit dodgy, the needle goes up. It's really quite simple, but they take that into account, and I'll explain that a bit more. So that's what you've got to do. Forget everything else I say: what are we doing which is very high risk? How do we mitigate that? What steps do we need to take? I'll explain what that means: the technical and organisational measures you need to take to reduce that
00:19:28 risk, mitigating it to a residual risk which won't cause a problem. OK, all with me so far? OK, I'll carry on then. That's really important. So let's peek inside the vault of data privacy to see where other organisations didn't quite get that right, and of course let's start with the ransomware denial-of-service attack. We all know about that; we're still living through the repercussions of it. When you look at the news reports, you think "oh, poor old NHS, they were the victim". But were they the victim, or were their patients the victims? The people who appeared on TV saying they'd had to cancel their operations, whose families had come down from all parts of the country and booked hotel rooms so they could be near the person having the operation.
00:20:13 They'd lost all that money: were they the victims? From Elizabeth Denham's point of view, she was pretty annoyed; I had a conversation with her about this. Clearly there were not the technical and organisational measures in place to stop that happening, or at least to reduce the risk of it happening. But actually, if you think about it, in that case it was human error. The factors were human, not technical. We were using software which wasn't being supported; we weren't updating software, we weren't applying patches, all the kinds of things we need to think about doing which are actually important. In fact it was Windows 7, rather than Windows XP, that was the vulnerability. So if we know that, why don't we update our software, why don't we move to a supported platform?
00:21:01 Well, it's a question of priorities, isn't it? It isn't about money; you've got money, it's how you spend that money. It's about what's important, and that's the point: they didn't think it was important enough. Did they think about the repercussions? Were they trained to understand that? Did they really understand it? There's a question mark. So clearly there are lots of lessons to be learned from that particular data breach. TalkTalk, of course, are about to change their business model; they don't use puppets anymore. 270,000 customer accounts hacked, for example, not just in the UK but in Poland as well. Then we've got Sports Direct. That's an interesting example, because that was a hack which affected their employees' personal data, not their customers'. And they kept quiet about it. Not a great idea.
00:21:54 Tesco Bank. I'm sure some of us bank at Tesco's. Except of course 40,000 customer accounts were hacked last November, and 20,000 of them had money taken out of them. So what's going on? Why are these things happening with such frequency? What are they doing, or not doing? Three, the mobile phone operator. Now, this is quite interesting, because what you have to do is produce an initial personal data breach report; then there's an investigation, and then you have to produce a final personal data breach report. And guess what happens? They compare the two, just to make sure you knew what was going on. And if there's a big difference between the first and the second, clearly you didn't know what was going on; you weren't in control.
00:22:42 And it goes up, it doesn't go down: it's an aggravating factor. So that really comes back to training again. How do we define what personal data is? How do we define what processing is, from our perspective? Who's involved in that value chain? Are the right checks and balances in place? Do we have the right technical and organisational measures in place to make sure we're compliant? It just makes sense, really. And if they aren't there, don't expect mercy.
00:23:13 Sony PlayStation. I've put in a few of these gaming ones because I'm actually off to Malta on Monday, to talk to the gaming industry there, because they've got millions and millions of data records that they're processing. So again, this is of particular interest to them. Sony PlayStation: 76 million. That's more than the population of the United Kingdom.
00:23:36 Sega: 1.3 million. And of course, not to be left out, we've had our moments too. This is back in December 2015: university students across the UK were unable to submit work after the academic computer network, Janet, was attacked.
00:23:53 So we have to go back to 2015. The good news is we've obviously learned our lessons, which is actually quite important. One of the things about the GDPR is that it expects all of us in this room to learn from personal data breaches we may have had, and from near misses as well. If there's been a near miss, we need to understand what happened, how we can learn from it and how we can make sure it doesn't happen again. This didn't happen under the Data Protection Act; it didn't happen under the Data Protection Directive. So this learning thing is great, because we're all interested in learning, aren't we? We want to apply the things that are part of our DNA as educational establishments: it's about learning.
00:24:43 So we can apply those great things that we know to something like this, and learn from our previous mistakes. That's really important: number one. And number two: whatever we do, please record it. So training is recorded, and should there be a personal data breach, we've got a narrative we can put in front of the regulator, the supervisory authority, which demonstrates that we're taking things seriously, that we're not one of the bad guys but actually the good guys, and that we shouldn't be treated as harshly as others who don't do that. So think about what we're doing, record what we're doing, and make sure we're constantly thinking about risk mitigation. OK? Remember, this is a risk-based approach to data protection and privacy.
00:25:37 David's point is really well made, and of course what David just said is about reputation. Irrespective of whether that was technically a personal data breach (David has explained it came close; it was probably a near miss), it still has that kind of reputational impact, doesn't it? So these things are important. As I said right at the beginning, and thank you, David: it's about reputation, not just about regulation. And of course we've got the biggest one ever, which is Yahoo. And they sat quiet for three years. OK, so, quick quiz question. What's the length of time that each one of us has got, as an organisation, once we know there's been a personal data breach?
00:26:34 What's the length of time? Can anyone have a go? You have to put your hand up, please. Yes, you know that: it's 72 hours. OK, so for an extra point: what was the original timescale in all three previous drafts of the GDPR?
00:26:54 You're very good, aren't you? You should be up here. It was 24 hours. OK, so for a third point, and you may not get this one: what's the average length of time it normally takes for an organisation to know it has had a personal data breach? Remember, the time they've got to report it is 72 hours. But what's the average length of time for the organisation to know it's had a personal data breach?
00:27:19 What do you think that is? You've answered two already; anyone else want to have a go? Three days? It's like a quiz game. Is that higher or lower? Higher! You know the answer, don't you? Anyone else? It's 138 days.
00:27:44 In fact, someone said to me last night it's actually 200, but let's just say 138 days. 138 days here, 72 hours here. Go figure that one out. OK. You've got 72 hours to report once you know there's been a personal data breach, and it actually takes 138 days to know. So what's going to change? What's going to change, we hope, is that you've got the right policies, processes and procedures in place, and the right technical and organisational measures in place, to allow you to comply with the GDPR. And if you can't do that, forget about a personal data breach: you can still get sanctioned and fined.
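As an aside for anyone wiring breach-response steps into their admissions or IT systems, the 72-hour clock Ardi describes runs from the moment the organisation becomes aware of the breach, and is simple to express in code. This is a minimal sketch only; the function name and the example date are ours for illustration, not part of any official GDPR tooling:

```python
from datetime import datetime, timedelta

# Article 33 of the GDPR: report to the supervisory authority within
# 72 hours of becoming aware of a personal data breach.
REPORTING_WINDOW = timedelta(hours=72)

def breach_report_deadline(became_aware_at: datetime) -> datetime:
    """Return the latest moment the breach must be reported."""
    return became_aware_at + REPORTING_WINDOW

# A breach discovered at 9am on Friday 25 May 2018 must be reported
# by 9am on Monday 28 May 2018.
print(breach_report_deadline(datetime(2018, 5, 25, 9, 0)))
```

The point of the 138-day statistic is that, without detection in place, the deadline computed here will have passed more than 40 times over before anyone knows there is anything to report.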
00:28:27 We've counted: there are probably around a hundred trip wires within the GDPR where you can be sanctioned and fined without any personal data breach happening at all. OK? So you need to start talking to your IT people. Remember what I said earlier: it's about business continuity, it's about risk and it's about technology. This isn't just a compliance and regulatory issue; you've got that now, haven't you? So, that was a very interesting case study. It's about the focus on reputation, not just on regulation, and you can't leave it to chance. I had a conversation with someone and they said, "This GDPR, we can afford the fines." They literally said that to me: "we can afford the fines". Of course, that wasn't a university. And I said, OK.
00:29:17 So what about your reputation? You could always appeal the fines, but what if you had what's known as a stop order slapped on you? In other words: you didn't have the technical and organisational measures you should have had; the ICO, the supervisory authority, said you're going to have to change what you're doing and how you're doing it; we're going to give you six months to do that, and we're going to come back and see what you've done; and you still haven't carried out those technical and organisational changes. Guess what: we put a stop order on you. Stop processing personal data, on a temporary or permanent basis. I don't know about you, but I don't know many that have come back from the dead. So actually, you're dead. Forget about the fines, all this stuff about the fines.
00:30:10 The real threat is that. It isn't all about fines. It's about that: if we can't process, we're stuck. So you've got to take this seriously. You've got to start thinking about reputation, about our duties and responsibilities. We've known about this since 2012. It's not a case of "oh my god, what's going to happen on the 25th of May?" We've known about it, and we've been given a two-year period to get our house in order.
00:30:40 Because it touches the deep tissue of all our organisations. That's why they've given us two years. And it's counting down.
00:30:50 TalkTalk. Anyone here a TalkTalk customer? This is quite interesting, another sobering example, which of course doesn't apply to anyone in this room. TalkTalk, from personal experience: it wasn't the first, it wasn't the second, it was the third data breach they'd had. OK, that's pretty bad, but what makes it even worse is that they were told about their technical vulnerability by the regulator before it happened. Was there any learning going on there, do you think? So they carried on, and had the massive personal data breach. The result was the highest fine given to a commercial organisation: £400,000. The maximum is £500,000, so they obviously got a discount. So, another quick quiz question: if this had happened under the GDPR, what would the fine probably have been?
00:31:50 Any ideas? What would the fine have been? 70 million: is that higher or lower? You all think he's right. Well, I made it 70 million, so I'd give you a round of applause for that. Well done; he actually got it spot on. He's probably more right than me, so we'll let him off: £73 million. There's a difference, isn't there? And of course, that then had an impact on the share price, an impact on shareholder sentiment: a lot of unhappy shareholders. And they haven't recovered.
00:32:43 I checked last night, they still haven't recovered, about 40 percent down on their share price. Of course, we're not companies. Do we care? Of course we care, because actually the real tragedy of this story isn't the money that they've lost as a result of customers walking away, the churn rate that they had, the decrease in the value of their share price. The real tragedy is it could have been avoided. Now that does make a difference for us. It could have been avoided had they trained their staff to think about these things and brought in those measures. Training is definitely the first line of defense in this. If you don't train your staff who are processing personal data, it is a breach. And if you have a DPO, and some of you may be DPOs, if you are not keeping your own education, knowledge and training up to date, that is also a breach, and you have to do that in an independent way, so it can't be at the university you are at. Fortunately, there may be others who can do that for you.
00:33:42 So you've got to think about this as being a training and education issue. But that's what we do. So hopefully it's not such a stretch. Quick point about international application. This isn't just about the European Union: if we are using organizations to process the data of students and applicants in India, for example, so that could be Infosys, or Tata Consultancy Services, or anyone else, if they're processing that personal data, they are also subject to the GDPR. This is actually quite important. An awful lot of us are international, right? We all have international offices and stuff. So if they're processing data that is coming out of the EU, then those international offices, those entities are also subject to the GDPR.
00:34:30 They also have to comply. So that's quite important.
00:34:35 So a little bit about what's in the GDPR. Well, there are certain principles. Among those principles are ones that we're very familiar with and some that are new, if you like. The first one we know about, lawfulness, fairness and transparency, hasn't gone away. But remember what I said earlier: there are two forces that run through the GDPR, transparency and accountability. And for Elizabeth Denham, because we're in the U.K., accountability is really important. So don't lose sight of those two major forces that run through the GDPR. Purpose limitation is really important, so when we're thinking about processing personal data, we need to think about the purposes of it, that's really, really important. And there are lots of rights within the GDPR, but there's only one absolute right, and that's the right for a data subject to have a data privacy notice. OK. So that's one of the things that you must do.
00:35:37 Whether that's internally with your employees or externally with students or others that we're working with, if we're going to process their personal data, they must have a data privacy notice. It has to be in intelligible English, not legal gobbledygook, and has to be very clear in terms of the purposes, the rights that these people have, the right to rectification, erasure, etc. The right to change their mind. And there needs to be a consent mechanism which needs to be clear and unambiguous. It needs to be affirmative action, no pre-ticked boxes please. And clearly an expression of their wishes. OK. This is all about the individuals, all about taking power back. It is about seeing the world through the eyes of the people that we are dealing with. And that's very, very important. Data minimization. So this again, it's been around, but this is really important. All of us are guilty of over-collection of personal data.
00:36:34 Some of our colleagues in H.R. will tell you that they're keeping data going back a long way. One of the questions I ask some of my clients is: "What's your oldest piece of personal data? Why are you keeping it, why are you processing it?" Processing has changed. Just looking at something on a screen is processing, without actually sending it anywhere. So just leaving it at rest, as they say, is still processing. So if that data sits in your organization and there's a hack, or some other problem, it's a serious issue, because actually it's not that valuable and yet it would expose a vulnerability for us, and a fine and a sanction. We need to think about data minimization. What kind of data are we holding, and do we need to hold it any further?
00:37:19 Do we actually give them a notice about how long we're going to hold that data for? Because that is also part of the data privacy notice, and once the purposes have been extinguished, then we should get rid of it. We should get rid of that data. It's not our data. We were controlling it, or processing it, but it is not our data. So once the purpose is gone, fine, we can go back and ask for other uses for that data; as long as that is done on the basis of consent or contract, then fine, but we need to communicate with that individual. One of the things that Elizabeth Denham did, going back to what happens in the U.K., is she fined 10 charities all in one go. Remember that? Do you know why she did that?
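The retention discipline described here, hold data only while its stated purpose lasts and erase it once the purpose is extinguished, can be sketched as a simple policy check. This is a minimal illustration, not anything from the talk: the purposes, field names and retention periods are hypothetical examples.

```python
from datetime import date, timedelta

# Hypothetical retention policy: purpose -> maximum retention period.
RETENTION_POLICY = {
    "admissions": timedelta(days=365),           # one admissions cycle
    "alumni_newsletter": timedelta(days=5 * 365),
}

def records_to_erase(records, today=None):
    """Return records whose stated purpose has expired.

    Each record is a dict with a 'purpose' and a 'collected' date.
    Data held with no recognised purpose is flagged for erasure too.
    """
    today = today or date.today()
    expired = []
    for record in records:
        limit = RETENTION_POLICY.get(record["purpose"])
        if limit is None or record["collected"] + limit < today:
            expired.append(record)
    return expired

records = [
    {"id": 1, "purpose": "admissions", "collected": date(2016, 9, 1)},
    {"id": 2, "purpose": "admissions", "collected": date(2017, 11, 1)},
]
print([r["id"] for r in records_to_erase(records, today=date(2017, 11, 20))])  # → [1]
```

The point of the sketch is that "do we need to hold this any further?" becomes a question a scheduled job can answer, rather than one each department answers ad hoc.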
00:38:04 So what was happening, I think, was that charities were profiling high net worth individuals. OK, you don't need consent in that situation, but what the person does need to know is that it's happening, and they have the right to object. They didn't know; they had no choice. That's why they got fined. She said: "I gave them a 90 percent discount on the fine." So that's good, because otherwise all the money would have gone to the Treasury. But she was still making the point. In many respects, if you're a charity or a public body, you are actually working to a higher standard, because you have that trust, don't you? You have that trust, particularly if you're a charity or a university. You're expected to get these things right, because actually, that's what we do, isn't it?
00:38:48 That's our job, to get things right. So actually, that level of trust is higher than, say, for a private organization, where it's just about shareholder value, and if they go out of business, so what? Fine. But for public authorities and public organizations like ourselves, we need to step up to that mark to protect our reputation and to build trust. So that's really, really important. Data minimization also applies to how many people have access to personal data.
00:39:20 You've got to reduce that.
00:39:24 So that sort of principle of least privilege, P.O.L.P., is quite important. What you don't want is everyone in the university having access to everything, because that is just going to be a disaster. It also creates a huge amount of risk, and we'll get whacked if we do that. So we've got to do it on a real need-to-know basis. So again, it doesn't cost us anything to do this, right? It just means we've got to think about it, think about how we are processing personal data, and make sure that we're all compliant with the principles. There are seven. You can't cherry-pick the ones you like and the ones you don't like, by the way; you've got to actually comply with all of them.
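The principle of least privilege mentioned here can be illustrated as a default-deny, role-based lookup: each role sees only the data categories it needs, and anything unlisted is refused. A minimal sketch; the role and category names are hypothetical, not from the talk.

```python
# Minimal sketch of the principle of least privilege (POLP):
# each role is granted only the data categories it needs to do its job.
ROLE_PERMISSIONS = {
    "admissions_officer": {"contact_details", "application_history"},
    "finance_clerk": {"contact_details", "payment_records"},
    "lecturer": set(),  # no access to applicant personal data at all
}

def can_access(role, data_category):
    """Default deny: grant access only if the role explicitly needs it."""
    return data_category in ROLE_PERMISSIONS.get(role, set())

assert can_access("admissions_officer", "application_history")
assert not can_access("lecturer", "payment_records")
assert not can_access("unknown_role", "contact_details")  # unknown roles get nothing
```

The design choice that matters is the default: an unknown role or category yields no access, so forgetting to configure someone fails safe rather than open.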
00:40:08 So that includes accuracy, retention, integrity and confidentiality. These are the things that we should be doing already. And I'm sure we are. It's just reaffirming that. So accuracy is important to get right. There's some new stuff around accuracy. What it means is that if there's a student who has a personal record and they want to add information to it, then they have a right to do that, to make sure that that record is accurate. And of course, if it isn't accurate, they have the right to have that record rectified. That could be important.
00:40:41 That could be improved for all of us.
00:40:44 The seventh principle is only a sentence. But again I've made the point about it is about accountability. It's just one little sentence. I happen to think that one sentence within the GDPR is possibly one of the most important. And it's certainly the most important from the UK's perspective.
00:41:00 So accountability. That means recording what you're doing. If you're changing what you're doing, please record it. Make sure you've got notes about that, so that if there's a personal data breach further down the line, you've got a narrative, you've got evidence to show what you have done. OK. So that's really important too. So accountability and transparency are two of the things that come through this. New relationships: if you're using a data processor, like I said earlier, they have to guarantee now that they're compliant with the GDPR. Not on the 25th of May; they have to do that now. And if you're using a data processor to process student data or employee data or any other type of personal data and they're not guaranteeing compliance, you are in breach as the data controller, and they are in breach as well. Joint and several liability.
00:41:54 So under the old regime, it was a matter of contract whether the data processor was going to be taken to court or not. Now it's a matter of contract and law. Data processors cannot act on any basis apart from written, express instructions. If they move away from what you're telling them to do, they're either a joint data controller, which opens the door to a whole raft of other roles, responsibilities and duties under the GDPR, which may mean they have to do a DPIA, a data protection impact assessment, and employ a DPO, or they're in breach of contract with you as the university using them as a data processor. So you need to look at your contracts now with your data processors.
00:42:38 That's a really important thing to do.
00:42:44 So the legal obligations on you as the data controller are significant, and the legal obligations on the data processor are significant as well. These are new.
00:42:55 The definition of personal data, I won't go into great detail, but that's changed as well. It's much wider, and it includes genetic data, biometric data, metadata, a whole range of things, even radio frequency identification. Because technology has really changed how we can capture information about all of you very, very easily. So even Internet of Things devices will be deemed to generate personal data as well. So anything that could be described as personal data is covered by the GDPR. Things which have had their identifiers stripped away, where maybe there's a key that would unlock them, that's known as pseudonymisation. Bit of a mouthful, isn't it? Had to practice that before coming here today. Pseudonymisation. But that means you have a key, you can unlock it and you can still identify the person, so that's still covered by the GDPR.
00:43:53 But for goodness sake, don't put the key in the same place where you've got the pseudonymised data, because if that's hacked, they've got access to the whole thing. Believe you me, that does happen as well. If you want absolutely no risk, if that were possible, then you anonymize your personal data. Then it's no longer personal data, and therefore the GDPR does not apply. However, there's a health warning to what I've just said: there's a possibility you can reverse anonymization. But it's still a pretty robust way of doing it. How valuable is that? Well, that's what a data lake is about, you know? What you can do with analytics around anonymized data is fine, but you might actually need to have information in a deeper context.
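The key-separation warning above can be made concrete with a small sketch: identifiers are replaced by random tokens, and the token-to-identity key lives in a separate store. Everything here (store names, record fields) is a hypothetical illustration of the idea, not a production design; in practice the key store would sit on a different, access-controlled system.

```python
import secrets

# Pseudonymisation sketch: replace identifiers with random tokens and
# keep the token -> identity key in a SEPARATE store, as the talk warns.
key_store = {}        # imagine: encrypted, held on a different system
analytics_store = []  # pseudonymised records used for analysis

def pseudonymise(record):
    """Strip the identifier out of a record, returning its token."""
    token = secrets.token_hex(8)
    key_store[token] = record["email"]  # the re-identification key
    analytics_store.append({"id": token, "grade": record["grade"]})
    return token

token = pseudonymise({"email": "student@example.com", "grade": "A"})
# Analysts see only the opaque token; re-identification needs the key store.
assert "email" not in analytics_store[0]
assert key_store[token] == "student@example.com"
```

If both stores are breached together, re-identification is trivial, which is exactly the failure mode being described: pseudonymised data is only as protected as the separation of its key.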
00:44:40 So just remember the kind of data you're processing, and how. You're going to define what is personal data for your own organization. You're not told exactly what this is; it's in the regulation, but you've got to think about what it is you're processing, in terms of personal data and special personal data, which we'll come to in a second, what the risk is in relation to that, OK, and how that links to sanctions and fines. You've got to go through that equation. That's what we teach people at Henley. That's really important, understanding that. We've identified 15 different personal data types as a result of the research we've done. 15, not one or two, 15. The majority fall into two categories: either very high risk or low risk. There's not really anything in the middle. OK? It's either very high or low. But you need to make it clear so that everyone in the organization, everyone in the university, understands what's being processed and what's personal data, and you use those definitions consistently all the way through your data protection policies, procedures and training.
00:45:44 And all organizations are different.
00:45:47 OK? So that's a quick screen grab of the 15 data types. And that's something we go through. So they're the principles of data protection. Now a bit about personal data and special personal data, if you like. You may regard that as being sensitive, but the word sensitive is only used once in the GDPR. So we need to make sure we use the right language: that's special personal data. Things like sexual preferences or political beliefs or religious beliefs, that kind of thing, things which are really personal to someone. Special personal data carries a higher level of protection under the GDPR as well.
00:46:29 So if we're going to process special personal data, we need what's called explicit consent. OK. There was this whole debate about whether there's a difference between unambiguous and explicit: is it the same thing, is it just terminology? Well, actually, it's the context. So it's unambiguous for other types of processing which are not special, and if it's special, it's about explicit consent, because it has to be linked to the purpose, and that has to be clearly spelled out in detail. So you can imagine an operation to remove a kidney stone or something, you know, clearly requires explicit consent, because it's just for that. OK. So just think about the context as we think about that. And then you've got the individual rights that all of us in this room have.
00:47:18 And again very quickly because I've got another 20 minutes left. Right to be informed. That's article 13.
00:47:24 Right of access. Again, these are not absolute rights, so there are caveats to these; there are some exceptions, but actually they're limited in scope. So these are rights that people have: the right of access, or subject access request. The right to rectification. Let me just talk a little bit about subject access requests, because in fact after here I'm going somewhere else to talk about subject access requests. Subject access requests, for the Information Commissioner, are a litmus test. What I mean by that is, if you don't respond to a subject access request, where somebody wants to know what you're doing with their data, and of course it's now gone from 40 days to 30 days, 30 working days, if you don't do that, it tends to indicate that there are deeper problems within the organization. So it sounds very straightforward, responding to a subject access request, but it's very important. So the Information Commissioner is looking at complaints that people can make, but they're also looking at social media, because what tends to happen is people talk with each other. They have a bad experience.
00:48:34 So social media is sometimes the very first time that the Information Commissioner knows there's something that is not quite right within the organization. They're monitoring social media, so just bear that in mind, and as a university, of course, we encourage people to communicate, and they communicate through social media. So everything is out there. Everything's out in the open. So what I'm saying is we've got to be much more transparent and accountable in how we do things. So that was a bit about the right of access. Right to rectification: I've already covered that point, and a lot of lawyers forget to say this is not just about rectifying the record, it's about adding data to it.
00:49:12 Adding data to it. Right to erasure, the right to be forgotten, you've all heard of that one, yeah? So that was a case, and that principle has now been incorporated into the GDPR. That's going to cause a lot of problems, because an awful lot of organizations, and it may be you, it may not be you, don't know where all their personal data is. I want to ask the question: "Are you confident?" I've sat in meetings with some very big organizations where actually they've been worried about that. So if someone was to exercise their right to be forgotten, the right of erasure, could we actually deliver that? Can we find all the instances of their personal data and be able to deliver that? Because if we can't do that, it's actually a breach of the regulation. So that's something to think about. How do we get past that point? Well, you might need to think about some form of tagging of personal data so we understand where it is.
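The tagging idea suggested here amounts to keeping an inventory of where each data subject's personal data lives, so an erasure request can enumerate every system that must be purged. A minimal sketch under that assumption; the system names are hypothetical.

```python
from collections import defaultdict

# Sketch of a personal-data inventory: tag where each subject's data
# lives, so a right-to-erasure request can find every instance.
inventory = defaultdict(set)  # subject id -> systems holding their data

def tag(subject_id, system):
    """Record that `system` holds personal data for `subject_id`."""
    inventory[subject_id].add(system)

def erasure_plan(subject_id):
    """List every system that must be purged, and drop the inventory entry."""
    return sorted(inventory.pop(subject_id, set()))

tag("s123", "CRM")
tag("s123", "email-archive")
tag("s123", "admissions-db")
print(erasure_plan("s123"))  # → ['CRM', 'admissions-db', 'email-archive']
```

Without something like this, "can we find all the instances?" has no reliable answer, which is precisely the gap the speaker says turns an erasure request into a breach.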
00:50:12 The right to restrict processing. So we may not exercise our right to erasure, which is like pressing the nuclear button, and it's all gone.
00:50:22 We might think, actually, we're going to restrict the processing of personal data rather than press the button to delete it completely. So we have a right to restrict processing in certain situations. The right to portability, now this is very interesting, because this is new. This is the idea that we have personal data which is actually valuable, right? So we can use that to maybe get a better deal. How do we do that? Well, we can share our personal data with a competitor, maybe another university, and see what they can offer us in terms of what we're doing, how they compare, and we can allow one university to send the data to another university and allow that process to happen. So that's data portability, and there are certain caveats around that.
00:51:07 You know, it has to be machine-readable, that kind of stuff. But at the end of the day, this is going to be pretty basic, really. The financial services industry is already doing this. We'll probably end up doing it too in the higher education sector. So there's this whole culture of moving personal data backwards and forwards between different people, so that actually consumers, students, can make informed choices about what they can do with their personal data. Again, it goes back to what I said about, you know, 30 minutes ago: the genesis behind the GDPR was about freedom of choice, about products and services, about removing restrictions and barriers to entry in the market. So this kind of fits with that narrative. The right to object: we all have a right to object to having our data processed.
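The "machine-readable" requirement for portability is, in practice, an export in a structured format another controller can parse. A minimal sketch using JSON; the record layout here is a hypothetical example, not a prescribed format.

```python
import json

# Data portability sketch: export a subject's data in a structured,
# machine-readable format (JSON here) that another controller can ingest.
def export_subject_data(record):
    """Serialise one subject's record to portable JSON text."""
    return json.dumps(record, indent=2, sort_keys=True)

record = {
    "subject_id": "s123",
    "courses": ["MSc Data Science"],
    "contact": {"email": "student@example.com"},
}
portable = export_subject_data(record)
# Another university's system can parse it back losslessly:
assert json.loads(portable) == record
```

The round-trip assertion is the whole point: a PDF of the same data would satisfy a human reader but not the machine-readable requirement.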
00:51:59 So that's a very important right. It's not the only right, because you may be forgiven, if you didn't know the GDPR, for thinking the whole of the GDPR was about that, the right to object to being processed. It's not; it's much, much bigger. But that's part of it. And the right not to be subject to a decision based solely on automated processing, including profiling, OK, where that may create legal effects, etc. There's a right to have some sort of human intervention in that. I'm not going to go into great detail about that, but if someone's applying for a mortgage and is turned down by the computer, then actually that doesn't sit comfortably with the regulators. There ought to be some way in which that decision can be reviewed and challenged and discussed.
00:52:43 So what does this all mean? We've got the principles, we've got the rights of data subjects; we need a compliance framework, a data governance framework. So I was talking to my co-director, Professor Andrew Abassi, who reliably tells me he's just been appointed by Parliament to look at the performance of the government in terms of how it meets performance standards. It's going to be quite interesting. That will be published, and he's reporting to the Prime Minister. So I was talking to him yesterday about that, which was really interesting. Andrew's point to me was that we're talking in this program about data governance, not corporate governance. So we need to keep our terminology completely clear. And he's absolutely right. It's about data governance. So an effective data governance framework is essential in managing a complex value chain, and is evidence of the data controller's intention to deliver data protection by design and by default in accordance with Article 25.
00:53:43 Does that really bother you right now? Well, it might do. What it means is this: anything you offer in the market, digital or otherwise, must have data protection baked into it, not as an add-on, but baked in. And anything you're offering which doesn't have data protection baked into it is technically unlawful, and you can come a cropper, because if people were buying a product that didn't have data protection baked into it, the contracts for that could be void, or voidable. So I had an interesting discussion with Hazel Grant, who leads on data protection at Fieldfisher, the law firm, probably one of the top people in Europe on this, and she would agree. So this is something which, again, people have to have on their radar. My goodness, that radar is getting bigger. But this is important. So Article 25 is important.
Is there an internal disconnect between what we need to do and the technology that we're using? I'll leave that for you to work out.
00:54:44 Role-based access control in universities. I said this earlier about the principle of data minimization, for example, and the principle of least privilege, POLP. And of course, thinking about your governance, your data governance framework. Nothing is totally risk-free, OK? And the regulators understand this. The people who wrote the regulation understand this. So nothing's completely risk-free. What they're doing is looking at what measures you need to take which are appropriate for your sector, for your organization, OK? So that's quite important, the appropriateness of that, in order to mitigate those risks. So it's about your risk appetite. If you're in a regulated market, you've got a very low risk appetite, if not a zero risk appetite. If you're in an unregulated market, like a picture restoration business or an advertising agency or something, then you've probably got a larger risk appetite.
00:55:49 So they're judging you, in terms of your organizational and technical measures, on the basis of who you are and where you sit within that particular sector. Now from our point of view in HE, higher education, we've got codes, haven't we? Yes. So the way to demonstrate compliance with the regulation, from a regulator's point of view, is that you have a DPO, a data protection officer, which I would have thought applies to all of us; you've carried out data protection impact assessments, which I talked about a few moments ago; and you also comply with codes of conduct. Those are the three things that they're looking for from all of us in this room, as a way of demonstrating compliance. And of course, the code of conduct could be an ISO standard, or a BSI standard.
00:56:35 Interestingly, I don't know whether you know this, but there's a new British Standard that's just come out which is meant to help organizations comply with the GDPR. I'm in conversation with the guys; literally, it has just come out. So have a look. Go to the BSI website. You may find it useful. I haven't read the standard yet, but it might be worth having a look at.
So you need to think about codes of conduct. They won't necessarily make you compliant, but should there be a personal data breach, the culture of compliance they demonstrate is going to be a mitigating factor in your favor.
OK? So you've got to, as I said earlier, understand what is the very high risk, what things you need to do to mitigate that residual risk, and make sure it doesn't cause harm or damage. Now, what's quite interesting about the regulation is, again, it's not just damage, not just economic damage, it's harm or damage. So actually, you know, the bar is lowered in terms of getting over that harm-or-damage threshold; it's very easy to trip up over that.
And yes, it's BS 10012. It costs £120. So a data governance framework is pretty important: legal and regulatory requirements, a business governance framework specific to yourselves, best practice standards. This is all pretty sensible stuff. I would say that most of you are kind of on this already.
00:58:15 And understanding roles and responsibilities.
Some of the things you might think about doing: get an external adviser involved, because you've got individual accountability at board level. So even the person at board level... I had this conversation with David Tyler at Sainsbury's. He's the chairman of Sainsbury's. They take this pretty seriously, as you can imagine, Sainsbury's being a huge organization, millions of data records, and they own Argos, and they've got the store card, etc. So it's pretty important that the board know that whatever is being proposed internally within Sainsbury's is actually the right thing to do. So hence you might consider having some form of assurance from someone like myself or others who can give you that perspective, that what you're doing is actually in accordance with best practice. One of the things that you must also do, having got this machine built, is to keep it under review. Test it. Make sure your processes do work, you know? Test it, because otherwise you won't know how it works, and it may be too late if it has to happen for real and it doesn't work. So testing and practicing this kind of resilience approach, if you like, to processing personal data is absolutely critical. So you've got to practice, and that's part of your training regime, if you like, for those people who are processing personal data. And make sure your employees know what the incident management process is, because it is an incident, isn't it, before it becomes a breach. And the use of terminology, the use of words, is really important, because the clock starts ticking if someone in your organization sends an e-mail which has the word "breach" in it.
01:00:02 Because the regulator can see all this; they are entitled to, they have the power to look deeply into what you're doing. So be careful how you do that. The way to protect yourself, going back again to the point I made earlier about business continuity and risk, is training, so make sure people know how to go about doing this stuff.
01:00:21 And it's pretty much what you do at the moment: scenario planning and testing, incident detection, notification, investigation, analysis and resolution, OK? Very, very linear. Very, very straightforward. But make sure it's compliant and within the context of the GDPR. And we've already answered that question: it is 72 hours, so this is important. You've got until next May. Think about making sure that you have some form of oversight around this; that's important too. And on a final note, as I've come to the end of my time: there are things within the regulation called derogations, which is a very strange word. A derogation means that the member state has the opportunity to open the door, walk through the door and do its own thing. It doesn't mean an exemption that you can apply; you can comply within an exemption. A derogation means doing your own thing. And under this walking through the door and doing their own thing, member states can pass laws that will make breaches of the GDPR subject to criminal sanctions.
So they can do that. And Elizabeth Denham is thinking about doing that. So just bear that in mind. There are derogations that you will have to comply with; that's known as individual member state law. So it's not just about the GDPR; there are other rules and regulations. I haven't got time to talk about the new e-privacy regulation, PECR, for example, which will come out later next year. That's about using e-mail marketing and all that kind of stuff. So you're also expected to comply with other regulations and other directives.
And we gave evidence to... well, actually, the oversight for this is the Culture, Media and Sport Committee. And I won't even go into why they had that discussion paper. I said, well, put your thoughts together about the derogations, or exceptions, but a derogation is not an exception; you can come within an exception, whereas a derogation is something different, and there's all sorts of confusion in terms of terminology. So really speaking, those are the derogations within the GDPR. I'm sure you now realize that it's our responsibility.
I'll leave you with this, the final word from Elizabeth Denham: "The new legislation creates an onus on companies and organizations to understand the risks that they create for others and to mitigate those risks. It's about moving away from seeing the law as a box-ticking exercise, and instead to work within a framework that can be used to build a culture of privacy that pervades the entire organization." So good luck. And I'm around during coffee. Thank you.
01:03:58 Well, it's actually not the right question to ask. What we have to understand is that the GDPR puts responsibilities and duties on us as a university, both on an organizational basis and on an individual basis. So we are accountable for the way in which we deal with and process people's personal data. The GDPR doesn't protect us in that way, but what it does do, if you like, and so your question is almost, in my view, correct, is it creates a framework for us to think about how we do that. Next question.
01:05:02 Well, clearly, if we are processing people's personal data in accordance with the law, because we're under a duty to do that, in order to be compliant with the GDPR we need to give a data privacy notice to that person which clearly explains how we're going to use their personal data and for how long we're going to keep it, because keeping it is of course also processing. If those purposes are clearly identified and they have consented, if you like, they've acknowledged that that's what we're going to do with that data, then we're fine, OK? But they can always change their mind. So the question is: are there any other legal requirements that we have to comply with, H.R. rules or other rules, for example, in order for us to maintain that data? There might be, in which case that's perfectly legitimate, but I would make sure that we put that in the data privacy notice so they understand it.
01:06:23 One of the ways to think about it is: how would you feel if that was your data? How would you feel about it? It's quite a good test, actually. Would you feel that's a bit weird? Would you feel uncomfortable about that, or would you feel worried about that? Put yourself in the shoes of the data subject. Take yourself out of your university situation and put yourself in the shoes of the data subject. So basically, everyone's got a right to have their personal data protected in accordance with the law and processed in accordance with the law, to make it lawful, OK? So it's just being transparent about it.
01:07:19 And if they've left and you continue to hold their information, you're still processing that information.
01:07:28 So your policies and procedures, that's the key, OK? It's just being very, very clear about what happens should someone leave. What happens to the e-mail account? Maybe there's a reasonable period of time, so that, for argument's sake, after six months the account will be deleted and all the information will be deleted, OK? So you need to think about policies and procedures in accordance with the principles of data protection, which you now know in some detail.
Well, provided you've given that student a data privacy notice where you clearly explain what you're going to do with their personal data and what your obligations are, you're fine. It goes back to being transparent.
There is an exception for historical, scientific and statistical archiving purposes, so if you can fall within that exception as a lawful basis for processing personal data, you should be fine.
01:09:16 Well, processing is actually spelled out in Article 4; there's a definition there. It includes pretty much anything you do. Looking at someone's personal data on a screen is processing personal data.
01:09:52 What's interesting, before I go: if you've got an Alexa, that's sucking up huge amounts of personal data and special personal data. It can't tell the difference between your voice and someone else's voice, and that's processing personal data. No consent, no data privacy notice: it's totally in breach of the GDPR. As part of our program at Henley, we do a one-day workshop where we talk about artificial intelligence and the Internet of Things. It's one of those things to think about, because lots of these devices now have that, but you've got to bake data protection into that. So these things will change.