
Privacy Insights in the Era of COVID-19

Mister Beacon Episode #111

Mr Beacon

By 2023, the share of the world's population subject to a national privacy law is expected to increase from 10% to 65%. Is this frightening or reassuring? What does it mean to us as entrepreneurs and as individuals?

This week, we have a fascinating conversation with Trevor Hughes, President and CEO of the International Association of Privacy Professionals, or IAPP. The IAPP is seeing exponential growth, with 57,000 members among its ranks and up to 1,000 new members added each month. These include giants like Amazon, Walmart, and Deloitte. The IAPP is the 'world's largest and most comprehensive global information privacy community'. They put on conferences, and act as a resource for knowledge assets and certification. In this episode, we delve into why privacy compliance is essential and the financial risk of getting it wrong: up to a staggering 2% of global revenue in fines for violations. Peeling back another layer of the onion, privacy has become a brand issue, with more people associating the word 'privacy' with the word 'trust'. Trevor Hughes describes privacy as being measured on a 'continuum of expectation'. Tune in to hear the discussion on how privacy can actually accelerate technological progress and innovation, how privacy differs across cultures and generations, as well as the implications of COVID-19 on privacy standards.


Narration 00:07

The Mr. Beacon Podcast is sponsored by Wiliot, scaling IoT with battery free Bluetooth.

Steve Statler 00:16

Welcome to the Mr. Beacon Podcast. This week we are going to be talking about privacy. Privacy is just so important for those of us in the world of IoT and beacons; we're dealing with some very sensitive information about where people are, where things are, what's happening. And really, this can be make or break for entrepreneurs, whether you're an entrepreneur in a big company or a small company: financial repercussions, massive fines, reputational risk. So we have to get it right. And that's why I am very pleased to have Trevor Hughes with us. He's President and CEO of the International Association of Privacy Professionals. Trevor, welcome to the podcast.

Trevor Hughes 01:07

Hi, glad to be here.

Steve Statler 01:09

So I would love to talk to you about a whole range of things, three buckets of topics. The first one is the state of privacy at the moment; things are changing. And I think it's worth having COVID-19 as an important second topic; it's kind of related to that, I guess. But also, it would be good, and maybe we should start with this, with you just introducing us to the IAPP, the International Association of Privacy Professionals. I couldn't believe it. I heard you talk the other day at a seminar. It was excellent. And then I started browsing through your website, and it is a cornucopia of resources. You're generous in terms of what you make available. Tell us a bit about your organization.

Trevor Hughes 02:06

Sure, so great to be here. Hi, Steve. So I'm Trevor Hughes, CEO of the IAPP. We are the world's largest privacy organization. The IAPP has been around for 20 years; we're celebrating our 20th anniversary this year. We are a not-for-profit professional association that represents all the people in the world who work in the field of privacy. Sometimes data protection is the way that we frame it, but privacy and data protection. So all the chief privacy officers, all the privacy engineers, privacy analysts, all of the privacy lawyers, they all belong to us. And we are unremarkable in the sense that there are lots of professional associations in the world. So the American Bar Association represents lawyers, the American Medical Association represents doctors, we represent chief privacy officers. What's remarkable about us, or what's different about us, is that we're a new field, a new profession; the IAPP is only 20 years old. And we are growing like crazy. As the issue of privacy explodes around the world, we too are exploding with it. So the pandemic has slowed us down a little bit, as it has many in the marketplace. But we are adding over 1,000 members a month. We currently have 57,000 members in over 140 countries around the world. We have 150 chapters in cities around the world, and we have over 25,000 people certified. And we do all the things that professional associations do. So big conferences; you mentioned our website, we have thousands of knowledge assets available for our members. And significantly, we certify individuals in the field of information privacy. And I'll just end with one important thing, and that is we are not a lobbying group. So the IAPP is not an advocacy organization. We don't go to Capitol Hill or Brussels or any of the big public policy centers around the world to make a case for a particular law or try to stop a law. Rather, we like to think of ourselves as a very, very big tent.
So advocates and academics and regulators and public policy professionals and industry, they all come under our tent, and they fight it out. We kind of like the fight, but we don't pick sides.

Steve Statler 04:23

Very interesting. And your members, I mean, these are top shelf: Amazon, Walmart, Ernst & Young, Deloitte. You have an incredible set of corporate members as well as many, many practicing professionals. What's the mission? It sounds like you're not trying to change the law, not lobbying to change the law. So what, why is it?

Trevor Hughes 04:50

So the mission is to define, promote and improve the privacy profession globally. I guess I started by saying we're not a lobbying group or an advocacy group. But there is one thing that we believe strongly that we actually do say actively in the world. And that is: you cannot do this with law alone. And you cannot solve privacy and data protection with a widget or a gadget or a piece of technology that you plug into your system. You need people who understand the issues; you need professionals. And so we very much are focused on the development, growth, and improvement of the profession of privacy and data protection around the world. So we take it as our mission to support all the people that are in the field doing that work today, wherever they might be, however they might be doing that work. We also take it as our mission to identify those pathways that can create entrances to our profession, whether that's through higher education, through training, or other means.

Steve Statler 05:54

And what are the certifications that you can get under your organization's framework?

Trevor Hughes 06:02

Sure. So we, like many professions, identify those people who have enough skill and knowledge to be identified as certified, as part of our profession. Our profession is an interesting one, because it's hybridized. It's not a single thing. 38% of our members are lawyers, which means that a majority are not: 62% are something else. They are marketing professionals, IT professionals, HR professionals, business managers and analysts. As a result, our certification schema falls into three major categories. The CIPP is the Certified Information Privacy Professional. And that is for people who are focused on the law and policy of privacy all around the world; we actually think that's requisite knowledge for just about anybody who works in the field. The CIPM is the Certified Information Privacy Manager. That's the second domain. And that basically assumes that there's a bunch of laws, but then they have to be given life within an organization. And that requires all of the tools of business management. It includes privacy impact assessments, data inventories, data flow audits. It requires measures and metrics so that you understand the performance of your program. So it's a management certification within our programs. And then finally, the CIPT is the Certified Information Privacy Technologist. The CIPT is for all those people who are working in IT and need to know enough privacy to get their job done. So it includes things like privacy by design, consent management controls, data subject access right controls; it includes all of the domains that an IT professional might need. So those three domains cover the waterfront for our certifications. And it's a pretty robust program.

Steve Statler 08:05

It's very interesting. Yeah, you cover multiple domains. And that says something about what we all need to consider when we're taking action within our own companies. You can't just delegate it to one person, clearly. So, last thing about your organization, just these qualifications: how long does it take? If I decided I wanted to be a privacy consultant, and I wanted to be focused on the technology side, how much studying is there, and what are the exams?

Trevor Hughes 08:38

So of course, the answer is it depends. It depends on your background, your education, what you have done previously, what work you have done. Our programs all have a two-day training program that you can pursue through us or any of our training partners around the world. But we do recommend additional reading through the textbooks that are associated with all our programs. And for someone who is coming completely cold to the field of privacy, you may need to spend a lot of time doing some studying, a couple of weeks or more. For someone who is a lawyer, or has been working as an information security or privacy engineer, you may be able to jump in and get through it much more quickly. Somebody who's been doing privacy for some time may not even need to do any training, and they can go straight to the exam. So the answer is, it really depends. One of the things that we are finding is that increasingly, universities, graduate programs, law schools, computer science programs around the world are adding data protection to their courses of study. I have a son who just a year ago finished his freshman year at Bentley University, and in his business ethics program they actually talked about data privacy. And that's a really big development, I think. So I think the amount of time necessary to train for our programs will actually reduce as we see future generations that have had data privacy and data protection as part of their core course of studies.

Steve Statler 10:12

Interesting. Let's segue now to some of these other topics we wanted to cover. But before we dive into those: what are the consequences of getting this wrong? What are the reasons why you're getting this tidal wave of people coming to you and certifying and joining? Can you talk about what the stakes are?

Trevor Hughes 10:36

Yeah. So without question, the stakes are high; the stakes are incredibly high today. And I wanted to say it's unfortunate; it's not unfortunate, but it is a fact, it's a reality, that often organizations are responding to privacy because they perceive the risk of privacy. Now, those risks come in a few forms. First and foremost, organizations have a compliance risk. There is a regulatory risk associated with privacy, whether you're operating in Europe and it's the GDPR, or you're in the United States and it's one of the hundreds of privacy laws that may apply even in the United States, or you're in Canada and it's PIPEDA or one of the provincial PIPA laws. Australia, New Zealand, Singapore, Hong Kong: many, many major jurisdictions around the world have privacy laws in place, and they have privacy regulators that are ensuring that those laws are being adhered to. So you have a compliance risk. Now, let's put an exclamation point at the end of that sentence, because that risk has been exploding recently. Many of you probably saw that last year Facebook was fined $5 billion, and that's with a B, guys: $5 billion, for privacy violations associated with the Cambridge Analytica scandal, but also related to a prior consent agreement that Facebook had had with the Federal Trade Commission. That sent shockwaves through the entire marketplace; not just the privacy field, but every CEO and board of directors around the planet sort of stood up and took notice when that happened. That one fine was more than double the total of all previous privacy fines from any jurisdiction in the entire world. In other words, it was really, really big. So we need to pay attention to the privacy regulators around the world, because they are enforcing, and they are enforcing actively. But privacy risk doesn't just come from existing privacy law. We're also seeing antitrust and competition law emerge as a risk vector.
For many organizations that have large aggregations of data, European regulators, Canadian regulators, and American regulators have started to target and investigate large tech companies and other data aggregators to see if there's a competition issue there. And I think that's a notable thing as well. One more big financial risk is that for those companies that are publicly traded, if you have a privacy risk, you need to disclose it as a material risk within some of your public filings within financial markets. So the same day that Facebook was fined $5 billion by the Federal Trade Commission, it was also fined $100 million by the Securities and Exchange Commission. And that was because Facebook had not disclosed appropriately the material risk associated with this privacy concern in their financial filings. So there are multiple vectors of financial and compliance risk. That's the first big bucket. Go ahead.

Steve Statler 13:50

Before we go on to the next bucket, just the financial thing, because that got my attention. And okay, the billion number is super scary, although I guess Facebook can probably afford it; they're probably not going to be filing for Chapter 11 as a result of that. And that was under GDPR, I'm assuming? The GDPR?

Trevor Hughes 14:11

No, no, no, no, it wasn't. That was not a GDPR fine. It was from the US Federal Trade Commission. And notably, notably, it was not under a national privacy law, because the US doesn't have one. The US doesn't have a national privacy law. So what had happened was there was a prior privacy agreement between the Federal Trade Commission and Facebook. And when the Cambridge Analytica scandal emerged, it turned out that that was a violation of the prior agreement that Facebook had had with the Federal Trade Commission, and that's why the fine was so big.

Steve Statler 14:48

So I think the thing that worries me is fines that we can't afford. And I guess it's probably bad for the American government or the European government to put companies out of business with a fat fine, so presumably the fines will be proportionate. But one can imagine them being incredibly painful, certainly embarrassing for any executive that needs to explain how they got the company into that situation. Now, what's your feeling about the size of these relative to the ability to pay? Because we covered GDPR, I guess maybe a year ago on this podcast, and I remember at the time we didn't really have a good sense of how aggressive enforcement was going to be. What are you seeing in terms of that level of aggression and how punitive these fines can be relative to the ability to pay?

Trevor Hughes 15:53

So let's start with a framing of the fining authority under GDPR. The regulators in Europe have the ability to bring fines up to 2% of global turnover, which is to say 2% of global revenue. And for a large organization, a large multinational organization, that's a lot of revenue, that's a lot of revenue that is subject to a fine. And I believe there are situations in which that may actually be increased. So the fines can be sizable. We are now two years into GDPR. And many in the regulatory world have identified a comparative lack of fines; we just haven't seen as many as we thought there would be from this massive piece of regulation, and we've been waiting and waiting and waiting. Well, it turns out that all of those data protection authorities were basically being given new tools for the first time; many of them had never had fining authority before. And so, just like every other organization in the world that had to build up their GDPR compliance efforts, the data protection authorities, the regulators, all 27 or 28 of them in Europe (27 now after Brexit, 28 pre-Brexit), had to build their enforcement capability. So they had to hire people and build the programs and structures for them to bring cases. As they started to bring cases, that takes time as well: they had to initiate investigations, conduct investigations, and negotiate the result with the investigated party. And then they have to report it up to a harmonization process at something called the European Data Protection Board, the EDPB. This was an important part of GDPR, a mechanism to try to create consistency in enforcement across Europe. All of those processes are now starting to finalize and play out. And so what we have experienced is basically a lag; there was a latency before the first wave of enforcement actions would start to come to the market. That's starting now.
So I would watch the next six months and thereafter, because we will start to see a steady drumbeat of these enforcement actions across Europe. We've already started to see some; notably, the Irish regulator, the Data Protection Commissioner Helen Dixon in Ireland, has a number of cases with large tech companies that are expected to come to light any time now. And so within the next weeks or months, we're all waiting with popcorn in the privacy field, because it's sort of an exciting moment. We're holding our breath, because these are going to be big announcements.
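As a rough sketch of the fining math discussed above: GDPR Article 83 caps administrative fines at the higher of a fixed amount or a percentage of worldwide annual turnover, with 2% applying at the lower tier Trevor mentions and 4% at the upper tier he alludes to as "situations in which that may actually be increased". The turnover figure used below is purely hypothetical.

```python
def gdpr_max_fine(annual_turnover_eur: float, upper_tier: bool = False) -> float:
    """Maximum administrative fine under GDPR Article 83.

    Lower tier (Art. 83(4)): up to EUR 10M or 2% of worldwide annual
    turnover, whichever is higher. Upper tier (Art. 83(5)): up to
    EUR 20M or 4%, whichever is higher.
    """
    fixed_cap = 20_000_000 if upper_tier else 10_000_000
    pct_cap = (0.04 if upper_tier else 0.02) * annual_turnover_eur
    return max(fixed_cap, pct_cap)


# A hypothetical multinational with EUR 70 billion in global revenue:
lower = gdpr_max_fine(70e9)                    # EUR 1.4 billion (2% tier)
upper = gdpr_max_fine(70e9, upper_tier=True)   # EUR 2.8 billion (4% tier)

# For a small company, the fixed cap dominates:
small = gdpr_max_fine(100e6)                   # EUR 10 million, not 2% (= 2M)
```

The "whichever is higher" structure is why the fixed caps matter mostly for smaller firms, while percentage caps drive the headline numbers for large multinationals.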

Steve Statler 18:43

Yeah, it sounds like a gladiatorial forum where you're looking for the first blood to be shed. So, okay, very significant financial consequences, and I interrupted you as you were about to move on.

Trevor Hughes 18:58

Yeah, so that's the first bucket. And look, most general counsels, VPs of compliance, VPs of risk, chief risk officers, they're going to have their radar lit up by the fact that there are major fines; certainly a risk and governance committee of a board of directors is going to care about that as well. But here's the really important thing: privacy is also a brand issue now. And if you are on the wrong side of privacy, that can affect your brand really substantially, really quickly. You know, as we went into the pandemic, I certainly felt a little bit of sympathy for Zoom, the video conferencing platform, because all of a sudden they went from going 10 miles an hour to going 1,000 miles an hour with the number of people accessing their platform, and it exposed a lot of privacy issues. And so they had to scramble really quickly, lest they lose the moment in which they had this opportunity to grow aggressively; that can really hurt your brand. Now, I think Zoom has done an admirable job to try to wrap themselves around some of the privacy issues that emerged. We're also seeing something interesting emerge in the marketplace, though, and that is organizations competing actively on privacy. Look at anything Apple says or does right now, and they talk about privacy being a fundamental human right. Look at anything Microsoft does right now, and they talk about their strong embrace of privacy and data protection around the world. It is not a coincidence that Microsoft, Google, Facebook, Apple, these major tech companies, are actually marching to Capitol Hill in Washington and saying we need national privacy legislation. There is more love for privacy in the commercial marketplace right now than there has ever been before. And that's because organizations have recognized that privacy is just another word for trust. And trust is the mechanism through which you make your brand sticky.
You bring customers to you, and you keep them for a long time. And so there has been sort of a lightbulb moment among CEOs, chief marketing officers and others who have said: Oh, I get it, this privacy thing just isn't about complying with the law. It's actually about doing things that our customers expect, treating them with respect, and making our relationship sticky and long-lasting and profitable for us and meaningful for them. So watch privacy expand as a competitive issue, because if you get it wrong, you can rest assured that your competitors are going to beat the heck out of you in the marketplace for being on the wrong side of the issue.

Steve Statler 21:53

And, on one hand, I love Apple products, and I think Tim Cook is an amazing CEO. I kind of feel like he's using this privacy thing as a weapon, though. It's almost a cynical way of poking at Google, competitively.

Trevor Hughes 22:13

Look, it certainly is a brand and product feature of Apple, and they are embracing it strongly. If that plays to their benefit competitively in the marketplace, I'm sure they're not upset by that; I'm sure that is part of the strategy. But again, if you're an organization that is struggling or does not have a good story to tell with regards to privacy and data protection, your competitors are going to eat you for lunch. And whether they're doing that in a mercenary way, as sort of active market combatants (you said gladiators, let's use that metaphor, as gladiators in the arena), or whether they're doing that because they really believe privacy is a human right and they're embracing it, the result is the same. And that is, you get eaten for lunch.

Steve Statler 23:09

Well, this is the thing, though. On one hand, I want privacy too. But what I hate to see is good ideas killed for frivolous reasons, because we're kind of being driven in a direction because of, you know, the 1% of people who really do read the small print. And in the past we've talked about how privacy is so challenging because we're all connected. So only 1% of the people read the small print, and they read your privacy policy, and they spot that you're really either not living up to it, or you have a very weak one, and then they tell everyone else, and then suddenly you've got a product boycott on your hands. So that's, I guess, wallowing a bit more in what goes wrong. But, you know, this podcast is very much about the Internet of Things and tagging things, and I get excited about it because I'm an optimist. I see how you could have a tag in your passport for when you lose it; and I don't know if you've ever lost your passport, but it's like the worst feeling in the world, it's just horrible. There's things we can do that really make life better. And some of it can be sailing very close to the wind: we're looking at putting tags in pill bottles, so you open them up and you can keep track of whether your loved ones are taking their medicine; there's a privacy issue there, potentially. You put tags in people's clothing; they leave the jacket in a bar, they can find it. So I feel like if we get driven into a privacy Puritanism that is completely unrealistic, we kill the potential of what we can achieve. So that's really just me venting. But what advice would you give to us, this community, about how we can unlock some of these very, very cool things and not get stifled because we're worried about being fined or boycotted, because some small fraction of people find what we're doing offensive?

Trevor Hughes 25:19

Well, let me start with: I think you're presenting a false choice. It's not an either-or. I think the reality is that privacy is a necessary design feature for any new innovative technology. I want all of the things that you described. I want my passport to be deactivated automatically if I lose it, to make it easier for me to find it and for somebody else not to engage in identity theft. I would love to have an RFID tag in my jacket so that if I leave it at a bar, I might be able to find it later. Or, you know, a Bluetooth beacon on top of a prescription bottle, so I know if my parents are taking the pills that they need to take. Look, we want the digital economy to serve us. We want it to serve us in powerful and meaningful ways. We want it to make our lives better; technology should serve us. At the same time, we want to feel a sense of trust and engagement. So let me use a couple of examples. First of all, I often describe going into a hotel room and switching on the lights. Now, when I go into a hotel room and switch on the lights, I may be in a foreign country; I may never have been to that hotel before. And the light switch has a very simple user interface. It's binary: on or off, up or down. And that's all I know. But when I flip that switch, I want the lights to come on; I want that function to serve me. But think about all of the protections and code and control that exist within that system. There likely was a licensed electrician who pulled the wiring through the wall. There was a building code that described what type of wire and how you routed it, and how it needed to be connected to the switch box and to the light fixture. The light itself probably has an Underwriters Laboratories seal on it, because it has been safety tested somewhere. Basically, when I flip that switch, I don't want to start a fire, I don't want something to explode in the room; I expect it to work, to serve me.
But there are lots of controls that exist within that system to help me trust that when I flip that switch, everything's going to be okay. Privacy is the same way. I want to open up my smartphone and be able to say, call my mom, and have my smartphone do that. But I don't want it to do something that would surprise me, that would feel creepy, that is outside of the parameters of the engagement that I have with that technology. So that's the first example, the hotel lights. The second one is actually much more contemporary, and it's playing out right now around the world. And that is contact tracing and exposure notification within the COVID-19 pandemic. We are seeing the explosion of these. And in fact, the IAPP is going to have just a phenomenal keynote conversation between Apple and Google up on our website soon; a little teaser on this exact topic. But here's the thing. We have also seen, in the midst of the Black Lives Matter protests, efforts by local authorities to use some of those contact tracing mechanisms to identify which protesters were showing up at protests. Now, that's a use of that technology that is 100% outside of the realm of expectation of the person who may have downloaded something or participated in a contact tracing app. And if that is indeed happening, again, the IAPP is not an advocacy organization, but I personally find that deeply offensive. And so we need to ensure that we bake privacy into these technologies, so that we can embrace them, we can use them, and they are trustworthy. And if you'll let me tell one more story, I have a great analogy to sort of wrap this up with a bow. So we often get this pushback in the field of privacy. I started over 20 years ago as a privacy leader and privacy lawyer inside the ad tech industry.
And without question, there was a lot of pushback every time we raised a restriction or something that we would have to do that the product people or the marketing people desperately did not want to do. And then one day, the former Chief Privacy Officer of Microsoft told me a story, and it sort of fixed that dynamic for me forever. So here's the issue. I want you to think of brakes on a car, and I want you to think of what brakes do. And the most common answer is that they slow cars down, right? That's what brakes are there to do: they slow cars down. It actually turns out to be exactly the opposite. When automobiles were first developed, brakes were not a concurrent technology. So we had internal combustion engines and automobiles before there was an effective braking mechanism within vehicles. So much so that you couldn't drive fast. In fact, in the United States and elsewhere, there were laws about what you had to do when you were driving a car, because it was quite literally a death machine. There were things called red flag laws in the United States, where someone had to walk in front of your automobile waving red flags to warn people that you were coming down the street with this death machine. And your car could only go 10 or 15 miles an hour, because otherwise you would be putting yourself and everybody else at risk. So what happened when we introduced brakes? Well, cars could stop, but cars could actually go faster. It allowed cars to go faster, because they were safer; they were better suited to the purpose at hand. And people, both driving them and around them in society, could trust them more. I think privacy is like brakes on cars. You might think that it slows you down, but in fact, if you do it right, and it's a functional feature of the product that you're introducing into the marketplace, you build trust with your customers and go faster eventually.

Steve Statler 31:47

Yeah, I think that's a very good metaphor. And I think without legislation and standards, we end up with the craving to have the guy with the flag in front of us, because we're so terrified that something bad's gonna happen. But if we have sensible frameworks in place that are consistent, then that helps. And of course, they're not consistent. I mean, America is a very inconsistent country: 50 different legislatures, and so forth. So I'm tempted to talk a bit about that. There are so many things I wanted to talk to you about, but let's talk about that inconsistency now. How much of a practical issue is it? Because we have, you know, this California privacy legislation, which is groundbreaking, and we have the GDPR. How much fragmentation is there around the world? Because typically we work with technology companies; very few of them are just focusing on Europe. They want Europe and the US, and they probably want South America eventually, and China. How much of a deal is that fragmentation? Should we be focusing on the highest common denominator in terms of the best standards: full disclosure, transparency, security and looking after the data? How feasible is it to bring those different frameworks together and have one policy that doesn't create a nightmare for us?

Trevor Hughes 33:27

So Steve, I have good news and bad news, which would you like first?

Steve Statler 33:30

Oh, give me the good news.

Trevor Hughes 33:33

Okay. So, here's the good news, and that is that the GDPR has clearly become a global framework. If your organization, the organization that you're working in, is subject to the GDPR, that clearly, around the world, has become a baseline expectation of operations. And many major laws around the world, like the LGPD in Brazil, which goes into effect in August of this year, are largely based on the GDPR. And I think that model will continue. So one of the greatest exports of the European Union right now is their regulatory frameworks, and the GDPR is a good example of that. So in terms of a highest common denominator, lowest common denominator, common framework: I think many organizations today are starting with GDPR. So that's the good news. The bad news is that it's never going to be enough. So the GDPR is always a starting place, but then there will be jurisdictional variation all around the world forever. Privacy laws emerge out of the cultures that write those laws. And it turns out that privacy is a culturally normative value. So privacy is different for a person in India than it is for a person in Mexico, than it is for a Canadian like myself, than it is for a Belgian. And that is just a fact. So the laws are always going to be different around the world. So as much as GDPR has become a common framework, I think it is very clear that the complexity and the volume of laws that we need to deal with around the world, and the number of variations and derogations to GDPR, those are going to increase as well. Gartner, the major research firm, did an interesting study. In January of this year, they estimated that 10% of the world's population was subject to a national privacy law. They also estimated that within three years, by 2023, 65% of the world's population would be subject to a national privacy law.
What that tells me is that the complexity of our current world is likely to get multiplied six and a half times in the coming three years. And we shouldn't run and hide from that; we should be building robust, foundational platforms of compliance and management within our organizations, so that we're ready for it, because it will catch some organizations flat-footed, to be sure.

Steve Statler 36:05

Interesting. At a housekeeping level, do we have a hard stop at the top of the hour? Or can we go a little bit longer? Okay, that's good.

Trevor Hughes 36:13

We can go longer if you'd like.

Steve Statler 36:15

I'd love to, because there's still stuff that we need to get through that's important. I still want to talk about COVID. And we've been talking a lot about what's topical, and I guess there's nothing more topical than, you know, what's happening at the moment: Black Lives Matter, the tragedies on the streets. We're recording this on June 10. So the world is kind of marking a major transition in moving from slavery to something better, although far from perfect. I do want to go back to this point that you raised earlier about facial recognition. I think it's fascinating, because on one hand, it's clear that we shouldn't be using it to persecute people exercising their right to free speech. But the flip side is terrorists, who you are trying to track, who have murderous intent. Facial recognition can be very useful for that. So how do we balance that? Because I'm a little concerned that, you know, some of this technology is going to be turned off when it's required to keep us safe. And yet, on the flip side, I don't want people to be terrorized because they're doing something they should be doing, which is pointing out terrible injustice.

Trevor Hughes 37:50

Yeah. So let me start by saying that, first of all, I don't have the answers to many of these questions, and some of these, I think, do not have clear answers yet. But let me give you a little bit of the lay of the land, and then come back to this tension, this balance, between the use of technology for public security and a sense of security individually, and individual rights and liberties and the value of the right of privacy, the human truth of privacy. So first of all, facial recognition is an incredibly, incredibly contentious technology in the marketplace today. It's in the wild already; there are broad platforms that exist where organizations are actually scanning social media and other online services to develop archives for facial recognition. So there are technologies in the marketplace today. We are concurrently seeing moratoriums and bans on facial recognition, in some cases from municipal governments up through states and smaller jurisdictions. And then notably, in the past couple of months, some of the major tech companies, Microsoft, Amazon, and others, have stepped up and said, we are either going to put a moratorium on facial recognition in its entirety, or a moratorium on selling facial recognition tools to national governments. The tension there is the tension that I described previously with the COVID-19 contact tracing and exposure notification: when we develop a technology for a single purpose, and I think everyone would agree that public safety and stopping terrorists before they can do harm in society is a valid and worthwhile goal for society to pursue, I think we could agree that if we could contain it in a box, and it would never, ever come out of that box, that technology makes sense for that purpose, or could be used legitimately for that purpose. But it almost never can be contained within that box, and there is almost always scope creep.
There are mechanisms through which that technology gets used to impinge on other rights. And it took us more than 10 years after 9/11 for Edward Snowden to basically disclose to the world that the NSA was doing some data collection, things that didn't seem consistent with the law, or with the expectations of consumer or citizen privacy, within the United States and beyond. I think there are many who look at facial recognition and say, look, before we have to wait 10 years after the implementation of a technology to find out that it was misused, let's put some meaningful controls on it from the outset. Let's understand how it's going to be used, and let's make sure that we have an architecture of governance that wraps around it, so that those organizations that do use it are accountable for its use, that they are held to account to only use it for the purpose for which it was developed and launched, and for no other purpose. So again, I don't have a perfect answer to this. But I do know that better accountability, better governance, better control, better transparency are all going to be important tools as we try to embrace these emerging technologies. Because the other thing I think that history has shown us is that a technology that has been created is very hard to contain once in the wild. So it's not like we're going to be able to pull facial recognition back. And certainly, given the number of governments around the world, these technologies pop up all around the world. And in a global digital economy, that means it's all around the world, all the time.

Steve Statler 41:48

Yeah, it's a challenging thing. I went through immigration just before this whole COVID thing erupted. They were shutting down the flights from Europe; I had to switch flights to get back into the country. But once I went across the border, they used facial recognition. I didn't have to use my fingerprints. So much more accurate, so much better. I got through super quickly. But yeah, hopefully, I mean, the telephone can be used for terrible things from a privacy perspective, and you just talked about some examples there, but we haven't banned the telephone; we just have regulations around it. So hopefully some of the professionals in your organization can help us navigate that balance. And you used the word balance, and I think you probably use it a lot, because it seems like that's a big part of your professional lives, helping to exercise that balance. I thought about it particularly yesterday, actually. I was on a call with the Sustainability Consortium, which is actually a pretty big group; Walmart and many large brands are part of this thing. And they have this very cool project called WEAR. And WEAR, spelt W-E-A-R, is about putting tags in clothing to make the clothing more sustainable, to move from fast fashion to sustainable fashion, so that you can pass your garment on from one person to another. And part of that is having tags in the clothing that can measure when they're worn and how often they're worn. How many things do you have in your wardrobe that you haven't worn for a year? There's probably, I would guess, unless you're unusual, a significant percentage of stuff that you don't wear. So there are clearly some privacy issues there. And so they had done a survey, and I thought it was fascinating.
And they'd asked, how much do you see benefit in these applications, being able to get advice on what you wear and to help save the planet? Because clothing manufacturing can be very toxic to the environment, so there's a lot of good potential there. And then, of course, they asked the question about privacy. And I was pleasantly surprised that the answer was, you know, moderate concern, so not high concern. But of course, you know who the audience was: it was a study done by some great universities with students, who are young. And so, you know, you have this balance between young people, who are sharing incredibly intimate pictures of themselves on social media, and then old fogies like me, some of whose friends don't even use social media. So how do I balance a strategy and address those two constituencies? And maybe a follow-up is, I guess, do we have to expect that this will change as old people die and these younger people who have been born with the internet take over?

Trevor Hughes 45:16

So, boy, a bunch of answers there. First of all, the framing of the question to the group in that survey probably is really important. If you frame it, hey, we're going to do all these great things for you, and we're going to use your data for all these purposes, might you be okay with that from a privacy perspective? You know, my answer is probably going to be the same: yeah, privacy is a moderate concern, but if you're trustworthy, and you're doing the right things with it, great. If you then said, well, hey, we're going to monitor what you're wearing, and if you're wearing a lot of something, that's going to become a marketing metric for us, and we're going to send you messages to buy more of whatever it is that you were wearing a lot of, you know, I might have a bit more sensitivity. And then if you said, hey, if you're wearing a particular type of clothes, we're going to sell your data forward so that others can contact you, the answer probably would be, heck no, you're not going to do that. So privacy is always on a continuum of expectations. It's always a dynamic and fluid reality. That's the first thing. So how you frame those questions is incredibly important. And for all of your listeners, I would highly recommend a law review article, a piece written by Jules Polonetsky and Omer Tene, called "A Theory of Creepy." So many people around the world have tried to wrap their heads around the idea of privacy, why it's so important, and when you cross a line. And what Omer and Jules did in their piece was say, you know, when we ask people how they feel when privacy is violated, the word that they often use is that it feels creepy.
And so what you don't want to be is creepy. You don't want to be creepy. You don't want your customers to feel surprised or shocked, or to feel like you're doing something with their data that they didn't expect in the terms of the deal that they negotiated with you. It's the classic story from before, when we had flashlight apps on our smartphones: there was a flashlight app that was collecting location data. And when people found that out, they were like, well, wait a minute, this is a flashlight app. All I needed to do was switch on the flash on my smartphone so that I could put my car key into the car door. You shouldn't need my location data. That's creepy. So that's the first thing: don't be creepy. The second thing is, we should always recognize that privacy has individual variability. So Steve, you and I probably have different expectations of privacy in different contexts in our lives. It turns out that I run the IAPP, the world's largest privacy organization, and I would put myself sort of in the middle of the pack, in the middle of the peloton, of privacy sensitivity. My wife is way out ahead of me; she has real sensitivity around her personal privacy. And that's not right or wrong. It's just a fact. And so know that there's individual variability as well. That's one of the reasons that providing your customers not with forced choice, where they have to read a ton of stuff and make decisions about things that they don't necessarily understand, but rather giving them control panels behind the scenes, so that they can play with the dials if they really want to, is a really powerful idea. The last point I'll raise is that, yes, there is generational difference, age difference, when it comes to expectations of privacy and personal or generational concerns about privacy. But don't ever think for a second that the youth today do not care about privacy.
Because it turns out that every sociological study that's been done shows that they do. They do, but they actually are using these technologies in a much more native way than we ever did. And so young people today probably are more facile with these technologies, and have a greater sense of them, than we might have ever had. And they're embracing them on their own terms. But without question, privacy is something that they care about. And they will use the same word that I started with, which is creepy, and they will switch off apps, programs, platforms when they find that they're violating their privacy.

Steve Statler 50:03

I really like this advice you're giving; it's so reasonable. But I am glad I've got you on the show, because I do have a bone to pick with you and the privacy profession. GDPR came in, in my mind, undoubtedly a good thing, with the exception of those cookie notices on websites. It just drives me insane. I say that sort of in jest, but it does drive me crazy, seriously. And part of me thinks, is this a legacy? Have we gone beyond this? I think we all know that websites have to try and track us to personalize the experience. So how do you feel about that notice? Are you like, yes? Or are you like, really?

Trevor Hughes 50:57

Well, um, how do I feel about that notice? I will admit there is a sense of defeat. How about that? When I started as a privacy professional over 20 years ago, I mentioned that I was heading up privacy at a tech company, an ad tech company. So we were doing behavioral advertising in 1999, 2000, which was some of the earliest behavioral targeting, preference-based targeting, on the web. And guess what? We were arguing about cookie consent. We were negotiating with regulators, the FTC, state AGs, with the European Parliament, on what is the appropriate level of consent associated with cookies. And I personally, in 2002, 2003, was in Brussels going around with a super heavy laptop, showing members of the European Parliament what the web would look like if we had to have consents pop up prior to the use of cookies. In fact, I was so sophisticated, I used the local papers and brought up a page of sports scores for the local football team, the soccer team, for whatever MEP it was that I was talking to. And I showed how many cookies were on that page. The debate is exactly the same today. So there is a sense of frustration and futility, almost a sense of defeat, that we haven't solved this yet. And I'm both disappointed and defeated by that. But I also think there's a lesson in it. And that is, there is a concern there that we have not yet solved: there is a lack of transparency, a lack of awareness of how that data is being used. And quite frankly, if you look at the ad tech world and the complexity of it, the way that data moves around, the way that your eyeballs are bought and sold through ad exchanges, I think there remains a reckoning that will occur with regard to ad tech and to the use of cookies. With that said, state management is absolutely critical. The web does not work without tools like cookies.
And so I do worry that if we end up with regulatory or legislative barriers to the use of cookies as a particular technology, then we will find that we move towards other state-management tools that are less transparent, less manageable, less controllable. And that's not necessarily a good thing.
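Trevor's point, that cookies are ordinary state management and the real question is whether their use is consented to and transparent, can be sketched in a few lines. This is a hypothetical illustration, not anything from the IAPP or a real consent-management product; the `consent_given` flag stands in for whatever a real consent platform would record.

```python
from http.cookies import SimpleCookie
from typing import Optional

def build_set_cookie(consent_given: bool, session_id: str) -> Optional[str]:
    """Return a Set-Cookie header value only if the user has consented.

    `consent_given` is a hypothetical stand-in for a stored consent record.
    """
    if not consent_given:
        return None  # no consent recorded, so no state is stored
    cookie = SimpleCookie()
    cookie["session"] = session_id
    # Attributes that keep the cookie limited and transparent:
    cookie["session"]["samesite"] = "Lax"  # not sent on cross-site requests
    cookie["session"]["httponly"] = True   # hidden from page scripts
    cookie["session"]["max-age"] = 3600    # expires after one hour
    return cookie["session"].OutputString()
```

The state-management mechanism itself stays, which is Trevor's point; what changes is that writing the state is gated on an explicit, recorded choice, and the cookie carries attributes that bound its scope and lifetime.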

Steve Statler 53:36

It is interesting. Okay. Before we wrap up, I do want to spend a few minutes on COVID-19, and just get your executive highlights on what the implications are for privacy. You have this Ernst & Young report on your website; there's a lot of data, and anyone can access it. But, you know, how has COVID-19 impacted us from a privacy perspective?

Trevor Hughes 54:01

Like so many other things in the world, the pandemic has slowed things down. So from a legislative perspective, from a regulatory development perspective, even, I would say, from an enforcement perspective, it's just become harder to do things because we're all working virtually, and much of that is slowing down. There are other initiatives distracting public policymakers, too, so we have seen fewer new laws and other things emerge. However, enforcement is still happening, new laws are still happening, and the trajectory for privacy is ascendant and more complex. So don't think for a second that COVID-19 has ripped the guts out of this issue and that it's going to go away and die. Absolutely not the case. And I fully believe the Gartner report that says within three years, 65% of the world's population will be subject to a national privacy law. So that's privacy writ large: slowed down for a little bit, and the road may be a little twisty and turny and bumpy, but make no mistake, it is upwards and ascendant. In the future, this issue will only get bigger, and your organizations will have to deal with it. Then, specifically with COVID-19 itself, it turns out that in the digital economy, privacy is almost always the first social issue that needs to be tackled in order for us to get to the functionality that we hope to get to. So as COVID-19 emerged, of course, many of the responses that we have are heavy uses of data. So whether it's health screening at airports, or your places of employment, or retail establishments, you're gathering health information. How are you gathering that? How are you storing it? What are you creating? Where is it sitting? Who gets access to it? What other uses might there be? Health screening is an absolutely massive issue. We saw a proposal initiated, I think it was in Washington state, where restaurants would have to create a log of everyone who came into the restaurant and when they were there.
Now, that may sound right from a contact tracing perspective: you want to know who was sitting in that restaurant, and if someone is diagnosed, you can see when they were at the restaurant and who might have been around them at that time, so you can go back and notify them. But what if you are having lunch with a lover, or a mistress, or someone that you don't want the world to know you're having lunch with, and that documentation is sitting there? What if the fact that you were at that restaurant is not something that should be shared broadly? There weren't really controls on that; we've got to build those privacy controls. Contact tracing, or exposure notification as it's also called, that's the extension of contact tracing that's going to show up on our smartphones soon, will be a necessary component of our solutions to the pandemic. And we're going to have to figure out the privacy implications: there will be a massive, urgent desire to use that rich trove of data for things other than contact tracing and exposure notification. And this is a Bluetooth tool, so it's something that the IoT world may actually intersect with pretty actively. Contact tracing, I can tell you now, will have massive privacy issues associated with it. Google and Apple have released a shared platform, a Bluetooth-based platform, that has privacy controls baked in, but watch that space. Here's one more, and that is immunity passports. Before we get a vaccine, one of the ways that markets will reopen is through individuals having been diagnosed and then receiving an immunity passport, because they have antibodies and they are immune from COVID-19. The privacy controls around those immunity passports, the way that we distinguish, online and in person, who has them and who doesn't, who gets access to things: there are many, many privacy issues there as well. The list goes on and on and on. We created an issue-spotting tool for our members.
And there are at least 20 unique issues. It even goes into, and this is how detailed and complex it gets, local courts. So if you or I were arrested for speeding or something, and we had to show up in court, that used to be done with a lawyer, and you could speak in the hallway privately with that lawyer. Today that's being done over Zoom; it's being done over video conferencing platforms. And the attorney-client privilege that previously existed in our private conversation in the hallway, where the lawyer may say, so what are you thinking? Do you want to negotiate? Should we do this? Take a fine? What do you want to do? That conversation can't happen in the way that it used to happen, and it's happening in hearings in real time, in front of a judge and with others on that video conferencing platform. We have many privacy issues in the pandemic. And for me, it's just an example of how privacy is never going to go away. It is a persistent, eternal negotiation within society, because it defines the boundary between the individual and the larger society. And as long as that boundary exists, and it has for all of human history and will for all of human history, we will be negotiating that boundary.
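The "privacy controls baked in" to the Google/Apple Bluetooth platform that Trevor mentions rest largely on rotating broadcast identifiers. The sketch below is a heavily simplified illustration of that rotation idea using an HMAC; the real Exposure Notification specification uses different primitives (HKDF and AES) and key schedules, and the function names here are hypothetical.

```python
import hashlib
import hmac
import secrets

def daily_key() -> bytes:
    """A fresh random key, generated once per day and kept on the phone."""
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier for one time interval.

    The derivation is one-way, so observers who collect broadcast IDs
    cannot link them to each other or to a device unless the daily key
    is later published (e.g. after a positive diagnosis).
    """
    msg = interval.to_bytes(4, "big")
    return hmac.new(day_key, msg, hashlib.sha256).digest()[:16]

key = daily_key()
# IDs rotate each interval, so a passive tracker cannot follow one device:
id_a = rolling_id(key, 0)
id_b = rolling_id(key, 1)
# A health authority that later receives `key` can regenerate id_a and
# id_b to check them against recorded contacts, the matching step that
# makes exposure notification work without a central location log.
```

The privacy property Trevor alludes to falls out of the rotation: the identifiers broadcast over Bluetooth are meaningless on their own, and only become linkable when the owner voluntarily publishes their keys.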

Steve Statler 59:52

There are so many issues. I think one of the ones that your organization pointed out is the fact that we're working from home more. And so suddenly, you're talking to someone about something very confidential, and that conversation could be echoing around the halls of an environment where teenage kids could be listening. And I don't know what we do about that.

Trevor Hughes 1:00:18

Yeah, look, it's another very good example, and it's one of the many that we identified for members and that we are exploring. There's a fantastic academic named Helen Nissenbaum at New York University, and she wrote a book called Privacy in Context. And really, the thesis of the book was that, for much of our analog history, we had contexts of privacy. There was a context of privacy at the workplace; there was a context of privacy on Main Street, in our towns and cities; there's a context of privacy in a locker room, in your living room, in your bedroom, in your bathroom. There are all these different contexts of privacy, and while we didn't necessarily have laws, we had normative values in those places as to how to manage privacy. One of the things the digital revolution has done is break down the boundaries between those contexts. And what you're describing is the decontextualization of the workplace. So if I'm working from home, where am I? Am I at work, or am I at home? Which privacy rules control? I don't mind sharing with you and your listeners that when I'm at home, I'm in my boxer shorts. Now, that's clearly not appropriate in a work-from-home context, right? I should be wearing pants. You can't tell right now, but I am wearing shorts. But that is a context of privacy, and we are blurring those boundaries. It's a context of privacy that's been blown up because of the decontextualization driven by the digital economy, and now exacerbated by the pandemic, because all of us are at home, and we are working at the same time.

Steve Statler 1:02:10

I have to say, I was expecting our privacy standards as people, you know, what we were willing to accept, to be moderated by this, but it doesn't seem that's the case. I mean, I grew up in England, where there are cameras everywhere, and, you know, it wasn't a big deal, because it can be rough out there, and if it keeps you safe, then you're willing to accept that. I feel like that standard is less willingly accepted in this country. And I thought that because this was a matter of life and death, people would be willing to give up more information about who they had met, because sharing that information could mean thousands of people's lives saved. I am somewhat surprised at the reaction to these apps that I thought people would welcome. And I don't know whether that's a sign that privacy really is more important than life, or is it just a general lack of acceptance that the stakes really are as high as they are? What are you seeing in that context? Do you think our standards are being raised or lowered? Or what's happening?

Trevor Hughes 1:03:32

Um, I don't think they're being raised or lowered. I think they're actually being incredibly consistent. There are normative variations to our expectations of privacy, but generally, we want technologies to do the things that technologies are supposed to do for us; we want them to make our lives better. But if it steps over into creepy, or if there's a lack of transparency and we don't understand what's happening, or if there's a clear violation of our expectations, and the contact tracing app on my phone is now documenting the fact that I'm at a Black Lives Matter protest, I care a lot. And if those controls aren't built in from the very beginning, well, I might be a little hesitant to embrace that technology out of the box, if there is not a sense that those types of controls and protections exist. And I will note that there have been at least two COVID-19 privacy bills introduced on Capitol Hill to try to wrap around exactly some of these issues, to provide some boundaries within which data use can be robust for addressing the COVID-19 crisis, but also proscribed, so that data use doesn't exceed what our expectations might be. So it really does come back to: how are we using these technologies to make our lives better, and are we ensuring that those technologies are within those boundaries of consumer and citizen expectations, that they're not straying into a world of being creepy, and that they really are making lives better all around the world? That's a constant negotiation that's never going to go away.

Steve Statler 1:05:21

Well, I think that message has come through loud and clear. You've given some really good advice, and it's been very engaging. I've got to thank you very much. Trevor Hughes, President and CEO of the International Association of Privacy Professionals, we really appreciate your spending time with us.

Trevor Hughes 1:05:41

Steve, it was great. Thank you.
