Trevor Hughes 25:19
Well, let me start with this: I think you're presenting a false choice. It's not an either/or. I think the reality is that privacy is a necessary design feature for any new, innovative technology. I want all of the things that you described. I want my passport to be deactivated automatically if I lose it, to make it easier for me to recover and harder for somebody else to use for identity theft. I would love to have an RFID tag in my jacket so that if I leave it at a bar, I might be able to find it later. Or a Bluetooth beacon on top of a prescription bottle, so I know whether my parents are taking the pills they need to take. Look, we want the digital economy to serve us, and to serve us in powerful and meaningful ways. We want it to make our lives better; technology should serve us. At the same time, we want to feel a sense of trust and engagement.

So let me use a couple of examples. First, I often describe going into a hotel room and switching on the lights. When I go into a hotel room and switch on the lights, I may be in a foreign country; I may never have been to that hotel before. And the light switch has a very simple user interface: it's binary, on or off, up or down. That's all I know. But when I flip that switch, I want the lights to come on; I want that function to serve me. Now think about all of the protections, code, and control that exist within that system. There was likely a licensed electrician who pulled the wiring through the wall. There was a building code that described what type of wire to use, how to route it, and how it needed to be connected to the switch box and the light fixture. The light itself probably has an Underwriters Laboratories seal on it because it has been safety tested somewhere. Basically, when I flip that switch, I don't want to start a fire, and I don't want something to explode in the room. I expect it to work, to serve me.
But there are lots of controls that exist within that system to help me trust that when I flip that switch, everything's going to be okay. Privacy is the same way. I want to open up my smartphone and be able to say, "Call my mom," and have my smartphone do that. But I don't want it to do something that would surprise me, that would feel creepy, that is outside the parameters of the engagement I have with that technology.

So that's the first example, the hotel lights. The second one is actually much more contemporary, and it's playing out right now around the world: contact tracing and exposure notification. Amid the COVID-19 pandemic, we are seeing an explosion of these tools. And in fact, the IAPP is going to have a phenomenal keynote conversation between Apple and Google up on our website soon, a little teaser, on this exact topic. But here's the thing. We have also seen, in the midst of the Black Lives Matter protests, efforts by local authorities to use some of those contact tracing mechanisms to identify which protesters were showing up at protests. Now that's a use of the technology that is 100% outside the realm of expectation of the person who may have downloaded or participated in a contact tracing app. And if that is indeed happening, then, again, the IAPP is not an advocacy organization, but I personally find it deeply offensive. So we need to ensure that we bake privacy into these technologies, so that we can embrace them, we can use them, and they are trustworthy.

And if you'll let me tell one more story, I have a great analogy to wrap this up with a bow. We often get this pushback in the field of privacy. I started over 20 years ago as a privacy leader and privacy lawyer inside the ad tech industry.
And without question, there was a lot of pushback every time we raised a restriction or something we would have to do that the product people or the marketing people desperately did not want to do. Then one day, the former Chief Privacy Officer of Microsoft told me a story, and it sort of fixed that dynamic for me forever. So here's the issue. I want you to think of brakes on a car, and I want you to think of what brakes do. The most common answer is that they slow cars down, right? That's what brakes are there to do: they slow cars down. It actually turns out to be exactly the opposite. When automobiles were first developed, brakes were not a concurrent technology. We had internal combustion engines and automobiles before there was an effective braking mechanism in vehicles. So much so that you couldn't drive fast. In fact, in the United States and elsewhere, there were laws about what you had to do when you were driving a car, because it was quite literally a death machine. There were so-called red flag laws in the United States, where someone had to walk in front of your automobile waving a red flag to warn people that you were coming down the street with this death machine. And your car could only go 10 to 15 miles an hour, because otherwise you would be putting yourself and everybody else at risk. So what happened when we introduced brakes? Well, cars could stop. But cars could actually go faster. Brakes allowed cars to go faster, because they were safer, they were better suited to the purpose at hand, and people, both driving them and around them in society, could trust them more. I think privacy is like brakes on cars. You might think that it slows you down. But in fact, if you do it right, it's a functional feature of the product you're introducing into the marketplace, one that builds trust with your customers and lets you go faster eventually.