Take on Tomorrow, the award-winning podcast from our management publication strategy+business, brings you Episode 1: “What’s the price of privacy?”
Data is one of the most valuable and most dangerous resources a business can collect. Though it helps businesses better understand customers, it also exposes concerns about privacy, security, and trust. And if trust is lost, it is very difficult to earn back. So, does taking privacy seriously make for better business? In this episode, we’ll hear lessons learned from working across high-level privacy and cybersecurity concerns, and find out why security is more important than ever for business leaders.
Lizzie O’Leary, podcaster and journalist
Ayesha Hazarika, columnist and former senior political advisor
Jane Horvath, Chief Privacy Officer, Apple
Pat Moran, Cybersecurity and Incident Response, PwC Ireland (Republic of)
2023 Webby award nominee for "Best Podcast" series
Lizzie O’Leary: Hello, and welcome to Take on Tomorrow, a new show from PwC’s management publication strategy+business. I’m Lizzie O’Leary, a journalist in New York.
Ayesha Hazarika: And I’m Ayesha Hazarika, columnist and former senior political advisor based in London.
Lizzie: And this is our first episode. In this podcast, we are looking at the biggest issues facing our economy, society, and planet, and asking what business could and should be doing to respond. All season long, business and government leaders, academics, and disruptive innovators will discuss the solutions to society’s biggest challenges. We’ll hear about what these issues mean for business and how you can respond.
Ayesha: Issues like climate change, AI, disaster-planning, cryptocurrencies, the future of the workplace, and more. We’re going to look at these issues by really talking to people in the thick of it.
Speaker 1: I’d say it’s less about, “there’s a problem with innovation,” and more that, in particular areas, we need breakthroughs faster, and more of them.
Speaker 2: So that PR-led strategy, where it’s just talk and there’s no action. Really, you’re getting all the downsides of the culture war and none of the upsides of actually making your business better.
Speaker 3: By the time you assess and do all that stuff, it’s about 72 hours before you start moving stuff. I’m like, “Well, if we know a bad hurricane hit, why don’t we just go?” And in a lot of cases, I found out, the private sector is already doing stuff, and we need to pay attention to them.
Jane Horvath: We don’t want our customers to be surprised. We want them to know what we’re doing with their data. So, we ask, ask, and ask again.
Lizzie: And we’ll be joined each week by a specialist from PwC to help us unpack our topics, too.
Today, we’ll be discussing data privacy with the chief privacy officer at Apple, Jane Horvath. We heard her just then talking about data. We’ll be discussing everything from what our iPhones are doing with our health data to ransomware attacks and more.
Seemingly, all of our gadgets, our phones, watches, laptops, are tracking our every move, from our route to work to where we go to buy our groceries. It feels like we’re in a world where everything we do has become completely transparent, whether we like it or not.
Ayesha: Data is such a powerful tool for a company but also a dangerous one. Think, for instance, about a company that collects your financial information, then misuses it or has it stolen. For companies dealing in private information, there are good outcomes but also really bad outcomes.
Lizzie: Data privacy and data security are closely linked—as Jane will tell us—and companies need to be thinking about both issues. That conversation with Jane is coming up, but now we’re joined by Pat Moran, head of cybersecurity and incident response for PwC Ireland. Hi, Pat.
Pat Moran: Hi Lizzie. How are you? Good to be here.
Lizzie: You are a specialist in the realm of cybersecurity and IT forensics. Can you tell us a little bit about what you do and how it relates to privacy?
Pat: Sure. So, yeah, I’ve got a really interesting job, I suppose. As I say to my mother, who asks me on a regular basis, “What is it, Pat, that you actually do every day?” And I say, “Well, I try and protect people from all the bad guys that are out there.” And she gets really interested in, and worried about, that. She thinks maybe I carry a gun and I’m fighting wars on different fronts.
But, actually, what we’re trying to do at the end of the day is, yes, we are trying to protect people from the bad guys.
Lizzie: Have the risks for privacy breaches or cybercrimes increased as we live in a world where more and more of our personal information, our health information, our personal data has been digitized? Because it feels like so much important stuff lives on networks, in clouds, et cetera.
Is that more fragile now?
Pat: The risks are definitely going up, Lizzie. Even a simple thing like the fact that we’re all working from home has made people more of a target. There are a lot more people being lured into clicking on links, because they’re expecting packages to be delivered at home, or because the software they’re using on their home laptop needs to be updated.
Ayesha: We’re going to come back to you shortly, but first, we’re going to hear Lizzie’s conversation with Jane Horvath. Jane has been chief privacy officer at Apple for ten years and brought decades of experience in both law and privacy to the role. She was the first-ever chief privacy counsel at the Department of Justice in the US, then global privacy counsel at Google.
As a lawyer and as a computer scientist, Jane was following data privacy long before it became the concern that it is today. Lizzie, you were able to get some time out of Jane’s incredibly busy schedule.
Lizzie: It was really interesting. Jane comes from an environment that pioneered privacy long before the public was thinking about it. Steve Jobs and Tim Cook set that precedent as a corporate value. And we spoke early on in the interview about how Apple designs products with privacy in mind, long before they get to the consumer.
For someone who’s never met you before, how would you describe your job?
Jane: I would describe my job as probably the best job in privacy. I work for a company that has a deep respect for privacy. Privacy is a fundamental human right at Apple, and it flows down from Tim Cook. Tim Cook is very committed, both personally and throughout the company, and that very much flows through to everything that we do.
And as someone who was an early starter in the privacy field, having executive support, and having the entire company supportive of the issue, makes this an incredible place to work. So, when I describe my job, I’m the luckiest privacy professional out there.
Lizzie: What does that look like in practice, to have it be so embedded in the company’s mission? Like, can you give me a tangible example of that?
Jane: Sure, I can give you an example. I don’t know if you wear a watch, but it was probably my third year (I’ve been at Apple for ten years) when I got a knock on my door one day: two engineers standing at the door, and they said, “We want to develop a wearable, and we’re very cognizant that wearables may impact privacy. So we want your team embedded from the very start of this, because we’re really focused on health data.”
And so, that is how it flows at Apple. When they have an idea, they bring the privacy professionals—not only my team, but we also have a team of privacy engineers that also embed in products.
Lizzie: So, what did your team find? Did you have moments where they said, “Oh, actually it’s tracking this, and we didn’t intend that to happen”?
Jane: No, we’re actually embedded from the very start. So, there’s no tracking; it’s built into the design itself. When we’re counseling, and when we’re looking at products, we’re working from four main privacy principles. These are what guide everything that we do.
The first is, when we’re working with the team, we’re trying to challenge them to collect as little information as possible. It doesn’t mean “don’t collect information,” but it means that, if you don’t need the data in a personal form (meaning tied to me, as Jane Horvath), you don’t need to collect it. So that’s the first principle. The second principle addresses the fact that, when a lot of people are interacting with companies, or interacting with companies in the cloud, the company knows everything that they’re doing with its products and services.
They’re collecting all the data; they’re tracking. They know what I’m doing and when I’m interacting. What we try to do instead is make your device, your iPhone, your iPad, really, really smart. It’s a very powerful computing device. So, your iPhone and iPad should know a lot about you. Apple doesn’t need to know a lot about you.
Lizzie: It’s been about a year since you all rolled out app-tracking transparency. And this is the idea that, if you have an app, you will be asked to opt in to your data being tracked when you move along to a third-party site. And I wonder what your data has told you about what most people have chosen to do.
Jane: Well, it probably won’t surprise you: we don’t actually track what people are doing and how they’re interacting with the prompts. So, we don’t know. I mean, I can read anecdotally through the press what people are saying, but, you know, I think the important thing here, again, it wasn’t that we were opposed to advertising.
It was that we noticed that there was a data industrial complex, for lack of a better word. It’s basically businesses that trade in your data.
Lizzie: To me, it seems Apple has staked out a relatively, I would say, outspoken position on privacy. You’re doing this interview. Tim Cook has obviously been quoted extensively talking about privacy.
I think of the 2015 San Bernardino shooting, and you all refused to unlock the shooter’s iPhone, and Silicon Valley and privacy advocates saw that as a win for privacy. And then other Americans thought you were protecting a terrorist. How do you square those different views?
Jane: You know, it’s interesting. For the most part, across the regulatory landscape, and I’m privileged enough to be able to work globally, the efforts that we’re making on privacy are generally very, very well received across the world. You raise the San Bernardino issue: that really wasn’t about us not wanting to work with the government. It was about the impact to security that would have come from what the government was asking us to do.
So, our commitment to privacy matters literally across the world. In talking with regulators, it’s clear that privacy is important to their citizens, and so they’re very supportive of the efforts that we’re making.
Lizzie: How do you, Jane Horvath, conduct your life online in this job? I wonder if that seeps into your consciousness. You mentioned that you’re a mom. Do you think about that a lot?
Jane: I absolutely do, and if you were to talk to my children, they’d probably roll their eyes, because they get Privacy 101 quite a bit.
Lizzie: What do you tell them?
Jane: Well, for me, the most important thing is: think before you share. We live in a world where, once you publish something, it’s very difficult to pull it back.
And, the unfortunate thing: I was talking to my daughter a couple of years ago. We were out walking and talking about college; she’s in high school. And she said, “You know, mom, when you went away to college, you could remake your life. You could become anything you wanted when you went away to college. For me, I’m so shaped by my social media profile that I’ll bring that to college with me. And people will be able to make judgments about me before they even meet me in school.”
And it made me really sad, you know, that kids feel this need to shape some image of themselves online and don’t have the potential to be something new.
Lizzie: Do you think that has filtered into the way you do your work?
Jane: I think it absolutely does. You know, particularly, we’ve built some really strong features for parents to use. We have family sharing. So, on your device, a family of up to six people can share purchases. And one of the best features is, for children under 13, a parent has to actually set up the account for the child to have an Apple ID, and a parent can set “Ask to buy.” So, on that account, any time the child wants to download an app, whether it’s free or not, the parent is queried about whether they want the child to get that app.
And there are a lot of other parental controls, which we call screen-time settings, that give parents a lot of opportunity to help protect their children online.
Lizzie: Your background is very interesting, if I may say so. I feel like there probably aren’t that many lawyers who were also computer science majors. Do you think that has changed the way you think about the regulatory environment, that you can actually write some code?
Jane: A hundred percent. You know, when I got my computer science degree, and when I was trying to decide where to go to graduate school, I was debating between getting a Ph.D. in computer science and going to law school. And, you know, the impetus for law school was, I understand the bits and bytes.
And I do worry, as we look at some of the laws that are being created, that we need to be very, very cautious and make sure that lawmakers are talking to people who understand the technology as they write regulations. Because, you know, I think a lot of lawmakers don’t understand the underlying technology. When I graduated from law school, privacy was not a profession, but I think I gravitated toward it because it is so integrated with the technology.
Lizzie: Often, when you talk to people in law enforcement, they put privacy and security on diametrically opposed axes. I know that you all would take issue with that, but how do you counter it when you encounter someone who says, “Well, we need to be able to see everything”?
Jane: I think the way I counter that, actually, is that, if you can see everything, so can everybody else. The technology is nonpartisan. So, the technology cannot differentiate between the good guys and the bad guys.
So, if you build a vulnerability in, or what we would call a backdoor—some special access for law enforcement—the technology doesn’t know whether it’s law enforcement that’s getting access to it or a hacker that’s getting access to it. And so, I think, increasingly, particularly when we’re looking at encryption and strong encryption, you’re seeing a lot of support, even from government officials, who recognize how important, and actually lifesaving, the ability to have secure communications is.
Lizzie: Do you feel like the sort of global understanding is moving toward a pro-encryption world?
Jane: I think it’s still a very vibrant debate right now.
Lizzie: That’s a very polite way of putting it.
Jane: Yes. I think it very much is critical that we maintain strong encryption.
Lizzie: Before I let you go, I wonder, what do you want someone who buys an Apple product to know they’re getting, when they do that, from what you do?
Jane: Sure. I would like them to know that the devices and the operating system are designed from the ground up to guide them toward a private experience. And Apple holds privacy as a fundamental human right; it’s one of our core values.
And I would like them to pay attention as they’re operating their devices. What I also like to practice is privacy hygiene. Once a month, go into Settings > Privacy and go through each of the settings to see which apps you have granted access to what. Even I sometimes go back through and think, “I did what? Why did I give this app access to this?” But you can always revisit your choices.
So, I think people get frustrated and feel, “Oh, my gosh, I’ve given up all my privacy. There’s nothing I can do about it.” But it’s never too late to take control of your data, and the older the data is, the less usable it becomes. So, if you start taking control of your privacy today, you’ve got it going forward.
Lizzie: Jane Horvath, thank you so much for talking with me.
Jane: Thank you.
Ayesha: We just heard from Jane how critically important privacy is to Apple. But you can’t talk about privacy without talking about security.
Lizzie: That connection between security and privacy—the idea that strong security underpins everything else—is something Pat Moran is also familiar with in his cybersecurity and incident response work for PwC Ireland…
Pat, what were your takeaway thoughts from that discussion with Jane?
Pat: Yeah, what really caught my attention in Jane’s conversation is the story she shared with us about the two engineers coming to her with the idea for their new product, and how they could build privacy into it from the start.
I thought that story spoke volumes, really, about where we are today. Many of the mainstream tech companies and, indeed, many blue-chip organizations recognize that privacy can actually be a strength: it can attract customers, retain customers, and demonstrate an organization’s culture and values as well.
Ayesha: Jane talked a lot about how to avoid privacy issues. Pat, you’ve been in lots of situations where there is an issue, and you’ve been brought in to help clients who have had significant data breaches or been victims of ransomware. What’s it like to be in the heat of the moment?
Because it must be a very stressful situation, because not only is it a terrible thing, but it’s something that’s beyond the comfort zone and knowledge of a lot of senior people.
Pat: Many of the organizations that have successfully got through it have said it was the worst period of their careers. The CEOs, the chief information officers—they’ve had a very difficult time.
In fact, we’ve had some people that have required hospitalization because of the long hours they’ve had to put in to try to stem the tide and recover as quickly as possible. Let me tell you, Ayesha, about an investigation that we did last year.
On the 11th of May, the hospitals within the Irish health system started getting messages saying they had fallen victim to a ransomware attack. Screens were quickly turning blue, and staff were quickly alerting management that the systems were no longer operable.
The chief information officer, the head of technology, got the call at about five o’clock that morning from one of his maternity hospitals to say this was happening. He went down very quickly, and he could see that the ransomware had been detonated onto the network itself and was quickly, like a virus, propagating across all the networks within the hospital service.
Now, that’s quite a big hospital service for Ireland; probably about 80 or 90 hospitals were impacted. The lights went off.
They had to take the decision to take all the servers down. We had patients in Ireland queuing up at the accident and emergency departments, with their details literally being taken down by pen and paper.
So, there was disarray across the organization, all because of an attack planted three months earlier that had allowed the perpetrators to sit on the network. They were able to do what’s called exfiltration, which simply means transferring the personal data of patients, and then claim that they were going to dump it live on a website, by a certain time, on a certain day, if the ransom wasn’t paid.
Lizzie: What do you say to a client that is in the aftermath of an attack, whether it’s a privacy breach or a ransomware attack? Can they rebuild in a way where their customers will have trust in them again?
Pat: The hospital example I gave you earlier—they came to us afterward and said, “We want you guys at PwC to do a report on how this happened, because we want to learn from it. We want to put that report publicly on our website, so other people can learn from it.”
And I think, as a result of that, they got very little criticism and a lot of respect. It is now a really good example of how you can become stronger as a result of a crisis.
Ayesha: And, Pat, that example about the hospital. For me, it’s horrifying. It’s a gripping example. How did they stop the attack? Did they—are you allowed to tell us—did they pay up in the end or, or did you manage to stop the attack?
Pat: So, they did stop the attack. They literally had to just unplug everything.
So it was that dramatic. They unplugged every network device that you could possibly think of—and they rebuilt everything from scratch.
Ayesha: I mean, that is absolutely chilling to me. And it’s very, very frightening, and it must provide such a headache to so many organizations. Do you have to prepare people for the fact that they might have to pay a significant ransom?
I mean, how many people do end up paying a ransom?
Pat: We’re finding that the number of organizations that pay ransoms is increasing all the time. We’re finding that companies are now taking out insurance in order to be able to pay the ransom. We’re finding that companies are setting up their own cryptocurrency wallets, their own accounts, to be able to trade with these guys.
It’s about when and not if. So, when they get in, when they disrupt, when they hold you to ransom, let’s get ready for that scenario. Because it’s too difficult to prevent now.
Ayesha: Well, Pat, this has been such an important subject and still a very new topic that we’re all getting our heads around. Thank you so much for giving us all your expertise and those brilliant insights.
Pat: You’re so welcome. Thanks for today.
Lizzie: It’s so scary to think that our personal details can just be weaponized in a way so that any company can be hit by an attack. Or you could innocently click on a link, and that ends up being the thing that steals your data. I am struck by how wide-ranging these attacks are and how mundane. On a very personal note, my husband’s tiny—sorry, honey—nonprofit theater company was the victim of a ransomware attack. It is not a global conglomerate. It is not a health system.
This is just happening all the time. And listening to Pat detail the scope of it was incredibly eye-opening for me.
Ayesha: Yeah, I totally agree with you. And listening to his story about Ireland, and the human toll it took on people, it was so frightening. And, as you say, the speed of it. It was so fast, and it could have been a matter of life or death, because it was a hospital.
Lizzie: What all of this makes my brain come back to is how interconnected privacy and security are. And the way our personal details exist in the internet can be weaponized against us or can just sort of make us feel compromised out there in cyberspace. It’s a very strange interplay of things that, frankly, I don’t spend enough time thinking about.
Ayesha: Absolutely. Me, too, Lizzie. Well, that’s it for this episode.
Join us next week, when we’ll be asking: how can we consistently deliver the breakthrough innovations needed to help solve humanity’s biggest problems?
Guest: I’d say it’s less about, “there’s a problem with innovation,” and more that, in particular areas, specifically health, we need breakthroughs faster, and more of them. And what I mean by breakthrough innovations: innovations that change the game, right? And, typically, those innovations require you to integrate, bring together, exponential trends from different areas and keep them together long enough to do that demonstration at convincing scale.
Lizzie: Take on Tomorrow is brought to you by PwC’s strategy+business.
© 2022 PwC. All rights reserved. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details.
This content is for general information purposes only, and should not be used as a substitute for consultation with professional advisors.