Privacy operations have been locked in intensive, costly cycles of readiness and compliance as regulations arrived in quick succession: the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and more than 2,500 laws governing data privacy worldwide. Compliance programs emerged piecemeal, one regulation at a time. Eighty-eight percent of global companies say that GDPR compliance alone costs their organization more than $1 million annually, while 40% spend more than $10 million.
The piecemeal, ad hoc approach is untenable.
First, privacy legislation and regulation show no signs of abating. More than 50 privacy and data protection bills are in the works in the US alone. China recently enacted two framework laws: the Personal Information Protection Law, which takes effect on November 1, 2021 (think of it as GDPR plus national security provisions), and the Data Security Law, which went into effect on September 1, 2021.
Second, the areas of responsibility for privacy professionals have been continually expanding beyond data inventory and mapping. Privacy impact assessments (PIAs) are now cornerstones of privacy programs, and they are increasingly required in new and emerging regulations. Consumers are increasing their privacy IQ and are more likely to exercise their rights. Many companies are already seeing a rise in data subject requests (DSRs) from consumers, and innovation in the development of privacy-enhancing technologies is coming at a rapid pace.
You need to get ahead of all this for the sake of your privacy team and of your business. But to do so, you need to check out of “compliance” mode and into a long-term, strategic, privacy-first approach that extends far beyond compliance checklists and audits.
Keeping personal data secure and private — and making sure that customers and shareholders know you’re protecting their information, and how — is critical to the success of any business. The good news is that if you’re doing privacy strategically and enlisting the entire company’s support, you’ll likely find compliance is a lot easier.
Meeting consumer demand is the first rule of business — and consumers want to know that their information is protected from unauthorized access and use, and that they are in control of granting that authority.
Companies are listening. Four of the largest tech companies in the US are now emphasizing privacy in their public relations communications, according to the International Association of Privacy Professionals. They’re talking about privacy either to stand out or to reassure users of their products and services. In other sectors, too, companies are putting privacy first.
Sharpening privacy operations should be an immediate priority, especially if your company’s growth and innovation depend on how well you protect the privacy and data of your customers and other stakeholders. Here are four privacy-critical areas to show what we mean.
“Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed,” said Apple CEO Tim Cook. “If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform.”
Marketing departments have long satisfied their appetite for data with cookies, bits of code that log the websites people visit and help marketers personalize the ads they see. Lately, though, cookies have fallen out of favor with privacy-savvy consumers. Apple’s Safari and Mozilla’s Firefox browsers already block third-party cookies, and now Google has announced it will follow suit, phasing them out by 2023. Proactive marketers are planning ahead and finding other ways of gathering the data they need.
Consumer identity and access management (CIAM) appears poised to take up where the soon-to-be-obsolete cookie leaves off. CIAM is an increasingly popular technology for not only security but also privacy. It authenticates and verifies user identities before allowing them to access your website or applications, then collects and processes, in real time, information about their online activities and preferences — all with their permission, of course.
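The consent-first collection that CIAM enables can be sketched in a few lines. This is a minimal illustration of the pattern, not any vendor's actual API: the class, purpose names, and fields are assumptions made for the example.

```python
# Minimal sketch of consent-gated, first-party data collection of the kind
# a CIAM platform enforces. Class, purposes, and field names are
# illustrative assumptions, not any vendor's actual API.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    authenticated: bool = False
    consents: set = field(default_factory=set)   # purposes the user opted into
    events: list = field(default_factory=list)   # activity collected so far

def record_event(profile: UserProfile, purpose: str, event: dict) -> bool:
    """Store an activity event only if the user is authenticated and has
    explicitly consented to this purpose; otherwise drop it."""
    if profile.authenticated and purpose in profile.consents:
        profile.events.append({"purpose": purpose, **event})
        return True
    return False

user = UserProfile(user_id="u-123", authenticated=True)
user.consents.add("personalization")
record_event(user, "personalization", {"page": "/pricing"})  # stored
record_event(user, "advertising", {"page": "/pricing"})      # dropped: no consent
```

The key design choice is that consent is granted per purpose and checked at write time, so data the user never authorized is simply never stored.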
What’s changing: Instead of relying on outmoded third-party cookies, develop a new customer data strategy that may collect more and better data at a lower cost, and that can assure your customers that they are in control of how you use their information.
“Our approach to privacy and data protection is grounded in our belief that customers own their own data,” said Satya Nadella, CEO of Microsoft. "Our privacy principles include a commitment to be transparent in our privacy practices, to offer meaningful privacy choices, and to responsibly manage the data we store and process."
Many companies are launching features that let them interact with customers and track what people do on their websites and apps, and that help them personalize the messages and offerings each customer receives. Consumers who have a good experience with a site are more likely to stay longer and buy more.
Companies most concerned with the “user experience” include business-to-business (B2B) enterprises beginning to offer products and services directly to consumers (D2C), and companies launching an application for the first time. These organizations would do well to take a proactive approach to privacy.
Proactive privacy means encoding privacy into your application or website from the very beginning, even while you’re prototyping ideas, and keeping privacy in mind at every step along the way to deployment and beyond. One way to make sure privacy stays top-of-mind is to embed checkpoints for privacy impact assessments (PIAs) throughout the product development life cycle — positioning you with your product teams to avoid last-minute roadblocks that could slow down your go-to-market and undercut business objectives.
The principle of “privacy by design, privacy by default” increasingly mandated by privacy laws requires the technical experience of privacy engineers. Working alongside product development teams, designers, IT and compliance/legal professionals, the privacy engineers implement appropriate technical and organizational measures such as pseudonymization and differential privacy.
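The two technical measures named above can be illustrated concretely. The sketch below works under simplifying assumptions: a single secret key held in code (real systems use a key vault) and a single counting query (real systems track a privacy budget across queries).

```python
# Illustrative sketch of pseudonymization (keyed hashing) and differential
# privacy (the Laplace mechanism). Simplifying assumptions: the key lives in
# code, and no privacy budget is tracked across queries.
import hashlib
import hmac
import math
import random

SECRET_KEY = b"example-key-rotate-in-production"  # assumption, not real practice

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    Pseudonymous, not anonymous: a key holder can rebuild the mapping."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def dp_count(true_count: int, epsilon: float = 1.0) -> float:
    """Release a count with Laplace noise of scale 1/epsilon; the
    sensitivity of a counting query is 1."""
    u = random.random() - 0.5                      # uniform in [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    noise = -(1.0 / epsilon) * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Smaller epsilon means more noise and stronger privacy; choosing it, and accounting for it across repeated queries, is where privacy engineers earn their keep.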
One caveat: Be on the lookout for dark patterns in your design that have the effect of subverting or impairing user autonomy, decision-making or choice. The newly enacted Colorado Privacy Act as well as the California Privacy Rights Act state that user consent obtained through the use of dark patterns is not valid. Privacy-conscious product developers avoid dark patterns that steer users into making certain choices, restrict the number of choices, or obscure essential information.
Consumers generally have little transparency into or control over what happens to their data once an entity has collected it. Only half are even aware of their device controls over data collection, according to a recent Consumers International survey. But enterprises can win customer trust by volunteering information such as “data used to track you,” “data linked to you” and “data not linked to you” clearly and simply. This growing trend, akin to privacy nutrition labels, provides consumers more confidence and helps them make better decisions when sharing their data with companies.
What’s changing: Include your privacy team, especially the privacy engineers, in product development meetings from the earliest stages to make sure that data privacy gets its due, and that the design does not manipulate users or trick them into giving you data that they may not be comfortable providing.
“Privacy cannot be a luxury good offered only to people who can afford to buy premium products and services,” said Sundar Pichai, the CEO of Google and parent company Alphabet. “Privacy must be equally available to everyone in the world.”
Intelligent automation like AI can improve innovation and help you create products highly desired by consumers, but the technology brings inherent data privacy risks. In the EU, officials recently proposed banning or tightening controls over live face scanning and other AI technologies that could endanger people’s safety or violate their rights.
AI “magnifies the ability to use personal information in ways that can intrude on privacy interests by raising analysis of personal information to new levels of power and speed.” If you don’t properly govern your use of AI and inform your teams about what’s legal and what isn’t, your company could be at risk of breaking the law and losing consumer trust. For better governance, bring your data-science teams and other developers together with the functions that acquire AI solutions, such as operations and human resources, and with core teams, including privacy.
Use an accountability approach that rests on these four pillars: transparency (disclosing how you use algorithms in decision making), explainability (providing retroactive information on how algorithms were used in specific decisions), risk assessment (evaluating and mitigating risks in advance using PIAs and other analyses), and audits (evaluating privacy practices after the fact to improve your program).
What’s changing: Using AI responsibly means continuously self-monitoring to help identify potential bias and new sources of risk, and to always be adapting. Build frameworks and toolkits to assess your AI models to make sure that they are fair and ethical and that you can explain how they work.
“Even before the pandemic, data governance was emerging as a critical ESG risk when evaluating investments in the technology and communication sectors,” said Martin Jarzebowski of Federated Hermes. “As industries evolve, whatever their state, ESG evolves with it, unearthing specific vulnerabilities vital to stakeholders and, as such, enterprise itself.”
Leading executives, intent on increasing investor trust, see the linkage between transparency and ethical use of data as a means to further drive ROI from their ESG initiatives. Forty-two percent of the US respondents to PwC’s 2021 CEO survey ranked cyber and data privacy second among 11 areas in which they said more should be done to measure threats.
In fact, privacy and data security is an ESG investment priority — a priority equal to, if not bigger than, climate change, according to PwC’s survey on ESG practices and attitudes.
Investors increasingly include data security and privacy factors in their analysis of material risks and growth opportunities. ESG assets are on track to exceed $50 trillion by 2025, more than a third of the projected $140.5 trillion in total global assets under management, according to Bloomberg Intelligence’s (BI) latest ESG 2021 Midyear Outlook report.
When it comes to how data privacy programs are rated, ESG rating agencies are looking for demonstrable accountability. They systematically comb the public space for reliable evidence that companies are meeting the agencies’ privacy expectations. According to one rating system, the Morgan Stanley Capital International (MSCI) Industry Materiality Map, data security and privacy account for a considerable share of overall ESG ratings in certain sectors, including Internet and direct retail marketing (29%), communication services (24%), education services (18.8%), hotels (15%), financials (10.3%) and info tech (9.9%). The greater the transparency a company provides in these heavily weighted domains, the more its ESG score improves.
What’s changing: The more details a company is able to attest to publicly about its privacy and security programs, the higher its scores are likely to be across the various ESG agencies. The SEC is expected to soon create formal regulations on ESG disclosures, including those related to cyber and privacy.
Counterintuitive though it might seem, now is not the time to shift your privacy program into overdrive. Now is the time to take a step back and view the big privacy picture. Then, having gained the lay of the land, you can start to make big plans that will position your organization as a privacy leader tomorrow as well as today.
Take stock of your current state and reset your privacy strategy to align with your broad data and business strategies. Determine what it will take to get there, including rethinking your privacy operations model and underlying resources/talent.
Tie your privacy strategy to your data trust strategy. Evaluate the four capabilities defined by our data trust framework: how well a company governs, discovers, protects and minimizes the data it holds. Data governance is the process, data trust is the outcome: data that decision makers can rely on, data use that is ethical, safe and trustworthy.
Today, no more than half of organizations are mature in their data trust practices — that is, they have formalized processes and have fully implemented them — according to PwC’s April 2021 US Digital Trust Insights snapshot survey.
Gain a broader perspective, sooner rather than later, on the tasks and challenges faced by business partners — including marketing, data governance, data analytics, IT and cybersecurity. You can better understand which parts of the business strategy they’re driving and where you fit. Bring together teams from across the enterprise to connect the dots on the most urgent opportunities requiring privacy support, including quick wins attainable with relatively little effort. Many companies are chasing similar objectives, including richer digital/mobile experiences and first-party data strategies, so now is the time to move forward on opportunities to deploy privacy initiatives.
Resetting your privacy strategy to improve its effectiveness means building a privacy program in which your business strategy, your program strategy and your resource strategy dovetail with each other. Synchronization of all three is the ideal future state.
Much of the work that your privacy teams are doing now could likely be handled more efficiently and even more effectively by digital technologies. AI can perform an increasing number of tasks, especially those that your teams do repeatedly such as privacy impact assessments, answering rights requests from individuals (also known as data subject requests, or DSRs) and data mapping. Automation makes it easier to get, and stay, in compliance with existing and new laws and requirements.
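The repetitive tasks above lend themselves to straightforward automation. The following is a hedged sketch of automated DSR intake: it routes each request type to a workflow and computes a response deadline. The handler names and the 30-day window are assumptions for illustration; statutory windows vary (for example, 45 days under CCPA, one month under GDPR).

```python
# Hedged sketch of automated data subject request (DSR) triage. Handlers
# and the 30-day deadline are illustrative assumptions; real statutory
# response windows vary by law.
from datetime import date, timedelta

HANDLERS = {
    "access": lambda uid: f"compile data export for {uid}",
    "deletion": lambda uid: f"queue erasure workflow for {uid}",
    "correction": lambda uid: f"open correction ticket for {uid}",
}

def triage_dsr(request_type: str, user_id: str, received: date) -> dict:
    """Dispatch a DSR to the matching workflow and compute its due date."""
    if request_type not in HANDLERS:
        raise ValueError(f"unsupported request type: {request_type}")
    return {
        "action": HANDLERS[request_type](user_id),
        "due": received + timedelta(days=30),  # assumption: 30-day window
    }

result = triage_dsr("deletion", "u-123", date(2021, 9, 1))
```

Even this trivial router removes the manual sorting step; a production version would plug each handler into the systems identified by your data inventory and mapping.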
Now that you’ve revamped your privacy strategy, your executive team must determine which skills your privacy team will need to accomplish your organization’s goals, and how to derive the most value from your staff. Here are three models from which to choose.
- Execute ongoing privacy program monitoring and control reviews across the organization to identify and escalate issues
- Facilitate end-to-end completion of privacy impact assessments (PIAs)
- Facilitate periodic reviews of internal policies and notices with your key stakeholders to validate they remain aligned with privacy practices and legal requirements
- Triage with information security stakeholders to complete ongoing data protection assessments
- Monitor results of data inventory and PIA activities to identify and escalate new instances of cross-border transfer
- Serve as liaison for the privacy office to document and resolve privacy-related incidents and breaches
- Create and/or maintain inventories and mappings of systems and PI processing activities
- Facilitate ongoing completion of third-party privacy due diligence, periodic monitoring and assessments, and contract remediation efforts
- Monitor individual rights request (IRR) intake queues and triage responses
- Facilitate the ongoing privacy training and awareness program and compliance monitoring
PwC colleague Joe DeMarzio contributed to the research and writing of this article.