Ten digital trust challenges

Norbert Schwieters | Global EU&R Leader, Global Power & Utilities Leader | Dec 04, 2015

Three pillars of digital trust

We live in an era where technology is changing our lives at a rate that every part of society is struggling to keep up with. Technology creates enormous opportunities and new risks alike. Many people are worried about their privacy; companies are seeing their business models come under attack from new digital entrants and their corporate behaviour magnified in real time through social media; and governments are facing diminished control as new forms of communication empower their citizens – to mention only a few challenges.

It’s not just the challenge of keeping up to speed with technological developments. It’s whether and how the current design of our private and public institutions needs to adapt to cope with these changes and to restore the trust of society – digital trust. In June, to mark the 800th anniversary of the Magna Carta, we looked at how institutions (organised and purposeful interactions of people based on contract, law or culture) must create and maintain trust through legitimacy, effectiveness and transparency, and how global megatrends like technology are driving the need for a design transformation and a bold new charter for our digital world.

Here we take a look at 10 digital trust issues that, in our view, institutions must grow new capabilities to address. These issues centre on the ethics and control of data access and use, interaction through the Internet, digital risk resilience and value creation in the digital age. The emergent and interconnected nature of these issues – and the regulatory response to them – highlights the challenges organisations face. How – and how quickly – will they have to adapt their design to create trust against this backdrop of digital transformation? We consider a few questions below in the context of the three pillars of a trustworthy institution.

Legitimacy

  • How can organisations better align with their stakeholders’ interests?
  • Which stakeholders will be affected and how should they be involved?
  • Should organisations follow the letter of the law or be guided by ethics-based principles?


Effectiveness

  • With the immediacy of social media, how can real-time accountability be achieved?
  • Will new public institutions have to be created in order to ensure digital trust?


Transparency

  • With so many digital processes operating at a level we can’t see or understand, what level of transparency is needed?
  • How will institutions need to monitor and report on digital trust, both internally and externally?
  • Is auditing required and how should it be done?

In this brave new world, organisations must fundamentally rethink how they create value. The explosion of digital data and the new and innovative ways it’s being leveraged have huge implications for the economic value chain. In a world of (near) zero marginal cost of production and rising relative value of intellectual property, classical cost-based business models and the implications for income generation and distribution, in short social welfare, need to be re-examined.


Data ethics and control

(1) Data privacy and usage – getting the balance right

To what extent should data be gathered, analysed and used? The digital revolution has given companies the technologies to collect enormous amounts of data about consumers and employees. It’s also provided the analytical capabilities to create new knowledge and insights, affecting how this data is used. The public has been slow to realise just how much personal information companies are collecting and using, but as people become more informed, managing and protecting privacy and data use will become an important barometer of how much companies are trusted by their wider stakeholders.

Is it sufficient to legitimise data access and analysis through the legal terms and conditions of a customer application for goods or services, for example? What level of transparency should the customer be entitled to? With personal information now an economic commodity in its own right – and potentially the solution to some of society’s greatest scourges, like disease – how can we achieve an acceptable balance between individual privacy rights and the wider interests of society? The ethical use of data is another issue that needs to be addressed – and something that regulators are increasingly calling into question. And what measures are needed to educate the public on the risks and benefits of technology, and on the actions individuals can take to manage their technology footprint? As the privacy debate is reframed by digital disruption, organisations must consider in which direction, and to what extent, to evolve their information governance strategies and capabilities, and ensure that the mechanisms chosen to balance competing interests generate trust and confidence among their stakeholders and the general public.

For more information, please contact:

David Burg | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (703) 918-1067 | Email

Peter Cullen | Privacy Innovation Strategist, PwC US | Email

Stewart Room | Global Leader, Cyber Security and Data Protection, PwC Legal (UK) | Tel: +44 (0) 20 7213 4306 | Email

Grant Waterfall | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (646) 471-7779 | Email


(2) The new ethics of people data

People analytics is allowing organisations to attract, develop and retain key talent by drawing on HR data. This is turning talent management into a much more scientific, fact-based function than in the past, and one more closely linked to business strategy and performance. But while technology has completely altered the operating environment, rules of behaviour are still playing catch-up. With so much data about employee performance and actions available to organisations, what governance guidelines and processes are needed to ensure ethical monitoring and data use? Are radically new policies needed to manage trust issues around managerial use of information, ensuring and enforcing legitimacy, compliance and transparency?


Take wearable technology, for example, which can exponentially increase the amount of information available about employees. How much of this information is necessary or acceptable? Or consider how difficult it is nowadays for employees to keep their personal lives separate from their working existence: employers are using personal information shared via social networks to make character assessments when hiring and to evaluate the risks existing employees pose to the organisation. Moreover, as an increasing amount of personal information becomes accessible, cyber threats can turn into personal threats for employees, such as blackmail. Employers need to be careful both about protecting data and about choosing what data they capture – and to be aware that there are demographic and cultural differences in what data employees are willing to hand over.

For more information, please contact:

Jon Andrews | Joint Global Leader, People and Organisation, PwC UK | Tel: +44 (0) 20 7804 9000 | Email

Jon Williams | Joint Global Leader, People and Organisation, PwC Australia | Tel: +61 (2) 8266 2402 | Email


(3) Predictions and profiling of people – what are the limits?

Using big data to predict behaviour and profile individuals offers obvious business benefits – for example, in segmenting customers and predicting their spending habits, in strategic workforce planning or in preventing fraud. But these techniques can cross the line when it comes to individual rights and privacy. Retailers, for example, can map individual customer information to patterns they’ve detected in their vast data stores to derive new information about shoppers that may make them uncomfortable or – justifiably or not – trigger countervailing measures the customer isn’t aware of.

Profiling methods also open the door to discrimination and to punishment that may be unfair. If we’re able to predict behaviour with a high degree of accuracy, should we act before that behaviour is exhibited? Should the movement of individuals with certain profiles, for example, be restricted even before they commit a crime? How should we balance these considerations in the interests of national security when threats are growing across the globe, and where should the line be drawn between data privacy and safety?

While obvious areas for debate include policing, national security and immigration, these questions also reach into the workplace. Consider employers who use predictive analytics to anticipate future risks and shape policies that could have punitive effects on specific individuals or groups. Or employees who are given wearable technology to encourage fitness and wellbeing in the workplace. Should they trust that their employers are genuinely looking after their health, or worry that digital health profiling could leave them out of pocket for health insurance costs – or even out of a job if they don’t conform to certain standards of health and fitness? Where should profiling techniques legitimately be used, how transparent should they be, and what checks and balances are needed – keeping in mind different cultures, outlooks, traditions, expectations and laws?

For more information, please contact:

Paul Blase | Global and US Data and Analytics Advisory Leader, PwC US | Tel: +1 (312) 298-4310 | Email

Anand Rao | Business Analytics Innovation Leader, PwC US | Tel: +1 (617) 530-4691 | Email


(4) Algorithmic regulation – seeing through the black box

More and more of our daily lives are now governed by automated rules – algorithms designed to achieve particular outcomes, constantly adjusted as new information comes in. Think of ‘sat navs’, credit card approvals or websites that suggest what content you might like to consume. In the digital world, code is the new law. But unlike traditional laws, there isn’t the same degree of legitimacy, let alone transparency. How can we trust something when we don’t understand how it works? The enormous power of algorithms brings with it a need for transparency and accountability – and the level of scrutiny that organisations that wield such power will come under from their stakeholders is only going to increase.

Without transparency, how, for example, can we be sure that decisions are based on causal relationships rather than mere correlation? Or consider the blockchain public ledger technology that underlies cryptocurrencies like bitcoin: the technology creates data that, by the nature of its origin, can be trusted – but is trust based only on effectiveness enough? Or take the sharing economy. Its new business models enable better use of existing capacity – but they have also fuelled debates, in particular the clash between the legal regulations that bind existing service providers and the algorithmic regulations (based, for instance, on feedback loops) used in these new digital service offerings. How can trust and reputation, so vital to the sharing economy, be maintained when some users are wary about how these rules are defined and implemented? Transparency is also important from the viewpoint of governance, risk and compliance: to what extent can we trust machines to control machines? Should we regulate algorithms with algorithms, or have humans do the job – and do we have people with the right technical skills, in sufficient numbers, to regulate quickly and effectively? This is particularly relevant given, for instance, the rapid development of flash trading in the financial markets.

For more information, please contact:

Paul Blase | Global and US Data and Analytics Advisory Leader, PwC US | Tel: +1 (312) 298-4310 | Email

Anand Rao | Business Analytics Innovation Leader, PwC US | Tel: +1 (617) 530-4691 | Email

David Burg | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (703) 918-1067 | Email

Stewart Room | Global Leader, Cyber Security and Data Protection, PwC Legal (UK) | Tel: +44 (0) 20 7213 4306 | Email

Grant Waterfall | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (646) 471-7779 | Email


(5) Building safeguards around artificial intelligence

The increasing use of data and algorithms to automate processes and drive decision-making in business has led to greater efficiency and accuracy across a range of business functions and sectors. And with the creation of algorithms that can learn and make predictions from data, independent of any human input, machines are becoming smarter. But how can we ensure that our algorithms don’t go ‘rogue’ and take over? To what extent do we need to understand these algorithms and – ultimately – have ‘control’ or ‘emergency’ buttons to wrest control back from machines?

The artificial intelligence (AI) debate shows no sign of levelling off, with widely differing views on when computers will have intelligent behaviour that’s equivalent to that of humans, or the extent to which AI is an existential threat to the very future of the human race. But one thing is clear: the ethical and trust implications of AI must be explored. One crucial aspect will involve better integration of human and machine collaborations in the workplace, to oversee and maximise the efficiency and effectiveness of algorithmic processes. Governance frameworks will also need to be assessed to ensure a robust system of checks and balances. While many societies are increasingly comfortable with the growing role of machines as an essential tool to improve everyday life, such measures will be vital in building a greater degree of trust.

For more information, please contact:

Paul Blase | Global and US Data and Analytics Advisory Leader, PwC US | Tel: +1 (312) 298-4310 | Email

Anand Rao | Business Analytics Innovation Leader, PwC US | Tel: +1 (617) 530-4691 | Email


Interaction through the Internet

(6) Governing the Internet without breaking it up

The Internet connects people and harnesses information to an extent never seen before, but today it’s unrecognisable from the egalitarian open exchange of information envisaged just half a century ago. Take the so-called Net Neutrality debate in the US – pitting the major Internet Service Providers, who want to create a premium-priced superfast Internet for business, against consumer rights advocates, who argue that there should be just one, equal Internet for all users. In the international arena, an increasing number of countries have enacted more restrictive laws governing web access, whether for security or privacy reasons. With today’s business world characterised by a global digital presence and data flows, many more companies are now affected by laws outside even their major markets. Contrast these developments, however, with the Digital Single Market Strategy proposed in Europe, which aims to bring down regulatory barriers.

The Internet has also become an important source of corporate and government power and influence, as evidenced by the extent to which they handle and manage our data – raising serious questions for wider society about the transparency, ethics and legitimacy of Internet usage in the future. What rules need to be established – and who needs to establish them – in order to govern the gatekeepers of our digital information and those who control how we access the Internet? And how should regulatory balance be achieved to address growing concern about the fragmentation of the Internet as more countries set up their own structures, safeguards and barriers?

For more information, please contact:

David Burg | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (703) 918-1067 | Email

Mohamed Kande | Global Leader, Advisory, Technology, Information, Communications and Entertainment, PwC US | Tel: +1 (202) 756-1700 | Email


(7) Cyber security and citizen privacy

Cyber security is only going to grow in importance as we shift more core societal functions online and connect more of our physical life to the Internet. On a micro level, the risk could come from hackers taking control of our connected cars, to give just one recent example. On the macro level, there’s the spectre of global cyber warfare targeting crucial energy or security infrastructure, commercial assets or mass transport – with businesses caught in the crossfire of rising geopolitical tensions. In the absence of international conventions governing this type of warfare, what happens when a cyber weapon is deployed?

As cyber security threats increase it’s likely, if history is any indicator, that governments will seek to access, monitor and control even more of our personal data. The same will no doubt be true of companies as they look to protect their vast databases of consumer and employee information along with their systems and digital operations from hackers bent on corporate espionage or even sabotage. Should encryption of information be prohibited in certain cases? Or should there be limits to what data can be accessed? What measures are needed to protect institutions against leaks, losses and manipulation of data that take into account concerns about privacy and ethics? What do governments and companies need to do in order to regain the trust of an increasingly sceptical public that’s demanding more control over their data?

For more information, please contact:

David Burg | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (703) 918-1067 | Email

Peter Cullen | Privacy Innovation Strategist, PwC US | Email

Stewart Room | Global Leader, Cyber Security and Data Protection, PwC Legal (UK) | Tel: +44 (0) 20 7213 4306 | Email

Grant Waterfall | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (646) 471-7779 | Email


Digital resilience

(8) Resilience in the face of digital disruptions

Our reliance on technology for day-to-day activities has skyrocketed: we check into flights online, access hotel rooms with our mobiles and bank over the Internet. We don’t really think about it – until something goes wrong. In the past, companies could be forgiven the occasional system interruption. Not anymore. With so much of our lives dependent on technology, interruptions can be disruptive and expensive. Businesses across sectors have been shaken by the impact of several high-profile IT outages and glitches, eroding digital trust. The ability to “keep the lights on” is becoming a business priority. But organisations are challenged by the sheer complexity of their IT services, with layers of infrastructure and multiple service and process interdependencies. They’re also under pressure to deliver with ever greater speed at lower cost – and to spend more time and resources on growth and innovation. And they face increasingly frequent and sophisticated cyber attacks. How can companies gain a holistic view of the IT landscape and all its risks, and design the right governance and risk management processes to manage them better?

Preparedness for threats and disruptions is one aspect of digital resilience. But the speed and effectiveness of response to a crisis, and the ability to manage the recovery process are just as vital. Consider the power of social media, where an overnight tweet in Australia, for example, could cause a product crisis in Brazil. How can companies ensure they can react optimally to events even as they’re unfolding? Their ability to do so will greatly influence their level of trust and ultimately their resilience in an ever-changing world.


For more information, please contact:

David Burg | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (703) 918-1067 | Email

Peter Cullen | Privacy Innovation Strategist, PwC US | Email

Stewart Room | Global Leader, Cyber Security and Data Protection, PwC Legal (UK) | Tel: +44 (0) 20 7213 4306 | Email

Grant Waterfall | Joint Global Leader, Cyber security and Privacy, PwC US | Tel: +1 (646) 471-7779 | Email


Value creation in the digital age

(9) Redefining ownership in a virtual world

Digitalisation is transforming how we think about ownership. As value increasingly shifts from physical to virtual, grey areas are opening up around property rights and where value is created, increasing the scope for disputes and eroding trust in the process. Take the digital distribution of works, for example. When providers distribute physically, the customer, as owner, is free to use the products how and when they like. But with the shift to digital distribution, there’s the possibility that customer rights could be reduced or revoked over time. Or take personal data. Who owns it – companies or consumers? Might personal data “vaults” – which individuals own and manage as a transactional currency – create a business model for realising value, as well as bolster trust and accountability in the way we share data? Cloud storage also poses new questions about property rights. For example, what restrictions should there be on the ability of cloud service providers to share information with third parties? Should the location of the cloud server determine which laws apply?

The issue of where value is created also has large implications for global tax strategies, as corporate assets become increasingly digitalised. Accusations that companies aren’t paying the right amount of tax in the right places can quickly erode trust, even if the companies are compliant with national tax laws. While proposed reform of international tax rules can help address the issue, perhaps the more immediate question is whether business simply wants to be trusted to comply with regulation, or to be trusted to make decisions based on principles and values. How can companies better understand the wider impact of their tax decisions? Should they voluntarily take the lead in tax transparency? How can businesses and governments work together to create workable standards for a global economy – including digital business?


For more information, please contact:

Mary Monfries | Tax Partner, PwC UK | Tel: +44 (0)20 7212 7927 | Email

Dr. Jan-Peter Ohrtmann | PwC Legal (Germany) | Tel: +49 211 981-2572 | Email

Stewart Room | Global Leader, Cyber Security and Data Protection, PwC Legal (UK) | Tel: +44 (0) 20 7213 4306 | Email

Rick Stamm | Vice Chairman, Global Tax, PwC US | Tel: +1 646 471-1035 | Email


(10) Managing the dark side of workplace automation

There’s no doubt that automation in the workplace has increased productivity, improved product quality and reduced workplace accidents. It’s also creating jobs – both directly, in the provision of such technologies and through the redeployment of workers, and indirectly, through the non-tech roles required to support and service this growing trend, and the economic stimulation that increased productivity drives. At the same time, however, increased automation will continue to eliminate certain jobs. We’ve seen this in the manufacturing and logistics sectors, and will next see it sweep through white-collar sectors like accounting and healthcare. If not managed effectively, the negative impacts of automation on employment levels and disposable incomes could affect economic growth, as well as the very fabric of society.

As our digital society continues to evolve, business and government will have to win back the trust of sections of society that feel marginalised by a new type of digital divide. Will it be enough to retrain workers? Should we see education as a lifelong process, with funded schooling for all, so that workers can continually adapt their skills to the changing needs of the economy as technology advances? A key difference between the previous agricultural and industrial revolutions and the coming artificial intelligence revolution is that jobs will be displaced much faster than humans can reskill. But what can we learn from history about shaping new laws, charters and movements to adapt to the social chaos wrought by innovation, as we consider an age where work (as we have known it) may no longer be the norm?


For more information, please contact:

Jon Andrews | Joint Global Leader, People and Organisation, PwC UK | Tel: +44 (0) 20 7804 9000 | Email

Jon Williams | Joint Global Leader, People and Organisation, PwC Australia | Tel: +61 (2) 8266 2402 | Email


Restoring trust in the digital age

Against the backdrop of these digital trust issues, institutions are challenged to adapt their governance and organisational design both rapidly and effectively, or face risks to their long-term sustainability as stakeholder trust levels erode.

However, there isn’t a one-size-fits-all solution across the globe. An institution can achieve an overall level of digital trust with varying combinations of the degree to which the three pillars of legitimacy, effectiveness and transparency are realised. The political, societal and cultural background of its environment plays a crucial role here.

Organisations – private and public – will have to become comfortable dealing with grey areas and a constantly shifting landscape. They’ll have to find the right balance between how they obtain and use data and the social consequences of those actions – and accept that the line is always moving. They’ll have to build a great deal of flexibility into data strategies and models. And they’ll have to create a foundation of values to inform compliance, governance and ethical decision-making. Which brings us back to the need for a design transformation.

By its very nature, digital technology – like its siblings biotech and nanotech – has set out to create a new dimension of intelligence, undermining like never before the control humans have traditionally had over technology. Beyond the need to rapidly increase our capability to understand the impact of technology, the question arises as to whether the catalogue of human rights needs to be amended in order to restore control and form the constitutional basis for digital trust. As we move into the ‘machine age’, this is at the core of a new Magna Carta for the Digital Age.

While technology is transforming how institutions relate to their stakeholders, digital trust is but one aspect of trust. In the coming months we’ll look further into how organisations can take concrete action to embed changes in their design and achieve greater levels of overall trustworthiness. These efforts will centre on measuring trust and the trust impact of incremental changes in institutional design, forming the basis for informed investments in trust.

Contact us

Norbert Schwieters
Global Leader, Consumer and Industrial Products & Services, PwC Germany
Tel: +49 211 981 2153
Email
