Third parties are of particular concern regarding data security and privacy
When it comes to making use of their data, few companies are able to go it alone. They may have limited internal capabilities, lack data science talent on staff, require contract services to store or send data, or rely almost entirely on contractors to conduct research on their behalf (see Figure 5 above).
These relationships help drive efficiencies, but they also leave data open to breaches.[ix] Just as banks require their vendors to meet specific security compliance measures, companies should require, and audit, vendor adherence to security standards. Contracts should also obligate vendors to report any data breach promptly, and companies should keep cybersecurity firms on retainer to respond if a breach occurs.
Know that privacy standards vary for business partners
Some nontraditional health companies and new entrants, such as companies offering patient portals, are not bound by the privacy standards outlined in the Health Insurance Portability and Accountability Act (HIPAA).[x] As a result, individuals’ data may be used without their consent, for marketing and other purposes.
To gain consumer trust, health companies also must evaluate the effectiveness of their data de-identification processes to reduce or eliminate the risk of the data being re-identified.[xi]
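One common, simple way to gauge re-identification risk is k-anonymity: measuring how many records share the same combination of quasi-identifiers (fields such as ZIP code, birth year, and sex that could be cross-referenced with outside data). The sketch below is a hypothetical illustration, not a description of any particular company's de-identification process; the field names and data are invented.

```python
from collections import Counter

def min_k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest group of records sharing the same
    combination of quasi-identifier values. A low value means some
    individuals are nearly unique and therefore easier to re-identify."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Toy example: two records share quasi-identifiers, one is unique.
records = [
    {"zip": "11201", "birth_year": 1980, "sex": "F"},
    {"zip": "11201", "birth_year": 1980, "sex": "F"},
    {"zip": "11201", "birth_year": 1955, "sex": "M"},
]

print(min_k_anonymity(records, ["zip", "birth_year", "sex"]))  # prints 1
```

A result of 1 means at least one individual's quasi-identifier combination is unique in the dataset, the situation de-identification processes aim to avoid; generalizing values (e.g., truncating ZIP codes or bucketing birth years) raises k.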
Watch out for risks related to data representation
When asked during PwC’s 180 Health Forum about the extent to which emerging technologies such as genetic testing have penetrated healthcare, Ellen Jorgensen, co-founder and chief science officer at Brooklyn, New York-based Carverr, cautioned that they have reached only certain populations. “There’s already a certain built-in inequity going on; part related to money, part to trust,” Jorgensen said. “A lot of these great technologies are coming, but they aren’t coming in the right way for some people.”[xii]
The risks of poor data representation can be magnified as datasets are fed into AI projects. As the data economy grows, inequities of the present could extend into the future if engineers and companies fail to find datasets that are as diverse as the population that will eventually use the tools. “If you want your algorithm to work well on a general population, for example, you’ll want an equally diverse mix of people in your research,” according to an article in Wired exploring payments to consumers for their health data. “[The] value for someone from a group often left out of clinical studies — say, women of color — might be relatively high in some cases. White men, who are often overrepresented in datasets, could be valued less.”[xiii]
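As a minimal sketch of the representativeness check described above, one could compare each group's share of a training dataset against its share of the target population. The group names, sample, and reference shares below are invented for illustration only.

```python
from collections import Counter

def representation_gap(labels, population_share):
    """Compare each group's share of the dataset to its share of the
    target population. Returns {group: dataset_share - population_share};
    positive values indicate overrepresentation, negative underrepresentation."""
    counts = Counter(labels)
    total = len(labels)
    return {group: counts.get(group, 0) / total - share
            for group, share in population_share.items()}

# Toy sample of 100 records, compared against assumed population shares.
sample = ["white_men"] * 60 + ["women_of_color"] * 5 + ["other"] * 35
gaps = representation_gap(sample, {"white_men": 0.30,
                                   "women_of_color": 0.20,
                                   "other": 0.50})
print(gaps)  # white_men overrepresented (+0.30), women_of_color underrepresented (-0.15)
```

A check like this only surfaces gaps along the attributes measured; closing them still requires sourcing data from the underrepresented groups themselves.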