February’s Wall Street Journal report pulled back the curtain on just how much is at stake when individuals share their personal health information with health and fitness applications. Several of these apps were (perhaps unwittingly) sharing users’ personal health information via a Facebook SDK that was automatically feeding that data to the platform. In one fell swoop, multiple companies damaged trust with their users — perhaps irrevocably.
But the dangers in digital health aren’t limited to rogue SDKs; three days after the Facebook news broke, yet another large health system announced that the personal information of more than 325,000 patients had been exposed. All this comes as big tech companies like Apple, IBM and Amazon begin to enter the same space, with plans for huge impact. But even these well-established names enter healthcare with a trust deficit; Rock Health’s 2018 National Consumer Health Survey found that just 11 percent of respondents said they’d be willing to share health data with tech companies.
As we move toward an increasingly digitized world of healthcare — and as early-stage companies and tech behemoths operate alongside one another in the space — how can all involved uphold their responsibilities, follow relevant laws and regulations, and maintain the trust of patients and users when it comes to privacy? Companies operating under the highest standards in healthcare are expressly prohibited from monetizing users’ data; how will large tech brand names adapt their business models to act properly?
In order for the promise of digital health to be realized, companies will need to ensure their patients’ data is safe, secure and error-free. Beyond security, healthcare companies operating as providers must also maintain the confidentiality and privacy of that data. Doing so isn’t simply good practice; it’s an existential requirement for companies operating in this space. There is a baseline expectation — from users, and from employers and health plans working with digital health companies — of privacy being maintained.
The success of digital health companies will hinge on whether patients feel comfortable sharing the most intimate data they possess — their protected health information (PHI) — especially when they worry that data could impact their employment. Below are three things digital health companies would do well to keep in mind as they operate in the space.
In 2018 alone, more than 6.1 million individuals were impacted by healthcare data breaches. Many have started to warn of the “data breach tsunami.” Complacency is no longer viable. The increasing frequency of data breaches should become a rallying cry. When it comes to PHI, protecting the privacy and security of patients and users must be a business imperative.
Complying with regulations and requirements for protecting PHI requires a combination of robust privacy and security strategies. The Health Insurance Portability and Accountability Act (HIPAA) sets the baseline for patient data protection. For companies operating under HIPAA, responsibilities, obligations and opportunities become crystal clear. Federal laws and regulations prescribe privacy and security minimums, as well as the exact rules governing collection, storage and transfer of participant data. For health innovators, strong privacy practices and security controls are key to customer trust and to growth.
This also means that digital health companies must be active participants in shaping the regulations that govern their operations. This isn’t a call to hire as many lobbyists as possible to water down your responsibilities; it’s a demand to educate the state and federal policymakers who will be writing the rules of the road for the next phase of healthcare. Informed policy that enables creative iteration while putting the needs of the patient at its center is imperative for the continued success of the entire industry. This is a space where regulations can be genuinely helpful: they clearly identify what not to do, which is exactly what a company needs to know to be taken seriously and operate properly in digital health.
HIPAA applies to digital health companies — whether they contract as a vendor (a “business associate”) or operate as a healthcare provider (a “covered entity”). Third parties, especially those that handle PHI, can expose health companies to data breaches and non-compliance. Any data breach suffered by a healthcare company will have serious consequences, including reputational damage, government investigations and monetary damages.
Once credibility has been tarnished, it takes significant time to rebuild trust among consumers. Fundamental to this is understanding the difference between operating in technology broadly versus in digital health, and ensuring that your organization is equipped with a deep understanding of the ins and outs of HIPAA and healthcare data; patients want to focus on getting better, not having to constantly check their privacy settings.
The healthcare industry is already fraught with risk. New laws and market forces only add to the complexities. To reach full maturity, digital health companies need to invest early in information security experts who understand the intersection of medical devices, software and regulations. Senior leadership teams must empower these experts while staying engaged on best practices and the latest threats. This runs counter to the rapid-growth mindset of venture-backed companies in other industries, but it is critical when it comes to healthcare.
If you are handling patient data, hiring a legal and compliance team is a top priority. By implementing a privacy and compliance program, you’ll be better equipped to find and correct potential vulnerabilities, reduce the risk of fraud, and promote safe, quality care.