As in past years, the challenges around data privacy and security were a familiar refrain at Health Datapalooza. You could hear chatter and jokes about HIPAA in the hallways and side conversations. Perhaps I’ll be the first (and last) to say, if you’ve heard one offhand “Is that a HIPAA violation?” joke, you’ve heard too many.
But beyond the bad puns, two important themes resonated from this year’s data privacy and security conversations – access and trust. These themes echoed those from AcademyHealth’s recent collaboration with the California Health Care Foundation (CHCF), which we have written about previously on this blog, to help health care entrepreneurs understand their responsibilities under HIPAA. Over the last few months, AcademyHealth has engaged with startups in California and elsewhere through a series of workshops and will soon launch a short eLearning course. A key takeaway for digital health startups has been the realization that increasing trust and access can benefit both their users and their products.
In his remarks on the first day of Datapalooza, Congressman Michael C. Burgess, M.D. (R-TX), helped kick off the meeting by emphasizing the importance of providing patients access to their own data, specifically noting how access can help patients and providers make the right health care decisions. This theme was reiterated in the work of the Data Liberator Award winner, OpenNotes, an initiative that encourages health care providers to share their visit notes with patients.
Later that morning, the new Director of the U.S. Department of Health and Human Services Office for Civil Rights (OCR), Roger Severino, called on attendees to share their data privacy and security challenges and examples to help OCR ensure that HIPAA and other regulations are appropriately adapted to new technologies and use cases. He noted OCR’s continuing focus on HIPAA enforcement, but also highlighted the bigger-picture issue of how data breaches and security missteps erode people’s trust. Such incidents often create fear among patients and consumers about how safe their data truly is and, more importantly, harm relationships (e.g., patient/provider, patient/app) that rely on trust and mutual respect.
A series of related panels throughout the meeting helped attendees take a deeper dive into these issues. During the Digital Health, Innovation, and Security panel, Aaron Miri of Imprivata provided my favorite Health Datapalooza quote: “Don’t be creepy with the data.” This means thinking seriously about consumer choice and ensuring that users have the right to revoke permissions, as well as emphasizing informed consent and educating users about how applications share their data. A final point that triggered a lot of head nodding was to remember to “not take their word for it” when working with vendors and contractors – or, put another way, “trust, but validate.”
As we move beyond the 20th anniversary of HIPAA, an audience member asked why we still have so many privacy and access challenges. Panelists and others acknowledged that information blocking and limited business cases (read: costs) have slowed progress, but noted that we must also remember how much health technology has transformed over that time frame. This led to some great general advice: the next time someone says “we can’t do that because of HIPAA,” ask them if they have read the regulations or the guidance provided by OCR, and if not, perhaps print out a copy of the relevant guidance and leave it on their desk. Simply put, HIPAA is important and serves a great purpose, but it shouldn’t be used as an excuse to limit access.
Not surprisingly, a panel on the last day emphasized many of the privacy best practices for startups that we have identified in our related effort with CHCF. Most notably, startups must think about data privacy and security (and, in turn, access) from day one. Not only will this likely lower long-term costs by avoiding expensive reengineering down the road, but it also allows innovators to lead with trust. Longer term, you cannot hide poor or nonexistent data policies and procedures from investors or potential buyers. As part of due diligence, they will quickly discern whether you recently scrambled to put together a privacy and security program or considered these issues from the beginning.
In closing, we all acknowledge that privacy and security are foundational to data sharing, but trust and access must be constantly maintained to ensure our great big building blocks of knowledge don’t come crumbling down. It’s up to all of us in the health data community to incentivize best practices in cybersecurity, to keep asking ourselves hard questions about why we do or do not do certain things, and, of course, to remember that real people are embedded in every data set.