AI is coming to healthcare, and attorney Rebecca Williams urges vendors to enter the fray with their eyes on HIPAA regulations as they develop AI-assisted software for the healthcare industry.
Williams, a registered nurse, is the chair of the Health Information Technology/HIPAA Practice Group at Davis Wright Tremaine LLP in Seattle, Wash. She focuses much of her practice on privacy, security and healthcare regulatory issues.
In this Q&A, Williams says vendors need to be well aware of HIPAA issues when incorporating AI into healthcare software. To vendors, she says, "you don't want to find yourself one day having to be in full compliance and you're not. You really want to be planning this and recognizing that complying with HIPAA may be a legal necessity."
Williams offers guidelines for ensuring AI-assisted healthcare software meets HIPAA requirements.
Why are you trying to raise vendor awareness about potential HIPAA issues related to developing AI-assisted healthcare tools?
Rebecca Williams: Healthcare often involves life-and-death and very sensitive matters. It also is a heavily regulated industry. We need to be using AI in a safe and ethical manner and in a way that we can protect the confidentiality, privacy and security of this very personal information. It is important for developers and vendors of AI, as well as healthcare providers and health plans, to understand what laws apply when exploring ways to use AI.
Regarding HIPAA issues, what is the message you want vendors to hear?
Williams: Approach the intersection of healthcare and AI with your eyes wide open. Think about this proactively. Build compliance with healthcare regulatory requirements and ethical obligations into your business and operational plan. It will serve you well in the future and it is the right thing to do. You don't want to find yourself one day having to be in full compliance and you're not. You really want to be planning this and recognizing that complying with HIPAA may be a legal necessity, but it's also going to be something very important for the marketing of your product.
How important is AI in developing healthcare tools?
Williams: We are seeing AI increasingly become a part of healthcare products. It can be used for both descriptive and predictive purposes. AI can tell us how we are doing with respect to a chronic condition or overall quality of patient care. A hope is that it may be used to identify trends and factors that may contribute to medical conditions. And this may further allow healthcare to be tracked and provided remotely, available wherever the patient is.
How do you see AI transforming healthcare?
Williams: Healthcare creates a tremendous amount of information. But a challenge is how to manage and make sense of the information. How can it be available when needed? How can it be useful for the specific individual and for the population as a whole?
When you think of how much data is available and needs to be processed in healthcare -- and that would include health plans, as well as healthcare providers -- AI could really make a difference with that amount of data. Obviously, AI needs a lot of data, and the healthcare industry has a lot of data but needs a better way of organizing, processing and learning from it.
What vendor and developer activities might trigger HIPAA obligations?
Williams: Generally, when an AI vendor is dealing directly with consumers, it is unlikely that HIPAA will be triggered although other laws apply to protecting personally identifiable information. HIPAA tends to come into play when AI developers and vendors deal directly with healthcare providers and health plans. If a vendor is providing certain services, activities and functions on behalf of an entity covered by HIPAA and protected health information is created, received, maintained or transmitted as part of those services, activities and functions, then the vendor likely is a business associate and is directly covered by HIPAA. The covered provider, plan or clearinghouse must enter into a business associate contract with any business associate. The AI functionality also should be included in the covered entity's HIPAA risk analysis under the HIPAA Security Rule.
What types of HIPAA issues might lead to violations?
Williams: HIPAA carries both civil and criminal penalties. An AI vendor that is a business associate must comply with the privacy, security and breach notification obligations under HIPAA. That means the business associate must implement privacy, security and breach notification policies; train its workforce on those policies; impose sanctions or disciplinary actions on workforce members who fail to comply with the policies; perform -- and regularly revisit -- a HIPAA risk analysis that identifies the risks and vulnerabilities to protected health information; implement a risk management process to bring those risks to a reasonable level; establish an incident response plan; and appoint a security (and possibly a privacy) official. Failure to take these steps could result in enforcement actions. Covered entities also must have business associate contracts in place with each business associate. Failure to do so can result -- and has resulted -- in enforcement actions.
What are key questions related to HIPAA issues that vendors should ask when developing AI-assisted healthcare tools?
Williams: Some critical questions include: To whom is the AI vendor providing services? Covered entity or business associate versus consumer? Is the AI ingesting protected health information, which is broadly defined and can include demographic information and patient-plan lists if linked with individuals' healthcare? To whom is the AI marketed? Who is paying? What activities and services are being performed? The key is to think through the arrangement on the front end. Know what laws are triggered. Respond accordingly.