Guest Blog: ID Experts’ Chris Apgar.
Every privacy professional knows that risk analysis is a foundation for successful information privacy and security, just as flossing your teeth is a foundation for good oral health. If you’re in healthcare, you also know that risk analysis is one of the five core Office for Civil Rights (OCR) “culture of compliance” requirements, and a prerequisite to receiving “meaningful use” dollars for implementing electronic health records (EHR). But what you may not know, according to nationally recognized information security expert and former HIPAA Compliance Officer Chris Apgar, is that compliance is not the biggest reason for conducting ongoing risk analysis. The biggest reason is that it can save your business.
OCR audits are proceeding, and failure to conduct risk analysis can result in a finding of “willful neglect,” with penalties of up to $50,000 per incident and up to $1.5 million per calendar year for the same type of violation (and any such finding will typically involve multiple types of violations). That risk alone justifies the cost of conducting risk analysis. A thorough risk analysis also provides a strategic roadmap for security spending. Yet Apgar says that even now, when he speaks to groups about medical data privacy, only about one-third of healthcare organizations that are not seeking “meaningful use” dollars indicate that they have conducted a risk analysis. This is dangerous, he points out, because by deferring the analysis they may fail to identify other risks, such as lawsuits, civil penalties, and loss of reputation, that could damage or destroy their business.
Here are three other things Chris Apgar says you need to know about risk analysis:
- Confidentiality is not enough. The three pillars of security are confidentiality, availability, and integrity, and risk analysis needs to account for all three. Yes, you want to prevent data breaches, but that’s not enough. For example, what happens if a patient is in critical care, systems go down, and doctors lose access to critical information they need to make medical decisions? Data corruption can be even more serious: if doctors unknowingly make bad healthcare decisions based on corrupt information, lives can be lost.
- Technical security is not enough. Apgar says that, too often, when an organization looks at risks, it looks only at the digital side, but PHI risks extend far beyond technical infrastructure. You need to look at every place where PHI lives, in any form, and at everyone who touches it. For example, encryption can mitigate risk in a security-related incident involving electronic records, but you can’t encrypt paper. So if paper records are lost, that is by definition a security incident and potentially a reportable breach. People and process risks also have to be assessed as part of the security plan. One privacy officer Apgar worked with pointed out that he and the other compliance professionals in the organization had to be considered both organizational assets and liabilities: at that time, they were the only ones who knew how to respond to an incident, and if they were unavailable, the organization would be at risk.
- There’s more than one way to become a covered entity. A new Texas healthcare privacy law goes into effect this month. Apgar says that, in addition to imposing non-compliance penalties over and above the federal ones, it uses a broader definition of covered entities. Under the Texas law, any organization that handles any sort of electronic healthcare information, no matter its role in the healthcare system, is covered by the new privacy requirements and considered a covered entity. So, for example, a small dental practice that transmits HIPAA-covered transactions in Texas is now a covered entity under Texas law. In addition, business associates and subcontractors could now face non-compliance fines from both OCR and the state of Texas. Other states, including California and Massachusetts, also regulate healthcare information heavily. A thorough and ongoing risk analysis program is necessary to keep organizations of all sizes abreast of new risks and requirements at both the state and federal levels.
Apgar has a number of practical recommendations for conducting risk analysis.
- Successful risk analysis begins with a thorough inventory that accounts for all assets: digital, physical, and human. He points out that you need that inventory anyway to create a disaster recovery plan, and that keeping it current makes the initial risk analysis and subsequent updates relatively simple, because you have a baseline to work from.
- Think about things inside the organization that can hurt you. “Threats” are unpredictable outside factors, such as natural disasters and hackers, that require response plans, but “vulnerabilities” are weaknesses you can address to head off trouble. For example, you can help prevent network attacks by putting a process in place to ensure security patches are always kept up to date.
- The risk analysis needs to rate risks both in terms of likelihood and in terms of potential harm or impact. For example, tsunamis are unlikely in Oklahoma, so they don’t need to be part of an Oklahoma hospital’s disaster recovery plan, and unauthorized access to a single patient record showing on a computer screen is likely to cause far less damage than a stolen computer full of patient records from a lab’s business office. Once you’ve made a reasonable assessment of likelihood and potential impact, it will become clear how best to spend your security budget and resources.
- Don’t stop with the risk analysis. Meaningful use requires risk analysis, documentation, a mitigation plan, and implementation of a risk management program. Whether or not your organization is seeking meaningful use dollars, knowing about a risk offers little protection if you don’t act on the knowledge and implement steps to manage risk throughout the year.
- If you bring in experts to conduct a risk analysis or to help your staff conduct one, look for someone who has done this before in healthcare and who has a track record with your type of healthcare business. Make sure their products and services address more than just technical security, and check references, of course, but also ask colleagues about their reputation. Word travels fast in the healthcare industry, and word on the street may tell you things that you won’t find out in a reference check.
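The inventory-first approach described above can be sketched in a few lines of code. The categories, fields, and entries below are hypothetical illustrations of one way to keep a baseline, not a prescribed format:

```python
# Illustrative asset-inventory sketch covering digital, physical,
# and human assets. All fields and entries are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    category: str   # "digital", "physical", or "human"
    location: str
    owner: str      # who is responsible for the asset

inventory = [
    Asset("EHR database server",  "digital",  "data center",  "IT director"),
    Asset("Paper charts, room 3", "physical", "records room", "records clerk"),
    Asset("Privacy officer",      "human",    "main office",  "compliance dept"),
]

# A current inventory gives the risk analysis its baseline: grouping
# assets by category makes it easy to confirm the list isn't digital-only.
by_category = {}
for asset in inventory:
    by_category.setdefault(asset.category, []).append(asset.name)

for category, names in by_category.items():
    print(f"{category}: {', '.join(names)}")
```

Kept current, a structure like this doubles as input to the disaster recovery plan, which is the point Apgar makes about reusing the inventory as a baseline.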
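The likelihood-and-impact rating Apgar recommends can be illustrated with a simple ordinal scoring sketch. The 1–5 scales and the example risk entries below are assumptions for illustration, not values from any standard:

```python
# Illustrative risk-scoring sketch: rate each risk on a hypothetical
# 1-5 ordinal scale for likelihood and for impact, then rank by the
# combined score to see where budget and resources should go first.

def risk_score(likelihood, impact):
    """Combine likelihood (1-5) and impact (1-5) into a score (1-25)."""
    return likelihood * impact

# Hypothetical risk-register entries: (description, likelihood, impact)
risks = [
    ("Tsunami at an Oklahoma facility",              1, 5),
    ("Unauthorized glance at one on-screen record",  4, 1),
    ("Theft of a computer full of patient records",  3, 5),
]

# Highest score first: the ranking, not the raw numbers, drives spending.
ranked = sorted(risks, key=lambda r: risk_score(r[1], r[2]), reverse=True)
for name, likelihood, impact in ranked:
    print(f"{risk_score(likelihood, impact):>2}  {name}")
```

The ranking captures the article's example: a rare-but-severe event and a likely-but-minor one both score lower than a plausible event with severe impact, such as the stolen computer.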
Chris Apgar says the most critical thing to realize about risk analysis is that it stretches beyond what the regulations require. “There are so many other risks: the risk of being sued, of losing your practice, of causing harm to your patients. Yes, doing risk analysis costs time and money, but not doing it is a good way to lose more money or lose your business.”