Security Begins At Home
On June 23, 2016, a federal jury convicted Jamie Knapp, 26, a former respiratory therapist at ProMedica Bay Park Hospital in Oregon, Ohio.
Knapp was found guilty of wrongfully accessing patients' protected health information (PHI) between May 2013 and March 2014. Knapp was fully authorized to access this information; it was the use to which the PHI was put that was illegal.
In these days of HIPAA violations and court cases, the focus is predominantly on external data breaches: hacking and other forms of unauthorized system access. However, Knapp operated as an employee. Knapp was authorized. Knapp did not trigger any warning flags. There was no data breach to find. Again, Knapp was authorized.
This highlights an issue that many organizations do not like to face: a trusted employee may not actually be trustworthy. It's a tough one.
This elephant in the room has long been known by I.T. departments. They fight this battle daily, when someone wants access to additional info, or data that they currently cannot see. "Why?" we scream. "Don't you trust me?!" we plead. Well, no. Frankly, they don't. And they shouldn't. It's nothing personal. It's just good I.T. management to restrict data access to a 'need to know' level.
EHR vendors (including Sigmund Software) routinely provide granular access levels that follow role-based privileges set by the organization. Administrators get the most access: they can manage the system as well as add, delete, and edit users and their data-access levels. Clinicians may access medical data and PHI, nurses can see scheduling info, and financial teams may access billing information. Need to know. A nurse does not need to know about a patient's insurance coverage, for example, and an accountant does not need to know whether a patient took their meds. So only data relevant to each role is available. This granular access increases security, and helps keep an EHR running optimally by reducing unnecessary data flow across the network. Bonus.
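The role-based model described above can be sketched in a few lines of code. This is a minimal illustration of the 'need to know' principle, not any vendor's actual implementation; the role names and permission labels are hypothetical.

```python
# A minimal sketch of role-based access control (RBAC).
# Role and permission names are illustrative only, not a real EHR schema.

ROLE_PERMISSIONS = {
    "administrator": {"manage_users", "medical_records", "scheduling", "billing"},
    "clinician":     {"medical_records"},
    "nurse":         {"scheduling"},
    "financial":     {"billing"},
}

def can_access(role: str, resource: str) -> bool:
    """Grant access only if the role's permission set includes the resource."""
    return resource in ROLE_PERMISSIONS.get(role, set())

# 'Need to know' in action: a nurse sees scheduling, but not billing.
print(can_access("nurse", "scheduling"))  # True
print(can_access("nurse", "billing"))     # False
```

Note that an unknown role gets an empty permission set, so access defaults to denied. That fail-closed default is the whole point of the approach.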
At the heart of the 'need to know' system, the aim for any organization is to minimize the potential for damage. We are forced to play the "What if..." game. What if a trusted employee turns out to be less than trustworthy? What if a high-access user has their password stolen? What if your CEO leaves a laptop in a cab? These are all important considerations for any I.T. team worth its salt. These and other considerations are all built into the mandatory risk analysis assessment that all healthcare organizations must prepare... You did do your mandatory risk analysis, right? Phew. Good!
The next time you, as a user, are frustrated about not being able to access certain data that you feel you absolutely must have, remember that there are often some very good reasons why you can't get at it. These are often regulatory, and equally often organizational. Again, this is not personal. The restrictions are there as much to protect you as they are to protect the patients, the organization, and other team members. Think of it this way: If you don't have access, you can't get the blame. So relax.
Knapp is one example of what may happen when trust is abused. Without doubt, Knapp is an exception to the vastly overwhelming rule. Knapp also demonstrates why constant care and vigilance are required, and why we are each accountable for system security. For example, if a colleague says they forgot their password and asks for your login, tell your supervisor immediately. Why? To be safe. To protect. Because that trusted colleague, let us call them Jamie Knapp, could do things with PHI that have your login fingerprints all over them. Maybe they won't, not good old Jamie. But they might. You really don't want that. Even if you can prove later that it wasn't you, they still used your login... so maybe you were an accomplice. You don't get off the hook that easily. Mud sticks.
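Those "login fingerprints" exist because EHR audit trails record the authenticated user ID, not who was actually sitting at the keyboard. The sketch below illustrates the idea; the function and field names are hypothetical, not taken from any real system.

```python
# A minimal sketch of why shared logins are dangerous: the audit trail
# attributes every action to the credentials used, not the actual person.
# Names here are illustrative, not from any real EHR.
from datetime import datetime, timezone

audit_log = []

def access_phi(user_id: str, patient_id: str) -> None:
    """Record every PHI access against the logged-in user ID."""
    audit_log.append({
        "user": user_id,  # whoever's credentials were entered
        "patient": patient_id,
        "time": datetime.now(timezone.utc).isoformat(),
    })

# A colleague borrows your login: the record still points at you.
access_phi("your_login", "patient_123")
print(audit_log[0]["user"])  # your_login
```

From the log's point of view there is no difference between you and anyone holding your credentials, which is exactly why a 'give me your login' request should go straight to a supervisor.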
And if you don't tell your supervisor, that rogue colleague may keep asking other colleagues for logins until one of them says yes... you don't want that, either. If a supervisor is discreetly told by three different colleagues that a certain someone was asking for login information, they can take appropriate action. They can't if they don't know. To be clear, we’re not saying that we should all be on guard against our co-workers. We’re saying that one bad apple can leave a very bad taste, and that there are good reasons why advising a supervisor of a ‘give me your login’ request is a standing instruction to all EHR users in all organizations. It’s not personal. It’s good practice. Knapp shows us why.
No patient wants their PHI available over the Internet. In our increasingly litigious society, no organization wants to be on the hook for a data breach. And no individual wants to be identified as the source of such a breach, whether it was intentional, as in this case, or accidental.
The overall point of this article is simple. Every person in a healthcare organization is a link in a chain, and any chain breaks at its weakest link, be that a person or a procedure. We are honor-bound to protect ourselves and the patients in our care. And that includes protecting the system itself.