While most healthcare professionals and stakeholders are aware of HIPAA, many do not fully appreciate the significance of this piece of legislation in their day-to-day operations, particularly in the area of security.
At the same time, there are very real consequences for organizations that fall short of HIPAA standards. Putting off HIPAA compliance can be an expensive mistake. If your organization collects health information, getting compliant now will help you avoid fines, legal fees, and damage to your reputation.
As organizations rely more heavily on electronic and online channels to gather, store, and share patient data, practicing HIPAA compliance becomes more complex.
This page will walk you through the essentials of HIPAA, from general HIPAA compliance standards to what constitutes a HIPAA violation and how to prepare for a HIPAA audit.
HIPAA, or the Health Insurance Portability and Accountability Act, is a piece of legislation comprising five titles, intended to safeguard individuals’ personal health data and their access to health insurance. It is enforced by the Office for Civil Rights (OCR) within the US Department of Health and Human Services (HHS).
When you hear healthcare providers or other non-insurance organizations talking about HIPAA compliance, they typically are referring to Title II. This is the set of standardized, national guidelines governing how organizations secure and share protected health information (PHI) or electronic protected health information (ePHI). (On this page, we’ll use both terms interchangeably.) The aim of Title II is to simplify healthcare administration while preventing fraud and abuse resulting from inappropriate use of PHI.
HIPAA legislation is more than two decades old, but the application of HIPAA standards has changed as the infrastructure of the healthcare industry evolves. Furthermore, the regulations themselves have changed. There are now many provisions of HIPAA that relate specifically to the electronic storing and sharing of ePHI and new updates are expected to be proposed in the coming year.
Securing ePHI becomes especially complex when this data is stored or shared in the cloud. Electronic tools like care management, self-service applications, and mobile health apps all increase the security risks facing health data. Because of this, an organization’s IT team should be a close partner in establishing and maintaining HIPAA security compliance. Particularly when using cloud services, your IT department needs to take specific steps to make your cloud environment HIPAA-compliant.
The short answer to that question is: extremely important. Patients entrust their healthcare provider or provider of other health services with some of their most sensitive personal data. HIPAA guidelines help those organizations maintain that trust and hold them accountable for how they handle patient data. When that data is misused, patients suffer. Furthermore, there are legal and financial consequences for organizations that fail to fully adhere to HIPAA compliance guidelines.
If your organization or one of its employees is found to be non-compliant at any point, you may face civil or criminal penalties from the federal government. Civil penalties apply when the OCR determines that the violation was not willful and can carry a fine of up to $25,000 per violation. The criminal penalty for a willful HIPAA violation by an individual is a fine of up to $250,000 and/or a prison term of between 1 and 10 years per violation. In either case, non-compliant organizations may also face litigation by patients and reputational damage.
Often, if an organization has mishandled or failed to protect data in one instance, it has made the same errors across multiple cases. If suspected of breaching HIPAA, an organization will be subjected to a HIPAA audit that can uncover other instances of negligence and result in additional per-violation fines. (You can find detailed information on how to pass a HIPAA audit further down the page.)
While an organization may not set out to intentionally misuse or abuse patient data, ignorance is not an acceptable excuse. Furthermore, organizations are held accountable not just for how they use and protect PHI but also for how any partners or contractors use the data provided to them by the organization. It is, therefore, essential that your organization agrees with any third-party entities on a HIPAA-compliant data governance strategy and cooperates only with trusted partners. Additionally, organizations should have an executed business associate agreement (BAA) in place with all third parties storing, processing, and/or managing PHI.
The vast majority of HIPAA violations occur through security breaches via theft, loss of portable devices, or hacking. Your organization’s security policy is one of the most important factors in how successful it will be in avoiding HIPAA breaches. Implementing strict policies for how devices and networks are secured is essential for HIPAA compliance, as is securing all data within your cloud environment.
“HIPAA compliance” means adherence to HIPAA standards with regard to how protected health information is stored and shared. Any discussion of HIPAA compliance, of course, has to start by answering the question, “What is protected health information?”
Protected health information, or PHI for short, is patient data that is protected by HIPAA. PHI that is electronically collected, stored, or shared is called ePHI. On this page, we will use the acronym PHI to refer to both forms of protected health information.
HIPAA legislation defines PHI as data relating to the past, present, or future health of an individual and that can be used to identify that individual. Examples of PHI are:
Date of birth, date of admission, date of discharge, and other key, individual dates
Biometric identifiers like finger or voiceprints
Device serial numbers and identifiers
Diagnosis and treatment details
Account number, medical records numbers, driver’s license numbers, social security numbers, health plan beneficiary numbers, and certificate numbers
Any other unique number that can be used to identify the individual
IP address numbers and URLs
Names of relatives
Email addresses and phone numbers
Photographs of patient or user
Geographic information smaller than a state, excluding the first three digits of a zip code when the combined population of all localities sharing those first three digits is over 20,000 people
PHI can be found in many places — internal communications between medical staff, emails from patients, billing information, and appointment scheduling tools are chock full of it.
All of this information can be used by healthcare professionals to identify an individual and determine the appropriate care. Failure to adequately secure this information compromises patients’ privacy and exposes them to risks like identity theft and blackmail.
It may come as a relief to learn that not all data that passes between patients and providers is considered PHI. Employment records of a covered entity, for example, are not considered to be PHI; nor are Family Educational Rights and Privacy Act (FERPA) records.
Remember, the distinguishing feature of PHI is that it is personally identifiable. Data like the number of steps taken or calories burned, as collected by fitness apps, is not PHI because it cannot be used to identify an individual. Likewise, health data that does not contain personally identifiable information (PII), such as standalone blood sugar readings or heart rate, is not considered PHI.
If you’re wondering whether or not your organization handles PHI, there is a simple test: if your device or application (including organization computers or mobile app) stores, records, or transmits health information that can be used to identify an individual, then you’re dealing with PHI and need to practice HIPAA compliance.
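That test can be sketched in code. The field names below are hypothetical and chosen only for illustration; real PHI detection is considerably more involved than a key lookup:

```python
# Minimal sketch of the "simple test" above: a record is treated as PHI
# when it pairs health information with any personally identifying field.
# The field names are hypothetical, for illustration only.

HEALTH_FIELDS = {"diagnosis", "treatment", "blood_type", "medication"}
IDENTIFYING_FIELDS = {"name", "email", "ssn", "date_of_birth", "phone"}

def looks_like_phi(record: dict) -> bool:
    """Return True if the record combines health data with an identifier."""
    keys = set(record)
    return bool(keys & HEALTH_FIELDS) and bool(keys & IDENTIFYING_FIELDS)
```

Under this rule, a record pairing a name with a diagnosis would count as PHI, while a bare step count from a fitness tracker would not.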
A second key term for HIPAA compliance is “covered entity”. HIPAA legislation uses the phrase “covered entities” to refer to any organization that collects, transmits, or stores any PHI via products or services.
So how does your organization stay HIPAA compliant?
The channels through which PHI is stored and shared have changed significantly since the Health Insurance Portability and Accountability Act was passed in 1996. Instead of tucking away patient records in filing cabinets, most healthcare organizations store PHI in computer or cloud databases and interact with that data digitally. In response, the Health Information Technology for Economic and Clinical Health (HITECH) Act introduced revisions that limit the ways in which organizations can legally share patient information. It also outlined specific technical requirements for how to store PHI.
The HIPAA Security Rule is an amendment to HIPAA law that addresses electronic protected health information (ePHI). The requirements of this amendment are what you need to pay attention to when making a plan for HIPAA IT compliance. The HIPAA Security Rule outlines three main categories of HIPAA safeguards for ePHI:
Administrative safeguards
Physical safeguards
Technical safeguards
Covered entities must fulfill the requirements for all three categories in order to fulfill HIPAA obligations.
Getting and keeping your organization HIPAA compliant across all the digital channels and platforms it uses can be achieved in 9 steps.
The first half of 2018 alone saw 1.12 million health records exposed in a total of 110 breaches.
Having a breach protocol in place in order to respond if and when a breach occurs is vital for HIPAA compliance.
Organizations, of course, strive to reduce the likelihood of any data being exposed through a security breach. However, security is never guaranteed.
A HIPAA-compliant breach protocol should outline:
When to report
Who to inform
Investigation of the breach
Findings of the investigation, including identification of the root cause
Procedure for mitigation of the breach
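As a sketch, the elements of such a breach protocol could be captured in a structured report that your response team fills in as the incident unfolds. The field names below are illustrative, not a regulatory template:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BreachReport:
    """Illustrative record mirroring the breach-protocol elements above."""
    discovered_at: datetime                          # when to report: the clock starts here
    notify: list[str] = field(default_factory=list)  # who to inform (e.g., OCR, affected patients)
    investigation_notes: str = ""                    # investigation of the breach
    root_cause: str = ""                             # findings, including the root cause
    mitigation_steps: list[str] = field(default_factory=list)  # procedure for mitigation
```

A real protocol would also encode the Breach Notification Rule's reporting deadlines; this sketch only shows how the protocol's elements map to a single auditable record.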
HIPAA requires every organization to have a designated individual to supervise developing and implementing its HIPAA compliance program. Typically, organizations have both a HIPAA Privacy Officer and a HIPAA Security Officer. Smaller organizations may combine these roles.
A HIPAA Privacy Officer is the person who will be accountable for developing and implementing security policies for correctly handling PHI according to HIPAA standards. This includes:
Conducting security risk assessments
Investigating possible breaches
Responding to breaches if they occur
Ensuring that patients’ rights are protected as outlined in state and federal laws
Additionally, the HIPAA Privacy Officer typically oversees employee privacy and security trainings.
The duties of a HIPAA Security Officer are similar to those of a Privacy Officer, although the focus of the Security Officer should be specifically on upholding the Security Rule and often involves the more technical aspects of the business. This rule requires covered entities to implement appropriate administrative, physical and technical safeguards to protect ePHI.
Organizations are not always aware of the security risks facing them. Conducting risk assessments regularly helps pinpoint weaknesses before hackers or other cybercriminals exploit them. Organizations should be proactively adjusting their policies and practices in response to security risk assessment findings.
HIPAA requires organizations to conduct a HIPAA risk assessment at least once a year.
Most organizations digitally communicate important patient information, including PHI, both internally and with patients. Under HIPAA, this is an acceptable practice as long as the platforms in use meet specific security standards.
First, let’s look at email. In order to ensure accountability for communication of PHI and to prevent PHI from being compromised in transit or at rest, HIPAA requires certain controls, such as encryption, to be put in place for email communications.
Texting is more strictly controlled. Generally, it is a violation of HIPAA to communicate PHI via text message except when using an application specifically designed to meet HIPAA compliance standards. These applications are encrypted and store photos, other images, and messages in-app rather than on the physician’s device.
It is important to keep these requirements in mind when considering all the devices and digital communications involved in your organization. If you have a Bring Your Own Device (BYOD) program or employees take work home with them, then your Privacy Officer or Security Officer needs to ensure that their devices are equipped with the appropriate applications, permissions, and encryptions. Outlining clear, written policies for how employees should handle PHI on their devices is required to be HIPAA compliant and will help avoid accidental HIPAA violations.
An example of an employee policy for storing PHI could be:
“Never store unencrypted files containing protected health information (PHI) regulated by HIPAA on your desktop, laptop, USB flash drive, tablet, smartphone, or other mobile devices. Encrypted files can be stored on these devices only after your senior executive officer has given prior written approval. Any device holding PHI must have full-disk encryption to ensure that PHI will be protected at rest. If you are not sure if your device is appropriately encrypted, contact the HIPAA Security Officer.”
While encryption and other security measures are your first layer of protection for PHI, your employees are the gatekeepers for that data. Any PHI protection plan is incomplete without employee education; in fact, HIPAA requires it.
HIPAA training should inform all relevant staff of your organization’s privacy and security practices for handling PHI. HIPAA education should also be a mandatory part of new employee onboarding. These trainings should ensure that each person clearly understands his or her obligations under HIPAA, as well as the consequences for violating HIPAA compliance.
HIPAA law only requires covered entities to hold HIPAA trainings when updates to the law are made. However, technology and the ways individuals use it change constantly (and memories are not perfect). Best practice is to hold refresher trainings every year or two, along with mandatory trainings for new hires.
All covered entities are required to send patients a Notice of Privacy Practices (NPP). This document should inform patients in clear, easy-to-understand language of their individual rights regarding PHI and of the privacy practices of your organization. Whenever your organization makes a change in its privacy practices, patients must be informed of them.
NPPs must be available online and in written form. Additionally, covered entities are required to obtain each patient’s signature on the NPP confirming that they have read it and consent to their PHI being used in the ways denoted in the document. NPP templates are available from the US Department of Health and Human Services.
Your organization likely shares PHI with third parties like cloud servers, backup storage vendors, email encryption providers, or IT security vendors. Under HIPAA, your responsibilities for securing PHI extend to ensuring appropriate use of PHI by third parties, as well. Because of this, a Business Associate Agreement (BAA) is a key component of HIPAA compliance.
This document establishes each party’s obligations to uphold HIPAA law and protects you, as a covered entity, from prosecution for misuse of PHI by a business associate. You can find more detailed information on who needs a Business Associate Agreement further down on this page.
Technical safeguards help protect PHI by regulating access to that data by individuals within your organization and anticipating outside security breaches. Encryption is an essential technical safeguard. Encrypting records once they leave your own firewall means that data will be unreadable to an unauthorized third party.
Other HIPAA-required technical safeguards are:
Establish who has access privileges to PHI and assign each accessor a unique name and password. This both reduces the risk of misuse of PHI by an unauthorized individual and ensures that in case misuse occurs, your organization can identify the individual behind it.
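A minimal sketch of this idea follows, with hypothetical user IDs and privilege sets. Any real system would back this with authenticated identities rather than an in-memory table:

```python
# Sketch of the access-control safeguard above: every accessor has a
# unique ID, and each action is checked against that ID's privileges.
# User IDs and privilege names are hypothetical, for illustration only.

PRIVILEGES = {
    "nurse_jdoe": {"read"},
    "dr_asmith":  {"read", "write"},
}

def can_access(user_id: str, action: str) -> bool:
    """Allow the action only if this unique user ID holds the privilege."""
    return action in PRIVILEGES.get(user_id, set())
```

Because every access is tied to a unique ID, any misuse can be traced back to a specific individual, which is exactly the accountability the safeguard calls for.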
Additionally, passwords used to access PHI must be secured and a protocol for creating and changing passwords needs to be put in place. HIPAA law does not specify what those protocols should be. Current best practice is implementing a 2-factor authentication system for logging in and changing passwords, while some experts recommend changing passwords every 60 to 90 days.
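As an illustration of the one-time-password half of such a 2-factor scheme, here is a minimal time-based one-time password (TOTP) built on the HOTP algorithm from RFC 4226. A production system should rely on a vetted library rather than this sketch:

```python
import hmac
import struct
import time
from hashlib import sha1

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """RFC 6238 time-based variant: the counter is the current 30s window."""
    return hotp(secret, int(time.time()) // step)
```

For example, with the RFC 4226 test secret `b"12345678901234567890"`, `hotp(secret, 0)` produces the documented test value `"755224"`.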
These controls allow you to monitor who has accessed PHI and what actions the accessing individual took. Audit controls let organizations ensure the integrity of PHI in real-time and provide a record of use in case of suspected misuse.
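As an illustrative sketch, an audit control can be as simple as recording who touched which record before the access happens. The function and logger names here are hypothetical:

```python
import functools
import logging

audit_log = logging.getLogger("phi_audit")  # hypothetical audit logger name

def audited(action: str):
    """Decorator: log who performed which action on which record."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(user_id: str, record_id: str, *args, **kwargs):
            audit_log.info("%s performed %s on record %s", user_id, action, record_id)
            return fn(user_id, record_id, *args, **kwargs)
        return inner
    return wrap

@audited("read")
def read_record(user_id: str, record_id: str) -> str:
    return f"contents of {record_id}"  # placeholder for a real data fetch
```

Routing every PHI access through such a wrapper yields the tamper-evident usage trail that audit controls are meant to provide.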
Automatic logoff helps prevent authorized individuals from inadvertently allowing others to access PHI, particularly in the event of loss or theft of a device containing PHI.
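A minimal sketch of an automatic logoff mechanism follows; the timeout value and names are illustrative, and a real deployment would pick the idle window based on its own risk assessment:

```python
import time

class Session:
    """Idle-timeout session sketch: expires after `timeout` seconds of
    inactivity, forcing re-authentication (the default is illustrative)."""

    def __init__(self, user_id: str, timeout: float = 900.0):
        self.user_id = user_id
        self.timeout = timeout
        self.last_active = time.monotonic()

    def touch(self) -> None:
        self.last_active = time.monotonic()  # any user action refreshes the timer

    def is_active(self) -> bool:
        return time.monotonic() - self.last_active < self.timeout
```

An unattended workstation thus locks itself out after the idle window elapses, rather than leaving a PHI view open indefinitely.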
Additionally, standard operating procedures (SOPs) and configuration management for cloud services are an important part of HIPAA compliance. Defining a predictable configuration process makes your cloud security environment more robust by reducing human error. SOPs are necessary because it is typically the organization’s responsibility to implement necessary security controls for each individual service.
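As a sketch of the kind of check an SOP might automate, the snippet below compares each service's configuration against a mandated baseline. The setting names are hypothetical, not any provider's real API:

```python
# Sketch of an SOP configuration check: flag any cloud-service setting
# that deviates from the security baseline the SOP mandates.
# The keys below are hypothetical, for illustration only.

REQUIRED = {
    "encryption_at_rest": True,
    "access_logging": True,
    "public_access": False,
}

def config_violations(service_config: dict) -> list[str]:
    """Return the settings that deviate from the mandated baseline."""
    return [key for key, wanted in REQUIRED.items()
            if service_config.get(key) != wanted]
```

Running such a check as part of every deployment turns the SOP from a document into a repeatable, low-error process.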
Once your organization has outlined its policies, educated its employees, and put the supporting infrastructure in place, you are ready to implement your organizational policies. These policies should outline standard operating procedures for implementing and maintaining technical security controls, and they should be built around HIPAA-eligible cloud services and any additional administrative security configurations.
HIPAA requires all covered entities to sign a Business Associate Agreement (BAA), or Business Associate Contract, with any business associate that may come into contact with PHI. Signing a BAA is a key step in HIPAA compliance. This document specifies how PHI is being safeguarded and defines responsibilities for both parties.
Software vendors are typically required to enter into a BAA when working with health providers and health systems.
In the context of PHI, HIPAA legislation uses the phrase “covered entities” to refer to any organization that collects, transmits, or stores any protected health information about individuals via products or services.
Examples of covered entities include:
State and local government agencies that are responsible for administering health care
State and local government income assistance and human service agencies
Hospitals and Medicaid and Medicare providers
Family health centers
Physicians and other healthcare professionals in private practice with patients assisted by Medicaid
Community mental health centers
Alcohol and drug treatment centers
Nursing homes and foster care homes
Public and private adoption and foster care agencies
Daycare centers and senior citizen centers
Health applications that are also business associates of the covered entity or that are operated by the covered entity
Any entity established under the Affordable Care Act
The Omnibus Rule, established in 2013, significantly expands potential liability for covered entities. Firstly, it extends the category of “business associate” to include subcontractors of business associates. Secondly, it holds covered entities responsible for acts or omissions by their business associates that compromise PHI if a BAA was not already in place at the time of the breach.
If your organization entrusts any PHI to a third party such as a cloud server, you are required by law to have a BAA.
Common examples of HIPAA-covered business associates include:
Cloud providers (such as AWS, Google, and Microsoft Azure)
Data conversion, de-identification, and data analysis service providers
Medical transcription companies and answering services
File sharing vendors
Email encryption vendors
Patient safety or accreditation organizations
Companies involved in claims processing, repricing, or collections (e.g., medical billing companies)
Health information exchanges (HIEs), e-prescribing gateways, and other HIOs
Third-party administrators and pharmacy benefit managers
A BAA outlines the responsibilities of both parties in their handling of PHI according to HIPAA standards. This document serves as protection for your organization. It gives you the right to take legal action against a business associate that breaks the BAA and can protect you from prosecution if a business associate violates HIPAA regulations without your knowledge. Not signing a BAA with third-party companies means that your organization can be prosecuted for misuse of data by any of those third parties.
The HHS Office for Civil Rights (OCR) regularly conducts HIPAA audits of healthcare organizations. The purpose of these audits is to confirm the compliance of covered entities and their business associates with the HIPAA Privacy, Security, and Breach Notification Rules. Organizations that have had a security breach may be subject to a HIPAA audit and potential monetary penalties. The Audit Protocol was last updated in July 2018.
How to prepare for a HIPAA audit is an important question, and its answer starts with another question: when should you prepare for a HIPAA audit?
Getting ready for a HIPAA audit starts on Day 1. If your organization is selected to be audited, you will need to submit certain HIPAA forms and documentation to the OCR that demonstrate how your organization was, is, and will be keeping itself HIPAA compliant. These documents should outline the development and implementation of your organization’s HIPAA compliance program. Because of this, it is very difficult and time-consuming to prepare for a HIPAA audit retroactively.
The Phase 2 HIPAA Audit Program assesses an organization and its business associates for both HIPAA Privacy Rule compliance and HIPAA Security Rule compliance. If audited, you will be asked to provide full documentation of how your organization ensures its workforce and business associates implement HIPAA standards as they come into contact with PHI.
You will also need to provide documentation demonstrating how your organization maintains HIPAA network compliance and secures PHI across information systems (hardware, software, information, data, applications, communications, and people). This category includes backup servers and cloud providers like AWS. (For more on AWS compliance, consult this article.)
Before getting into the specific documents required (there are many), it is helpful to consider the broader questions these documents are meant to answer:
How do we maintain a strong general security stance?
Where are our vulnerabilities and how are we addressing them?
How secure are our workstations and facilities?
Do our employees and BAs understand how to protect PHI?