In 2020 alone, over 70% of healthcare providers in the United States utilized mobile health apps.
Why is this?
The answer is relatively simple: mobile health apps have enabled patients to take control of their health safely and securely with a simple tap of a touchscreen. Most importantly, these apps allowed them to do so in a critical year when social distancing became necessary due to the COVID-19 pandemic.
So how do mobile health apps play into the larger conversation of HIPAA compliance in software? This is a critical question for all healthcare providers to consider when deciding whether or not to develop a mobile health application.
We will answer this question, explore related questions around HIPAA compliance in software development, and provide valuable resources to guide you through the first stages of the medical app developer vetting process.
HIPAA stands for the Health Insurance Portability and Accountability Act of 1996, a law enacted by the United States government to protect sensitive Protected Health Information (PHI) and monitor the use, storage, and distribution of such information. HIPAA applies to “covered entities” (CEs), including health plans, healthcare providers, and healthcare clearinghouses.
Further Reading: For more information, see The HIPAA Journal.
Three important examples of HIPAA compliance for mobile health apps include:
Secure messaging: integrated messaging tools enabling two-way communication between providers and patients.
Secure hosting: hosting services that provide baseline protection against common cyberattacks and, ideally, defenses against large-scale attacks such as DDoS.
Secure cloud storage: outsourced data storage management services like AWS, Microsoft Azure, and Google Cloud.
Note: These are just three primary examples of HIPAA compliance in software. For a thorough overview of HIPAA safeguards and requirements, read on or see the HIPAA Journal Compliance Checklist.
If you're unsure whether your application will be regulated by HIPAA, ask yourself three critical questions: who will use the app, what patient data will it handle, and how will that data be secured?
Your software must be HIPAA compliant if you're a covered entity (such as a physician, hospital, or insurance provider that works directly with patients). Business associates (that collect, store, process, or share PHI) are also subject to HIPAA. Additionally, under the 2013 Omnibus Rule, third-party vendors (including but not limited to developers) assume liability for HIPAA violations.
Note: There are some scenarios where entities do not need to be HIPAA compliant. If your software holds PHI but cannot transmit that data to a CE, you do not need to be HIPAA compliant. A classic example is a wellness tracker, such as a bracelet or watch that counts steps or calories.
If your mobile health app uses, shares, or stores protected health information (PHI), HIPAA compliance is not optional. PHI includes any medical information that could identify a patient and is exchanged between a covered entity and a business associate.
When PHI is combined with personally identifiable information (PII), the risk of a HIPAA violation grows significantly. Make sure you understand exactly what patient data will be gathered and how that data will be managed, both to protect your patients' privacy and to mitigate risk.
Application encryption refers specifically to the technology your application uses and its security standards. If your mobile health application handles PHI, at a bare minimum you'll need a data security solution that encrypts sensitive patient data and makes it accessible only to authorized parties.
By encrypting your patients' sensitive data at the application level, you secure it before it is ever stored in a database or cloud environment. On iOS, App Transport Security (ATS) can also require your app to connect to back-end servers over HTTPS (TLS), so that data is encrypted while in transit.
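As a sketch of the idea, the snippet below encrypts a record before it is persisted, so the stored bytes are unreadable without the key. It is a toy one-time-pad illustration using only Python's standard library; a real application would use a vetted cipher (such as AES-GCM via a maintained cryptography library) and a proper key-management service, and the record contents here are purely hypothetical.

```python
import secrets

def encrypt_record(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy illustration only: XOR the data with a random one-time key.

    Real apps should use a vetted authenticated cipher (e.g., AES-GCM),
    never a hand-rolled scheme like this.
    """
    key = secrets.token_bytes(len(plaintext))       # random key, same length as data
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key                          # store ciphertext; keep key separate

def decrypt_record(ciphertext: bytes, key: bytes) -> bytes:
    """Recover the plaintext; only holders of the key can do this."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b'{"patient_id": "12345", "diagnosis": "hypertension"}'
stored, key = encrypt_record(record)
assert stored != record                             # what lands in the database is unreadable
assert decrypt_record(stored, key) == record        # an authorized key holder can recover it
```

The essential design point is that the database or cloud bucket only ever sees ciphertext; the key lives elsewhere, behind your access controls.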
Tip: Contrary to popular belief, short message services (SMS) and multimedia messaging services (MMS) are not encrypted.
Now that you have a clear understanding of which entities will use your application, how your application will handle sensitive patient data, and the security standards it must meet, you can determine where you fall with regard to HIPAA compliance. Let's dig deeper into the requirements covered entities and business associates face when building mobile health apps.
Under HIPAA, four rules are pertinent to compliance: the Privacy Rule, the Security Rule, the Enforcement Rule, and the Breach Notification Rule. Detailed summaries of each rule can be found here, but together they outline the expectations and requirements for covered entities and business associates. Respectively, they establish best practices for compliance, data security standards, noncompliance repercussions, and self-accountability requirements.
To mitigate risk and stay compliant, design your mobile health application to collect only the PHI it actually needs. Collecting superfluous information only exposes you to additional requirements and risk.
This is key for security. Your app should have physical, technical, and administrative safeguards in place to protect the privacy of users' information. These can include identity authentication (e.g., two-factor authentication and facial recognition), device controls (such as session timeouts and hidden push notifications), access controls for all relevant personnel, and protocols for destroying PHI once it's no longer needed.
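To make one of those device controls concrete, here is a minimal sketch of inactivity-based session expiry; the timeout value and identifiers are assumptions, not a prescribed standard:

```python
SESSION_TIMEOUT_SECONDS = 15 * 60   # assumed policy: 15 minutes of inactivity

class Session:
    """Tracks the last activity time for an authenticated user."""

    def __init__(self, user_id: str, now: float):
        self.user_id = user_id
        self.last_activity = now

    def is_expired(self, now: float) -> bool:
        # Expired once the idle gap exceeds the timeout; force re-authentication.
        return now - self.last_activity > SESSION_TIMEOUT_SECONDS

    def touch(self, now: float) -> None:
        """Refresh the inactivity timer on each authenticated request."""
        self.last_activity = now

s = Session("clinician-42", now=0.0)
assert not s.is_expired(now=60.0)                       # recently active
assert s.is_expired(now=SESSION_TIMEOUT_SECONDS + 1.0)  # idle too long -> log out
```

In a production app the same check would typically live in server-side middleware, with timestamps taken from a trusted clock rather than passed in by the caller.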
To be HIPAA compliant, CEs and BAs need to hold themselves accountable by recording and tracking every action involving PHI and assessing the associated risks. You can also use the risk assessment tool provided by the Department of Health and Human Services to identify potential security risks to your mHealth app.
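The recording-and-tracking piece amounts to an append-only audit trail of who did what to which record, and when. A minimal sketch, with hypothetical user and record identifiers:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEvent:
    who: str        # authenticated user performing the action
    action: str     # e.g., "view", "update", "delete"
    record_id: str  # which PHI record was touched
    at: datetime    # when it happened (UTC)

audit_log: list[AuditEvent] = []

def record_phi_access(who: str, action: str, record_id: str) -> None:
    """Append an immutable entry for every action involving PHI."""
    audit_log.append(AuditEvent(who, action, record_id, datetime.now(timezone.utc)))

record_phi_access("clinician-42", "view", "patient-12345")
record_phi_access("clinician-42", "update", "patient-12345")
assert len(audit_log) == 2
assert audit_log[0].action == "view"
```

A real system would write these entries to tamper-evident, durable storage rather than an in-memory list, so the trail survives crashes and can support a later risk assessment.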
Just as you should have safeguards for destroying unnecessary PHI, you should also have methods for retrieving PHI in an emergency, meaning any situation in which the main system becomes inaccessible.
Be fully transparent about your HIPAA policies and procedures and how they relate to your users' information. Make this information easily accessible to users, and be as clear as possible when communicating about an individual's PHI.
For more tips on compliance, check out our article, HIPAA Compliance vs. User Experience for mHealth Apps.
The process of selecting an app developer for your mobile healthcare app does not have to be an arduous one, but you should be confident in the selected vendor’s ability to comply with HIPAA. When researching app developers for your healthcare application, don't be afraid to ask detailed questions, to see work samples, or to speak with references.
Here are three tips to consider when vetting medical app developers:
Be selective and ensure the app developer you are interviewing has specific experience in developing medical apps. Better yet, ask to see their full portfolio of work or to speak with a client reference. Check out their online resources to better understand how well-versed they are in the subject matter.
A well-versed medical app developer can speak readily to their security standards and testing processes. Important practices to ask about include static and dynamic testing, security audits, a HIPAA compliance specialist who monitors your app's documentation, and penetration ("pen") testing that simulates a cyberattack. Pen testing is especially important whenever you launch updates to your application.
It's important to note that this qualification has two significant components: 1) the cloud platform itself, and 2) how the platform will be used. It is not enough for the business associate or the cloud service provider to be HIPAA compliant in isolation; the two must work together to integrate a HIPAA-compliant process for managing patient data.
A knowledgeable app developer can guide you through a HIPAA-compliant risk analysis. They will also have a deep understanding of the cloud computing environment and the service offered by the chosen platform provider.
Further Reading: For more information on HIPAA compliance and cloud computing platforms, see the HIPAA Journal.
Simply put, the repercussions for violating HIPAA regulations are enormous. Beyond the damage done to patient privacy, security, and trust, healthcare providers face punishments including jail time and hefty fines, ranging from $100 per violation to a maximum penalty of $1.4 million a year. The financial penalties are determined by a tier system based on a CE's knowledge of the infraction. The HIPAA Journal lists each of the four tiers, their qualifications, and the range of fine amounts.
Additionally, when you or your vendor fails to comply with HIPAA, you risk losing the trust and confidence of your patients. They will appreciate the convenience of a mobile healthcare app, but at the same time, they expect their PHI to be treated with the same level of care on mobile apps as it is in the physical world.
HIPAA in the tech world can be a daunting topic, but it’s well worth your time and effort to work with a development team you can trust. If you have any questions or would like to speak to an expert about your healthcare app project, schedule a 30-minute call with a member of our team. For more information on HIPAA compliance, you can find a complete compliance checklist on the HIPAA Journal webpage and a wealth of other resources.