Why We Should Stop Using Zoom in Healthcare
I have been a fan of Zoom for video conferencing since 2014. Unfortunately, in the last few weeks, security researchers have found that many of Zoom’s claims about security and privacy are false or misleading.
For example, Zoom claimed to be running end-to-end encryption, which is commonly understood to mean that Zoom themselves, being in the middle, could not read the data. (See this definition from 2014 in Wired.) This turned out to be false, and Zoom has since changed their terminology.
More worrisome, Zoom claimed to use industry-standard AES-256 encryption, but researchers discovered it actually uses the weaker AES-128 with an obsolete mode of operation (ECB). (See the note below for compliance details.) We have answered many hospital and insurance security questionnaires where this configuration would not pass.
But the most disturbing revelation is that Zoom routes data through Chinese data centres, even when all of the attendees are in other countries. This is a vulnerability for a number of reasons:
- Zoom’s non-end-to-end encryption is designed so that the server can decrypt the video. If that server is in China, Chinese intelligence agencies could perform legal surveillance on your meetings.
- Even without access to the server itself, attackers who capture the network streams have an advantage: ECB mode encrypts identical plaintext blocks to identical ciphertext blocks, so patterns in the data leak through the encryption even if the key itself is never recovered.
- If the Zoom meeting participants discuss health information (PHI) or other sensitive information, that data could then be used for insurance fraud, blackmail, and political purposes.
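The ECB weakness above is worth making concrete: the problem is not that the key is easily recovered, but that the mode is deterministic, so repeated plaintext produces repeated ciphertext. The toy sketch below uses HMAC-SHA-256 as a stand-in keyed block transform (not real AES — this is purely to illustrate how modes of operation combine blocks, using only the Python standard library):

```python
import hmac
import hashlib

BLOCK = 16  # bytes per block, matching AES's block size

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for a block cipher: a deterministic keyed transform.
    # (HMAC is one-way, unlike AES, but that doesn't matter for
    # showing how modes of operation behave.)
    return hmac.new(key, block, hashlib.sha256).digest()[:BLOCK]

def encrypt_ecb(key: bytes, plaintext: bytes) -> list:
    # ECB: every block is encrypted independently of the others.
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    return [toy_block_encrypt(key, b) for b in blocks]

def encrypt_cbc(key: bytes, iv: bytes, plaintext: bytes) -> list:
    # CBC: each block is XORed with the previous ciphertext block first,
    # so repeated plaintext no longer produces repeated ciphertext.
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    out, prev = [], iv
    for b in blocks:
        mixed = bytes(x ^ y for x, y in zip(b, prev))
        prev = toy_block_encrypt(key, mixed)
        out.append(prev)
    return out

key = b"sixteen byte key"
iv = b"\x00" * BLOCK
# Two identical 16-byte blocks of "sensitive" plaintext.
data = b"PATIENT: J. DOE " * 2

ecb = encrypt_ecb(key, data)
cbc = encrypt_cbc(key, iv, data)
print("ECB blocks identical:", ecb[0] == ecb[1])  # True: the pattern leaks
print("CBC blocks identical:", cbc[0] == cbc[1])  # False
```

With ECB, an eavesdropper can see which parts of a stream repeat without decrypting anything; with a chained or authenticated mode, that structure disappears.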
The fundamental issue here is data sovereignty and cross-border data transfers. As soon as your data crosses a border, the government of that country may have the legal ability to collect that data. If that country is China, with a history of pervasive data collection and state-sponsored hacking, that’s probably not what you want for your sensitive data.
What to do now?
This isn’t the first time we’ve seen issues with Zoom’s security — for example, the ability to hijack someone’s webcam without their permission. This latest round of security and privacy lapses is the result of well-deserved attention from researchers like the University of Toronto-based Citizen Lab, and goes into more serious territory.
It’s one thing to use a narrow namespace for meeting IDs that allows Zoombombing. It’s another to outright lie about encryption and send data to a country with a well-established pattern of industrial espionage.
At MedStack we have made the decision to migrate off Zoom’s platform and are evaluating the alternatives. We are currently testing Microsoft Teams, as we have been impressed with Microsoft’s Azure cloud security.
Why didn’t we know that Zoom was flawed?
What are the lessons here? First, a certain amount of healthy skepticism is warranted when evaluating the security of any vendor.
A good compliance and security program means that we need to evaluate all of our vendors. For example, ISO 27001 sections A.15.1 & A.15.2 require you to evaluate and manage your suppliers, and SOC 2 CC9.2 requires that “the entity assesses and manages risks associated with vendors and business partners.” And so at MedStack we had examined Zoom’s security white paper which as of this writing still contains the false information outlined above.
What’s more, Zoom signs HIPAA BAAs, indicating that they believe they are compliant. Most organizations, even large ones, don’t have the capacity to conduct security research on their vendors. We all rely on a limited number of researchers, like Bruce Schneier, who are constantly threatened and sued for finding vulnerabilities in software. We should thank and support the security researchers for persevering!
Despite a BAA, we (all of us) as digital health services are still responsible for analyzing our suppliers and, when circumstances warrant, making responsible decisions about whom to trust with sensitive data.
What does this mean for healthcare providers?
There’s a good reason why specialized telemedicine apps exist. Healthcare apps have to comply with a much stronger set of rules than typical enterprise or consumer apps, and rightly so.
Healthcare providers like MedStack customer Maple have to meet much more stringent standards than Zoom, and face more serious consequences if they fail to meet them (see our previous article Understanding HIPAA Enforcement and How to Avoid It).
This is good, because healthcare data is much more valuable and sensitive, and making it easier to meet and exceed these requirements is the reason for MedStack’s very existence.
Major healthcare providers get this. We have answered many security questionnaires from major insurance companies and hospitals that already forbid AES-128 or ECB mode. But Zoom has broken trust by claiming falsely that they are using AES-256.
Fortunately there’s no need to use Zoom for telemedicine. Apple’s FaceTime, for example, is well regarded for its security, but FaceTime, Skype, and other consumer or enterprise video conferencing systems don’t come with the corresponding secure facilities for health record integration, scheduling, charting, and so on.
There is a broad array of excellent telemedicine providers that do exactly this. And because they comply with healthcare data regulations, their security posture is going to be much stronger than the baseline corporate standard.
To put it simply, the security and privacy requirements for healthcare are only matched or exceeded by sectors like the military and finance. Even in these exceptional times, your data needs to be safe and guarded by top-grade privacy and security.
Note: compliance standard information about AES-128 with ECB mode:
- OWASP’s Cryptographic Storage Cheat Sheet says:
  - For symmetric encryption, AES with a key that’s at least 128 bits (ideally 256 bits) and a secure mode should be used as the preferred algorithm.
  - ECB should not be used outside of very specific circumstances.
- PCI-DSS does not permit any form of TLS encryption that uses ECB mode.
- HIPAA makes encryption “Addressable” which means that you must choose and justify the encryption that you use (if any). In practice, fines are significantly higher if good encryption is not in use.
- GDPR says that encryption is appropriate but likewise does not mandate specific algorithms. (HIPAA and GDPR leave this open because encryption technology changes much faster than the law.)