Audits and the Zoom Data Security Debate



The COVID-19 pandemic has created a massive surge in the adoption of video conferencing tools, with most of the new attention going to Zoom.

Zoom is now both famous and infamous, because it’s easy to use, but has serious security and privacy problems, as I discussed in a previous post.

In this post, we’ll examine the most important and least discussed issue: Zoom’s SOC 2 report.

Success and backlash

Before I dive into what SOC 2 is and why it matters, it’s worth recapping what has happened lately to put the spotlight on video conferencing and Zoom.

Telemedicine has leaped forward as the healthcare industry responds to the COVID-19 pandemic. With a user base that is growing (although not as much as previously reported), Zoom is being used by a lot of people, including healthcare professionals.

But as usage explodes, researchers have discovered major security gaps, leading to bans by schools in New York City and Singapore, by companies including Elon Musk’s SpaceX and Tesla, Mercedes-Benz, and Ericsson, and by the US Senate, the German and Indian governments, NASA, and the UK National Health Service.

Despite this, a number of governments continue to use Zoom for sensitive communications. Canada’s House of Commons is still meeting virtually on a platform described as a “gold rush for cyber spies.”

And the UK government continues on, despite their own security service’s warnings.

In healthcare, the situation is equally mixed and messy.

Talk about mixed signals! It seems like half the world is banning Zoom while the other half is jumping on board and driving up usage numbers.

Third-party audit of Zoom’s encryption raises more questions than answers

Why do I say that the most significant issue is Zoom’s SOC 2 report?

The Service Organization Controls (SOC) reporting framework, administered by the American Institute of Certified Public Accountants (AICPA), is intended to improve trust between companies and the service providers they hire.

Third-party auditors gather evidence on the implementation of:

  1. Security: the protection of information from unauthorized use,
  2. Availability: the ability to access the systems and use them when needed,
  3. Processing integrity: the correctness of the data the system produces,
  4. Confidentiality: the limitation of access to certain information for business reasons, and
  5. Privacy: the proper management of personal information.

If you want to read a SOC 2 report, check out the DigitalOcean Trust page, which has the reports from their data centre provider Equinix.

If you haven’t looked at a SOC 2 report before, I suggest you skip over the boilerplate and start at Section 3. On the subject of encryption, it says, for example, that the auditors:

“Inspected digital certificates for the Equinix Customer Portal web servers to determine that the web servers utilized TLS 1.2 encryption for web communication sessions. No exceptions noted.”

The key difference between SOC 2 and a lower-level risk assessment is that the auditors check very carefully that a company is actually doing everything that it says it does, whether it be on its marketing website, or in its product support documentation or customer contracts.

In the case of the Equinix SOC 2 report, for example, auditors examined what claims Equinix was making about their data centres, and took an inventory of their hardware, software, contracts, personnel, etc. that they use to deliver their services.

A SOC 2 audit is a long and expensive process! A company could easily spend a year and $100,000 on its first audit, and even then only after it already has mature processes in place.

SOC 2 reports are a gold standard in the digital health industry for privacy and security compliance evaluation: no enterprise has time to exhaustively test the controls of each of its vendors, so it insists that each of its major vendors commission this independent, standard report, and then the enterprise’s security and compliance staff review it.

Then they can decide if they will trust the vendor, whether it be DigitalOcean, Equinix, MedStack, or indeed Zoom.

Given that Zoom has a SOC 2 report, how did they claim that they were using strong end-to-end encryption when independent security researchers found that they were actually using insecure, non-end-to-end encryption?

Encryption discrepancies vs. the SOC 2 attestation report

There are two particularly worrisome discrepancies between Zoom’s encryption claims and what independent researchers uncovered about the service:

Zoom’s claim:

“By default, Zoom encrypts in-meeting and in-webinar presentation content at the application layer using TLS 1.2 with Advanced Encryption Standard (AES) 256-bit algorithm for the Desktop Client.”

– Zoom support article: Encryption for Meetings (archive)

What outside researchers found:

Security researchers found that Zoom was actually using AES-128 in ECB mode, a configuration widely described as insecure.

– Move Fast and Roll Your Own Crypto: A Quick Look at the Confidentiality of Zoom Meetings – University of Toronto Citizen Lab

Zoom’s claim:

“End-to-end encryption”

What outside researchers found:

Zoom meetings aren’t end-to-end encrypted, despite misleading marketing – The Intercept
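To see why ECB mode is so widely criticized, here is a toy sketch in Python. The “block cipher” below is a stand-in built from a hash function (not real AES, which the standard library doesn’t provide), but ECB’s structural flaw shows up regardless of the underlying cipher: identical plaintext blocks encrypt to identical ciphertext blocks, so patterns in the data leak straight through.

```python
import hashlib

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for a real 16-byte block cipher (NOT real AES): any
    # deterministic keyed function exhibits the same ECB weakness.
    return hashlib.sha256(key + block).digest()[:16]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # ECB mode: each 16-byte block is encrypted independently, so
    # identical plaintext blocks yield identical ciphertext blocks.
    blocks = [plaintext[i:i + 16] for i in range(0, len(plaintext), 16)]
    return b"".join(toy_block_encrypt(key, b) for b in blocks)

key = b"0123456789abcdef"
# Two identical 16-byte blocks followed by a different one.
msg = b"ATTACK AT DAWN!!" * 2 + b"RETREAT AT DUSK!"
ct = ecb_encrypt(key, msg)

# The repeated plaintext block is visible as a repeated ciphertext block:
assert ct[0:16] == ct[16:32]
assert ct[32:48] != ct[0:16]
```

This is exactly why modes like GCM, which mix in a unique counter per block, are preferred: an eavesdropper on an ECB stream can spot repeated content without breaking the cipher at all.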

When we did our MedStack SOC 2 audit, we had to prove to our auditors that our encryption was strong, so I have some direct experience in this area. Our auditors checked our cryptography policy, which says in part:

All encryption shall use the best reasonably available encryption standards

  • AES-256 cipher
  • 2048-bit keys

Only standard encryption methods will be used.

  • Use independent expert guidance to determine what protocols and configurations to use.
  • Keep protocols and configurations up to date when older versions are found to be insecure.

They then asked us to provide evidence that this was implemented in practice.

To do that effectively, we wrote software that automatically checks every single running system and outputs the cryptographic settings for HTTPS and full-disk encryption.

These settings, checked from live, running systems, show that they are configured for HTTPS with 2048-bit keys (and TLS 1.2+) for incoming connections, and that all disks are encrypted with LUKS/dm-crypt using AES-256.

These are the standards that the security community expects to see.
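As a rough illustration, a fleet check like the one described above can be sketched as a pure function over the settings each host reports. The data shape and field names here are hypothetical; a real scanner would gather these values from live systems (e.g. via SSH, `openssl s_client`, and `cryptsetup status`) before evaluating them.

```python
# Policy thresholds, matching the cryptography policy quoted above.
MIN_TLS = (1, 2)        # TLS 1.2 or newer
MIN_KEY_BITS = 2048     # minimum RSA key size
DISK_CIPHER = "aes-256" # required full-disk encryption cipher family

def check_host(settings: dict) -> list:
    """Return a list of policy violations for one host's reported settings."""
    problems = []
    major, minor = map(int, settings["tls_version"].split("."))
    if (major, minor) < MIN_TLS:
        problems.append(f"TLS {settings['tls_version']} is below 1.2")
    if settings["key_bits"] < MIN_KEY_BITS:
        problems.append(f"{settings['key_bits']}-bit key is below {MIN_KEY_BITS}")
    if not settings["disk_cipher"].startswith(DISK_CIPHER):
        problems.append(f"disk cipher {settings['disk_cipher']!r} is not AES-256")
    return problems

# Example: one compliant host, one with three violations.
good = {"tls_version": "1.2", "key_bits": 2048, "disk_cipher": "aes-256-xts"}
bad  = {"tls_version": "1.0", "key_bits": 1024, "disk_cipher": "aes-128-cbc"}

assert check_host(good) == []
assert len(check_host(bad)) == 3
```

The point of automating this is that the evidence comes from running systems, not from a policy document, which is precisely the gap between what a company says and what an auditor should verify.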

As a security expert asked me in conversation recently:

“Why didn’t their auditors review this?”

How did Zoom manage to have a SOC 2 report, apparently without noting this major flaw?

Regardless of the quality of their audit team, there’s a bit of a gaping hole here, and a lot of health records are on the line.

This discrepancy between their claims and the actual reality of their internal controls (their actual source code implementation) should definitely have been caught by the SOC 2 auditors.

Weird, right? In privacy and security, anything that makes you say “that’s weird” is a big red flag. There are two possibilities:


  1. The auditors noticed the error and reported it as an exception. If so, why didn’t Zoom make changes? The exception would have appeared in their reports year after year.
  2. The auditors reviewed Zoom’s encryption but didn’t notice the error. If so, that raises serious questions about the auditors’ ability to evaluate encryption.

What about HIPAA and healthcare compliance?

Encryption is a tricky area when it comes to healthcare compliance.

Under HIPAA, the main US regulation covering health data privacy and security, encryption is classified as “addressable”, which means you can be technically compliant without any encryption at all, provided you document why an equivalent alternative safeguard is reasonable and appropriate.

Here’s an interesting example where we found an issue with the security of AWS’s DynamoDB, a key-value database.

In 2018, we were testing it out to use it to store PHI (Protected Health Information, the data governed by HIPAA), based on a client request.

It was listed by AWS as one of the services covered under their HIPAA contract (their Business Associate Agreement).

The documentation listed encryption as an option in general, and the configuration for DynamoDB contained no mention of encryption.

So we put some non-PHI workloads on it – medical data, but not PHI – and we assumed that encryption was on by default, as it is in some AWS services.

We keep an eye on vendor announcements so this one caught our eye: Amazon DynamoDB Encryption at Rest Now Available in Additional Regions.

Curiously, our DynamoDB was in one of the regions listed as now, for the first time, supporting encryption—implying that it didn’t before.
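The lesson we took away is to verify encryption at rest explicitly rather than assume it. Here is a minimal sketch of such a check as a pure function over DynamoDB’s `DescribeTable` response shape; in practice the table description would come from `boto3.client("dynamodb").describe_table(...)`, and the table names below are made up for illustration.

```python
def encryption_at_rest_enabled(table_desc: dict) -> bool:
    """True only if the table reports server-side encryption as ENABLED.

    table_desc is the "Table" entry of a DynamoDB DescribeTable response.
    """
    sse = table_desc.get("SSEDescription")
    return sse is not None and sse.get("Status") == "ENABLED"

# Sample response fragments (hypothetical table names).
encrypted_table = {"TableName": "phi-audit-log",
                   "SSEDescription": {"Status": "ENABLED", "SSEType": "KMS"}}
# Older tables simply omitted SSEDescription: no encryption at rest.
legacy_table = {"TableName": "phi-audit-log"}

assert encryption_at_rest_enabled(encrypted_table)
assert not encryption_at_rest_enabled(legacy_table)
```

A check like this belongs in the same automated evidence-gathering toolkit as the TLS and disk-encryption scans: trust the vendor’s documentation, but verify against the live configuration.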

This was, to say the least, alarming. And AWS’s response was especially revealing. To paraphrase, they said:

The implementation of encryption is addressable under the HIPAA Security Rule. DynamoDB has other controls to protect data and is therefore still HIPAA eligible even without encryption.

In the end, the situation was resolved, but it seemed to me that Amazon was claiming that the absence of encryption was fine. Were they right?

Not quite: although HIPAA doesn’t strictly require encryption, the fines levied by the agencies that enforce HIPAA for exposing unencrypted data are enormous, whereas under the breach notification rules the loss of properly encrypted data isn’t considered a reportable breach at all.

This is important because Zoom signs HIPAA BAA contracts with their customers, promising to be HIPAA compliant.

But as we have seen with AWS, Zoom could claim to be HIPAA compliant with weak encryption, or with no encryption at all, as long as they believe that whatever they have implemented is sufficient, as AWS did with DynamoDB.

To put it another way: there are subtleties to what HIPAA has to say about encryption.

When we found that AWS wasn’t encrypting our DynamoDB, they were technically still in compliance, but in practice, they were exposing customers to a risk that most digital health companies wouldn’t accept if they knew about it.

It would appear that Zoom was using the same flawed argument.

HIPAA requires “best efforts”—a BAA contract isn’t bulletproof

HIPAA—along with GDPR and every healthcare data regulation in the world—has no official certification. There’s no way to prove that a company, or its suppliers or their service providers, are compliant.

It’s up to that company to prevent a breach and to ensure that everyone down the chain is meeting their obligations. Hospitals and health insurers, for their part, work hard to manage this risk.

If they have a data breach, they face not just millions of dollars in fines but also a loss of reputation. As a result they put a strong burden of proof on the vendors that they work with.

So with the DynamoDB example, if we had stored PHI in that database, we would have had a situation where PHI was not being protected to the standards that we believe are necessary for HIPAA compliance, and we would have exposed ourselves and our customers to a data breach risk.

And if someone was using Zoom with non-end-to-end, insecure encryption, then by our standards, we would also deem that to be non-compliant.

We’re not the only ones who think so. In a recent Bloomberg Law article, Doctors Using Zoom Face Security Scrutiny During Virus, lawyer Mark McCreary was quoted:

McCreary, however, said the Zoom product lacks end-to-end encryption—at least for now. Until Zoom reconfigures the product to include that feature, it’s not truly HIPAA-compliant, he said.

In the end, you and I, the end users, are definitely on the hook if data is lost or stolen and it’s not securely encrypted. Caveat emptor: let the buyer beware.

Plenty of alternatives

Before I make the mistake of saying that Zoom’s competitors have perfect security: no one has perfect security. That’s why we follow a practice of defence in depth.

And companies can make turnarounds. Microsoft Azure, for example, has a great record. Zoom could turn it around, even if in the past their security has been so bad that Dropbox paid hackers to find Zoom’s bugs for them.

We are testing out Microsoft Teams Meetings and Google Meet, and we find that they both have good quality audio/video and decent security. An open source solution that’s getting attention is Jitsi Meet.

And in doing my research for this article, I discovered to my surprise that Citrix GoToMeeting has had only one published security flaw ever, in 2014. That’s impressive.

For true end-to-end encryption, Apple’s FaceTime gets points because not even Apple can view your conversations, according to Apple. Signal is an open-source app that also has true E2E encryption, and I highly recommend it for instant messaging, but it doesn’t currently have group calling.

Zoom announced on April 22 that they are no longer using the weak, insecure ECB mode of AES and have switched to GCM mode. This is good. They have also implemented a wide range of other changes as they focus their development efforts on securing the product.

In the end, every organization needs to make their own call about what software to use.

Security is a complex and rapidly evolving area, and we hope that we can make it a bit easier to understand the risks (like bad encryption) and mitigations (like audit and vendor selection) so that your organization can focus on creating better healthcare solutions.
